
Challenges of the “data tsunami” (FAPESP)

Launched by the Microsoft Research-FAPESP Institute for IT Research, the book O Quarto Paradigma (The Fourth Paradigm) discusses the challenges of eScience, a new field dedicated to handling the immense volume of information that characterizes today’s science

November 7, 2011

By Fábio de Castro

Agência FAPESP – If a few years ago a lack of data limited the advance of science, today the problem has been inverted. The development of new data-capture technologies, in the most varied fields and at the most varied scales, has generated such an immense volume of information that the excess has become a bottleneck for scientific progress.

In this context, computer scientists have been joining forces with specialists from different fields to develop new concepts and theories capable of handling the flood of data in contemporary science. The result is called eScience.

That is the subject of the book O Quarto Paradigma – Descobertas científicas na era da eScience (The Fourth Paradigm: Data-Intensive Scientific Discovery), released on November 3 by the Microsoft Research-FAPESP Institute for IT Research.

Edited by Tony Hey, Stewart Tansley and Kristin Tolle, all of Microsoft Research, the book was launched at FAPESP headquarters in an event attended by the Foundation’s scientific director, Carlos Henrique de Brito Cruz.

During the launch, Roberto Marcondes Cesar Jr., of the Institute of Mathematics and Statistics (IME) of the University of São Paulo (USP), gave the talk “eScience in Brazil.” “The Fourth Paradigm: data-intensive computing advancing scientific discovery” was the subject of the talk by Daniel Fay, director of Earth, Energy and Environment at MSR.

Brito Cruz highlighted FAPESP’s interest in stimulating the development of eScience in Brazil. “FAPESP is closely connected to this idea, because many of our projects and programs face this need for greater capacity to manage large data sets. Our great challenge lies in the science behind this ability to handle large volumes of data,” he said.

Initiatives such as the FAPESP Research Program on Global Climate Change (PFPMCG), BIOTA-FAPESP and the FAPESP Bioenergy Research Program (BIOEN) are examples of programs with a great need to integrate and process immense volumes of data.

“We know that science advances when new instruments become available. On the other hand, scientists do not usually perceive the computer as a great new instrument that is revolutionizing science. FAPESP is interested in actions that make the scientific community aware of the great challenges in the field of eScience,” said Brito Cruz.

The book is a collection of 26 technical essays divided into four sections: “Earth and environment,” “Health and wellbeing,” “Scientific infrastructure” and “Scholarly communication.”

“The book is about the emergence of a new paradigm for scientific discovery. Thousands of years ago, the prevailing paradigm was experimental science, based on the description of natural phenomena. A few hundred years ago came the paradigm of theoretical science, symbolized by Newton’s laws. A few decades ago computational science emerged, simulating complex phenomena. Now we have arrived at the fourth paradigm, that of data-driven science,” said Fay.

With the advent of the new paradigm, he said, the nature of scientific discovery has changed completely. Complex models have come into play, spanning wide spatial and temporal scales and demanding ever more multidisciplinary interaction.

“The data, in incredible quantities, come from different sources and likewise require a multidisciplinary approach and, often, real-time processing. Scientific communities are also more distributed. All of this has transformed the way discoveries are made,” said Fay.

Ecology, one of the fields most affected by large volumes of data, is an example of how the advance of science will increasingly depend on collaboration between academic researchers and computing specialists.

“We live in a storm of remote sensing, cheap ground sensors and data access on the internet. But extracting the variables that science requires from this mass of heterogeneous data remains a problem. It takes specialized knowledge of algorithms, file formats and data cleaning, for example, which is not always accessible to people in ecology,” he explained.

The same is true in fields such as medicine and biology, which benefit from new technologies for recording brain activity or sequencing DNA, for example, and in astronomy and physics, as modern telescopes capture terabytes of information daily and the Large Hadron Collider (LHC) generates petabytes of data each year.

Virtual Institute

According to Cesar Jr., the eScience community in Brazil is growing. The country has 2,167 undergraduate programs in information systems, computer engineering or computer science. In 2009, 45,000 students graduated in these fields, and between 2007 and 2009 graduate education in the area comprised 32 programs, 1,000 advisors, 2,705 master’s students and 410 doctoral students.

“Science has shifted from the paradigm of data acquisition to that of data analysis. We have different technologies producing terabytes in many fields of knowledge, and today we can say that these fields are focused on analyzing a deluge of data,” said the member of FAPESP’s area coordination for Computer Science and Engineering.

In 2006, the Brazilian Computer Society (SBC) organized a meeting to identify the key problems and main challenges for the field. This led to several proposals for the National Council for Scientific and Technological Development (CNPq) to create a program dedicated to this type of problem.

“In 2009 we held a series of workshops at FAPESP, bringing together scientists from fields such as agriculture, climate change, medicine, transcriptomics, games, e-government and social networks to discuss the issue. The initiative resulted in excellent collaborations between groups of scientists facing similar problems and gave rise to several new efforts,” said Cesar Jr.

The calls for proposals issued by the Microsoft Research-FAPESP Institute for IT Research, he said, have been an important part of the set of initiatives promoting eScience, as has the organization of the São Paulo School of Advanced Science in Computational Image Processing and Visualization. FAPESP has also supported several research projects on the subject.

“The eScience community in São Paulo has been working with professionals from many fields and publishing in the journals of several of them. That is an indication of the quality this community has acquired to face the great challenge of the coming years,” said Cesar Jr., who wrote the preface to the Brazilian edition of the book.

  • O Quarto Paradigma
    Editors: Tony Hey, Stewart Tansley and Kristin Tolle
    Published: 2011
    Price: R$ 60
    Pages: 263
    More information: www.ofitexto.com.br

Terra, que Tempo é Esse? (“Earth, What Time Is This?”) (PUC)

By Gabriela Caesar, from the Portal, October 28, 2011. Photos: Eduardo de Holanda.

Although “national sovereignty and the market create a scenario of conflict,” the population is aware that its lifestyle needs to change, believes anthropologist Roberto da Matta. Journalist Sônia Bridi, in turn, argues that “there is no point in arguing over or blaming whoever started it”; what matters is changing the model of production. Gathered at PUC-Rio on Monday (the 24th) for the debate “Terra, que tempo é esse?” (watch parts 1 and 2 below), moderated by Professor Paulo Ferracioli of the Department of Economics, they stressed the importance of development better aligned with environmental demands.

The state secretary for the Environment, Carlos Minc (PT-RJ), added that negotiations with large companies, such as Companhia Siderúrgica Nacional (CSN), should include the monitoring of technologies that can not only reduce environmental harm but also protect workers’ health. Still on the subject of “eco-friendly” technologies, Sônia Bridi said that the state of Rio “errs in opting for buses instead of light rail.”

Before the roughly one hundred students following the debate in the RDC auditorium, Roberto da Matta stressed that the shift to a healthier, more environmentally committed lifestyle is equally important for fighting another problem that, according to him, has been aggravated by globalization: morbid obesity, which has given rise to the neologism “globesity.” To curb the advance of the disease, which has grown by a third in China, the anthropologist is categorical in proposing a less consumerist social standard.

A powerhouse of contrasts and one of the main lubricants of world consumption, China faces the challenge of reducing both its environmental bills (a recurring target of criticism in international forums) and its health bills. For Sônia Bridi, the locomotive of the global economy is investing for the long term:

– By 2020, China will have 20,000 kilometers of high-speed rail. They are worried about this, because the health of their population is deteriorating badly.

The track of responsible development does not necessarily run through large investments. The director of the Interdisciplinary Center for the Environment (Nima), Luiz Felipe Guanaes, recalled that initiatives such as selective waste collection, introduced on the PUC-Rio campus in June of this year, also bring citizens closer to a greater environmental and social commitment. Another opportunity for “the community to engage in the cause,” he noted, will be the gathering of researchers and specialists at the university in 2012, in partnership with the UN, for Rio+20.

Sônia also shared behind-the-scenes stories from the report series “Terra, que país é esse?” (“Earth, What Country Is This?”), which documented the advance of global warming and gave the debate its name. In Peru, she and cameraman Paulo Zero saw the impact on daily life, even on rituals.

– On a certain day, close to the feast of Corpus Christi, brotherhoods from all over the country climb a certain mountain and harvest blocks of ice. They had to change the ritual, which dates back to Inca times and was absorbed by Christianity. They stopped taking ice.

Paulo Zero admits that journalistic production, tied to “tight” deadlines, makes covering the subject difficult. Another barrier, he says, can be logistics. For the report in Greenland, for example, he and Sônia sailed for six hours to reach the island. If the journey got in the way, luck was an ally.

– We reached the glacier and, within five minutes, a large block of ice broke off. We stayed there another three hours and not a single further piece of ice fell. In other words, we were there at the right time and in the right place, the cameraman recounted.

Part 1 (click on the image)

Part 2 (click on the image)

The scientific finding that settles the climate-change debate (Washington Post)

By Eugene Robinson, Published: October 24

For the clueless or cynical diehards who deny global warming, it’s getting awfully cold out there.

The latest icy blast of reality comes from an eminent scientist whom the climate-change skeptics once lauded as one of their own. Richard Muller, a respected physicist at the University of California, Berkeley, used to dismiss alarmist climate research as being “polluted by political and activist frenzy.” Frustrated at what he considered shoddy science, Muller launched his own comprehensive study to set the record straight. Instead, the record set him straight.

“Global warming is real,” Muller wrote last week in The Wall Street Journal.

Rick Perry, Herman Cain, Michele Bachmann and the rest of the neo-Luddites who are turning the GOP into the anti-science party should pay attention.

“When we began our study, we felt that skeptics had raised legitimate issues, and we didn’t know what we’d find,” Muller wrote. “Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been careful in their work, despite their inability to convince some skeptics of that.”

In other words, the deniers’ claims about the alleged sloppiness or fraudulence of climate science are wrong. Muller’s team, the Berkeley Earth Surface Temperature project, rigorously explored the specific objections raised by skeptics — and found them groundless.

Muller and his fellow researchers examined an enormous data set of observed temperatures from monitoring stations around the world and concluded that the average land temperature has risen 1 degree Celsius — or about 1.8 degrees Fahrenheit — since the mid-1950s.

This agrees with the increase estimated by the United Nations-sponsored Intergovernmental Panel on Climate Change. Muller’s figures also conform with the estimates of those British and American researchers whose catty e-mails were the basis for the alleged “Climategate” scandal, which was never a scandal in the first place.

The Berkeley group’s research even confirms the infamous “hockey stick” graph — showing a sharp recent temperature rise — that Muller once snarkily called “the poster child of the global warming community.” Muller’s new graph isn’t just similar, it’s identical.

Muller found that skeptics are wrong when they claim that a “heat island” effect from urbanization is skewing average temperature readings; monitoring instruments in rural areas show rapid warming, too. He found that skeptics are wrong to base their arguments on the fact that records from some sites seem to indicate a cooling trend, since records from at least twice as many sites clearly indicate warming. And he found that skeptics are wrong to accuse climate scientists of cherry-picking the data, since the readings that are often omitted — because they are judged unreliable — show the same warming trend.

Muller and his colleagues examined five times as many temperature readings as did other researchers — a total of 1.6 billion records — and now have put that merged database online. The results have not yet been subjected to peer review, so technically they are still preliminary. But Muller’s plain-spoken admonition that “you should not be a skeptic, at least not any longer” has reduced many deniers to incoherent grumbling or stunned silence.

Not so, I predict, with the blowhards such as Perry, Cain and Bachmann, who, out of ignorance or perceived self-interest, are willing to play politics with the Earth’s future. They may concede that warming is taking place, but they call it a natural phenomenon and deny that human activity is the cause.

It is true that Muller made no attempt to ascertain “how much of the warming is due to humans.” Still, the Berkeley group’s work should help lead all but the dimmest policymakers to the overwhelmingly probable answer.

We know that the rise in temperatures over the past five decades is abrupt and very large. We know it is consistent with models developed by other climate researchers that posit greenhouse gas emissions — the burning of fossil fuels by humans — as the cause. And now we know, thanks to Muller, that those other scientists have been both careful and honorable in their work.

Nobody’s fudging the numbers. Nobody’s manipulating data to win research grants, as Perry claims, or making an undue fuss over a “naturally occurring” warm-up, as Bachmann alleges. Contrary to what Cain says, the science is real.

It is the know-nothing politicians — not scientists — who are committing an unforgivable fraud.

Number of Facebook Friends Linked to Size of Brain Regions, Study Suggests (Science Daily)

ScienceDaily (Oct. 20, 2011) — Scientists funded by the Wellcome Trust have found a direct link between the number of ‘Facebook friends’ a person has and the size of particular brain regions. Researchers at University College London (UCL) also showed that the more Facebook friends a person has, the more ‘real-world’ friends they are likely to have.

The researchers are keen to stress that they have found a correlation and not a cause, however: in other words, it is not possible from the data to say whether having more Facebook friends makes the regions of the brain larger or whether some people are ‘hardwired’ to have more friends.

The social networking site Facebook has more than 800 million active users worldwide. Nearly 30 million of these are believed to be in the UK.

The site allows people to keep in touch online with a network of friends. The sizes of individual networks vary considerably, and some users have only a handful of online friends while others have over a thousand; however, whether this variability is reflected in the size of real-world social networks has not been clear.

Professor Geraint Rees, a Wellcome Trust Senior Clinical Research Fellow at UCL, said: “Online social networks are massively influential, yet we understand very little about the impact they have on our brains. This has led to a lot of unsupported speculation that the internet is somehow bad for us.

“Our study will help us begin to understand how our interactions with the world are mediated through social networks. This should allow us to start asking intelligent questions about the relationship between the internet and the brain — scientific questions, not political ones.”

Professor Rees and colleagues at the UCL Institute of Cognitive Neuroscience and the Wellcome Trust Centre for Neuroimaging studied brain scans of 125 university students — all active Facebook users — and compared them against the size of the students’ network of friends, both online and in the real world. Their findings, which they replicated in a further group of 40 students, are published October 20 in the journal Proceedings of the Royal Society B.

Professor Rees and colleagues found a strong connection between the number of Facebook friends an individual had and the amount of grey matter (the brain tissue where the processing is done) in several regions of the brain. One of these regions was the amygdala, a region associated with processing memory and emotional responses. A study published recently showed that the volume of grey matter in this area is larger in people with a larger network of real-world friends — the new study shows that the same is true for people with a larger network of online friends.

The size of three other regions — the right superior temporal sulcus, the left middle temporal gyrus and the right entorhinal cortex — also correlated with online social networks but did not appear to correlate with real-world networks.

The superior temporal sulcus has a role in our ability to perceive a moving object as biological, and structural defects in this region have been identified in some children with autism. The entorhinal cortex, meanwhile, has been linked to memory and navigation — including navigating through online social networks. Finally, the middle temporal gyrus has been shown to activate in response to the gaze of others and so is implicated in perception of social cues.

Dr Ryota Kanai, first author of the study, added: “We have found some interesting brain regions that seem to link to the number of friends we have — both ‘real’ and ‘virtual’. The exciting question now is whether these structures change over time — this will help us answer the question of whether the internet is changing our brains.”

As well as examining brain structure, the researchers also examined whether there was a link between the size of a person’s online network of friends and their real-world network. Previous studies have looked at this, but only in relatively small sample sizes.

The UCL researchers asked their volunteers questions such as ‘How many people would send a text message to you marking a celebratory event (e.g. birthday, new job, etc.)?’, ‘What is the total number of friends in your phonebook?’ and ‘How many friends have you kept from school and university that you could have a friendly conversation with now?’ The responses suggest that the size of their online networks also related to the size of their real world networks.

“Our findings support the idea that most Facebook users use the site to support their existing social relationships, maintaining or reinforcing these friendships, rather than just creating networks of entirely new, virtual friends,” adds Professor Rees.

Commenting on the study, Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, said: “We cannot escape the ubiquity of the internet and its impact on our lives, yet we understand little of its impact on the brain, which we know is plastic and can change over time. This new study illustrates how well-designed investigations can help us begin to understand whether or not our brains are evolving as they adapt to the challenges posed by social media.”

Risk Is a Serious Matter (JC)

JC e-mail 4364, October 14, 2011.

Article by Francisco G. Nóbrega, submitted to JC Email by the author.

Modern society is bathed in communication. Since “good news is no news,” the human psychological lens always registers a scenario worse than reality. The usual perception is that risks of every kind grow day by day. The global decline in violence, for example, is the subject of a recent book by Harvard University psychologist Steven Pinker (http://www.samharris.org/blog/item/qa-with-steven-pinker). Against the grain of common sense, he demonstrates objectively that we are making progress on this front.

But our mind does not rest in its acute capacity to detect other sources of risk. We have a few crowd favorites: nuclear power for electricity, genetically modified foods, and catastrophic anthropogenic global warming. Objectively, the potential harm of these three threats has not materialized at all, although the third, according to its proponents, is yet to unfold in the future. People are enchanted by the automobile and its ever more attractive accessories. No one thinks of banning it, even though it causes some 40,000 deaths and countless disabling injuries every year in Brazil alone. David Ropeik, of the Harvard Center for Risk Analysis, explains how easily the real danger of a situation can be distorted. The further removed a risk is from common experience (as with radiation and genetically modified plants), the more easily it is manipulated, out of ignorance or other interests, terrifying ordinary citizens. Ropeik explains how this senseless fear becomes a source of stress and an objective health risk in its own right, and should be avoided.

Within this universe, the concerns of Dr. Ferraz (“O feijão nosso de cada dia,” Jornal da Ciência, October 6, 2011) are understandable. He is a member of CTNBio, serves on the plant/environmental panel, and his specialty is agroecology, which explains, at least in part, his doubts. Nevertheless, those concerns do not have the substance the author suggests, and the CTNBio analysis that led to the approval of this bean is reliable.

The commission is always guided by the directives of the legislation, which are broad in order to cover every possible risk to consumers and the environment. The technical staff, however, exists precisely to act selectively and deliberately, examining each case on its merits. The tests are scrutinized with the degree of rigor that the modification introduced into the plant demands for full safety. If the modifications are judged to pose no significant risk, the tests are evaluated in light of that fact.

Tests with many animals, statistically highly reliable, would be required by the commission in the event that a transgenic plant produced, for example, a non-protein pesticide molecule, which would be in every way similar to a drug produced by the pharmaceutical industry. This may well happen at some point, since plants are capable of producing the most varied natural pesticides to defend themselves in the wild. Such a substance would be absorbed in the intestine and spread through organs and tissues, possibly exerting systemic and localized effects that demand evaluation. This has already happened, inadvertently, with a potato produced by conventional breeding in the United States. Its consumption caused illness and it was hastily recalled: it carried high levels of glycoalkaloids toxic to humans, which explained its excellent resistance to crop pests.

In the case of the Embrapa bean, no new non-protein molecule is produced, and the small RNA that interferes with the virus’s replication, should anyone ingest leaves and stems, is one among the hundreds or thousands of RNAs we ingest daily with any plant product. The introduced RNA, moreover, was not detected in the cooked bean grain, even using extremely powerful techniques.

The variations detected, where statistically significant (in the concentration of vitamin B2 or cysteine, for example), pose no risk whatsoever. The classic technique of tissue culture, used to generate quality varieties in horticulture and in tree propagation, is known to produce natural variations that introduce some desirable modifications and some undesirable ones, which the breeder then selects among. This is somaclonal variation, which also affects genetically modified clones during their selection phase.

It is therefore naive, to say the least, to claim that the Embrapa 5.1 bean “should be identical” to the variety it came from, since the manipulations required to generate the transgenic line result in certain alterations which, if irrelevant, are ignored and, if deleterious, are discarded by the scientists. If we ran the same analyses whose results worry some people on the many conventional varieties consumed in the country, the differences would be striking, and irrelevant to the question of safety.

As has been noted before, there is no factual (biochemical or genetic) basis for imagining that the Embrapa bean poses a greater risk than a common bean, or one improved through chemical or physical mutagenesis, which, incidentally, undergoes no nutritional or molecular oversight before commercialization. Without a biological rationale, the tests become superfluous formalities, and experimental noise, especially with small samples, will almost inevitably produce results that are irrelevant unless the number of animals is greatly increased (for both control and transgenic samples); it would also be prudent to include animals fed other conventional beans, to get a realistic idea of what the detected variations mean. Imagine the cost of this ghost hunt, triggered simply by an ill-informed application of the precautionary principle. The baseless concerns raised at every turn by those who fear the technology would apply with greater logic to conventional products.

Were that to happen, the planet’s agricultural production would become unviable overnight. Why not run studies of Rhizobium and nodulation on every bean sold? Why not conduct long-term nutritional studies on conventional foods derived from mutagenesis? What logical reason excludes these concerns in the case of conventional plants? Or is the reason metaphysical? Would the introduced alteration be “against nature,” something like original sin, which in many interpretations consisted merely of eating the fruit of the “tree of knowledge”? Recently, 41 Swedish plant scientists issued a manifesto against the over-regulation of modern genetics in Europe (reproduced on the GenPeace blog: genpeace.blogspot.com). The authors observe that, drawing a parallel with the requirements for pharmaceutical products, the “logic of current legislation suggests that only drugs produced through genetic engineering should be evaluated for undesirable effects.”

Instilling fear on the basis of conjecture does not help protect the population or the environment. Marie Curie is said to have remarked: “Nothing in life is to be feared. It is only to be understood.” I consider it irresponsible to use the “precautionary principle” the way some do. Even the WHO has fallen into this trap, placing cell phones in Group 2B of cancer risk. The radiation from these devices is about a million times weaker than the energy needed to produce free radicals and damage DNA. Class 2B also includes the cancer risk associated with coffee, residues from the burning of fossil fuels, and the wearing of dentures. What the WHO has irresponsibly kept alive is a pretext for doubt, which will legitimize expensive and irrelevant studies whose results will be inconclusive, like the previous mega-study. Incomparably more dangerous is using a cell phone while driving.

Francisco G. da Nóbrega is a professor at the University of São Paulo (USP).

A Map of Organized Climate Change Denial (Dot Earth, N.Y. Times)

October 2, 2011, 3:51 PM

By ANDREW C. REVKIN

Oct. 3, 9:00 p.m. | Updated 
A chart of “key components of the climate change denial machine” has been produced by Riley E. Dunlap, regents professor of sociology at Oklahoma State University, and Aaron M. McCright, an associate professor of sociology at Michigan State University. The diagram below (reproduced here with permission) is from a chapter the two researchers wrote on organized opposition to efforts to curb greenhouse gases for the new Oxford Handbook of Climate Change and Society.
That there are such well-financed and coordinated efforts is not contentious. And this is not the first attempt to map them.

But it’s important to keep in mind that not everyone skeptical of worst-case predictions of human-driven climate disruption, or everyone opposed to certain climate policies, is part of this apparatus.

And there’s plenty to chart on the other edge of the climate debate — those groups and outlets pursuing a traditional pollution-style approach to greenhouse gases.

[Oct. 3, 9:00 p.m. | Updated As it happens, the blogger behind Australian Climate Madness has posted a skeptics’ map of “the climate alarmism machine.” (see below) I think some, though by no means all, aspects of the map are not bad. But, as with so much of the climate debate, it is an overdrawn, overblown caricature of reality.]

It’s also important to examine whether a world without such efforts — in which citizens had a clear view of both what is known, and uncertain, about the human factor in shaping climate-related risks — would appreciably change. Some insist the answer is yes. Given the deep-rooted human bias to the near and now and other aspects of our “inconvenient mind,” I’m not nearly so sure (although this doesn’t stop me from working on this challenge, of course).

We Need To Do More When It Comes To Having Brief, Panicked Thoughts About Climate Change (The Onion)

COMMENTARY
BY RHETT STEVENSON
SEPTEMBER 6, 2011 | ISSUE 47•36

The 20 hottest years on record have all taken place in the past quarter century. The resulting floods, wildfires, and heat waves have all had deadly consequences, and if we don’t reduce carbon emissions immediately, humanity faces bleak prospects. We can no longer ignore this issue. Beginning today, we must all do more when it comes to our brief and panicked thoughts about climate change.

Indeed, if there was ever a time when a desperate call to take action against global warming should race through our heads as we lie in bed and stare at the ceiling, that time is now.

Many well-intentioned people will take 20 seconds out of their week to consider the consequences of the lifestyle they’ve chosen, perhaps contemplating how their reliance on fossil fuels has contributed to the rapid melting of the Arctic ice cap. But if progress is what we truly want, 20 seconds is simply not enough. Not by a long shot. An issue this critical demands at least 45 seconds to a solid minute of real, concentrated panic.

And I’m not talking about letting the image of a drowning polar bear play out in your mind now and then. If we’re at all serious, we need to let ourselves occasionally be struck with grim visions of coastal cities washing away and people starving as drought-stricken farmlands fail to yield crops—and we need to do this regularly, every couple days or so, before continuing to go about our routines as usual.

This may seem like a lot to ask, but no one ever said making an effort to think about change was easy.

So if you pick up a newspaper and see an article about 10 percent of all living species going extinct by the end of the century, don’t just turn the page. Stop, peruse it for a moment, look at the photos, freak out for a few seconds, and then turn the page.

And the next time you start up your car, stop to think how the exhaust from your vehicle and millions of others like it contributes to air pollution, increasing the likelihood that a child in your neighborhood will develop asthma or other respiratory ailments. Take your time with it. Feel the full, crushing weight of that guilt. Then go ahead and drive wherever it was you wanted to go.

To do anything less is irresponsible.

Suppose you’ve just sat down in a crisply air-conditioned movie theater. Why not take the length of a preview or two to consider the building’s massive carbon footprint? Imagine those greenhouse gases trapped in the atmosphere, disrupting ecosystems and causing infectious diseases to spread rampantly, particularly in regions of the world where the poorest people live. Visualize massive storm systems cutting widespread swaths of destruction. Think of your children’s children dying horrible, unnecessary deaths.

You might even go so far as to experience actual physical symptoms: shaking, hyperventilation, perhaps even a heart palpitation. These are entirely appropriate responses to have, and the kinds of reactions each of us ought to have briefly before casting such worries aside to enjoy Conan the Barbarian.

Ultimately, however, our personal moments of distress won’t matter much unless our government intervenes with occasional mentions of climate change in important speeches, or by passing nonbinding legislation on the subject. I implore you: Spend a couple minutes each year imagining yourself writing impassioned letters to your elected representatives demanding a federal cap on emissions.

Global warming must be met with immediate, short-lasting feelings of overwhelming dread, or else life as we know it will truly cease—oh, God, there’s nothing we can do, is there? Maybe we’re already too late. What am I supposed to do? Unplug my refrigerator? I recycle, I take shorter showers than I used to, doesn’t that count for something? Devastating famines and brutal wars fought over dwindling resources? Is that my fault? Jesus, holy shit, someone do something! Tell me what to do! For the love of God, what can possibly be done?

There you have it. I’ve done my part. Now it’s your turn.

SINCE SEPTEMBER 11, 2001 . . . (SSRC)

10 Years after September 11 – A SOCIAL SCIENCE RESEARCH COUNCIL ESSAY FORUM

By Veena Das

A decade of intense theorizing on the forms of violence and human degradation, on global connectivity, on demands that scholarship be done in “real time” . . . a sense of urgency . . . disciplines are aggressively asked to prove their relevance . . . a deep disquiet on the part of many radical scholars and public intellectuals that the American public is increasingly becoming complicit in projects of warfare. We ask, are our senses being so retrained now that we cannot see the suffering of others or hear their cries? We declare with anguish that whole populations are defined as nothing but targets for bombing . . . as those whose deaths do not count, and hence those dead literally need not be counted. There is a desperation to hone in on what is new—perhaps, some theorize, what we now have is “horror” and not “terror” . . . perhaps, say others, what is lost is not only meaning but any trust in what might count as real.

Despite repeated calls for invention of new vocabularies, my own sense is that we have yet to come to terms with the violence of the past and that we have allowed our scholarly terms to be defined in a manner that we are becoming trapped in, terms that are already given in the questions that we ask. After all, do we need to be reminded that the single-most important factor in the decline of the total number of wars since 1942 was the end of colonial wars? Or that in the 1990s the region in which the highest death toll occurred was sub-Saharan Africa, and that it was the indirect death through disease and malnutrition that contributed to the enormity of the violence? I use the collective first-person pronoun to include myself within this trap of not being quite able to define what the right questions should be.

Ten years ago, when I contributed a short reflection on September 11 to the SSRC’s forum, something of this disquiet I feel about the mode of theorizing was already present. I argued that in the political rhetoric that circulated right after September 11, with its talk of attacks on the values of civilization, the American nation was seen to embody universal values—hence the talk was not of many terrorisms with which several countries had lived for more than thirty years but of one grand terrorism, Islamic terrorism. If I am allowed to loop back to my words, I asked, “What could this mean except that while terrorist forms of warfare in other spaces in Africa, Asia, or the Middle East were against forms of particularism, the attack on America is seen as an attack on humanity itself?” Perhaps we should ask of ourselves now the permission to be released from the grip of this master trope of September 11 that organizes a whole discourse, both conservative and radical, in terms of terrorism as the gripping drama of our times. We might then ask, what other questions have been under discussion among different communities of scholars and how might debate be widened to take account of these discussions?

One point I might put forward as a candidate for discussion is how affect is invested in some terms that come to be the signifiers of the pressing problems of a particular decade but then are dropped as if their force has been exhausted by new discoveries. When these terms drop out of scholarly circulation, do they still have lives that are lived in other corners of the world or in the lives of individuals who continue to give them expression? Consider the history of the term “ethnic cleansing,” which came to signify and organize much discussion in the nineties as referring to the pathology of what was termed as ethno-nationalism. As is well known, the term emerged in the summer of 1992 during the tragic events of the dissolution of Yugoslavia and the emergence of new nation-states that were making claims for international recognition. Although the composite term “ethnic cleansing” came to be used only then, the idea of “cleaning” a territory by killing the local inhabitants and making it safe for military occupation was known in colonial wars as well as expressed extensively in Latin America with reference to undesirable groups, such as prostitutes, enemy collaborators, and the vagrant poor.

Norman Naimark has made the point that ethnic cleansing happens in the shadow of war. He cites the examples of the Greek expulsion as a result of the Greco-Turkish war, the intensification of ethnic cleansing when NATO bombing started in Kosovo in March 1999, and Stalin’s brutal dealings with the Chechen-Ingush and Crimean Tartars during the Second World War.1 A chilling aspect of ethnic cleansing is its totalistic character. As Naimark puts it:

The goal is to remove every member of the targeted nation; very few exceptions to ethnic cleansing are allowed. In premodern cases of assaults of one people on another, those attacked could give up, change sides, convert, pay tribute, or join the attackers. Ethnic cleansing, driven by the ideology of integral nationalism and the military and technological power of the modern state, rarely forgives, makes exceptions, or allows people to slip through the cracks.

Yet a concept that was said to be central to explaining major mass atrocities is now rarely encountered—except perhaps in international law discussions on the distinction between genocide and ethnic cleansing. Are the kinds of mass atrocities that have occurred since September 11 not amenable to discussion under any of the earlier terms? Do subjectivities shift so quickly? Are issues of intentionality as providing the criteria for distinguishing between genocide and ethnic cleansing already resolved? What is at stake in the fact that ethnic cleansing is a perpetrator’s term while genocide is a term that privileges the experience of the victims? What kind of footing in the world do enunciations made on behalf of all sides in conflicts that draw on such concepts as human rights and human dignity have?

While one can understand why the media might have moved on to other stories, have we as scholars come to terms with why some concepts disappear from our vocabularies so quickly? I want to suggest that a long-term perspective on how we come to speak of violence—the appearance and disappearance of different terms—provides a repertoire of concepts to be mined for understanding how representation of violence in the public sphere was closely tied up with the West’s self-definition that in turn defined the twists and turns in the social sciences. Ethnic cleansing in the nineties was widely understood as the violence of the other just as terrorism now is understood as the violence that the other perpetrates. September 11 and the subsequent wars in Iraq and Afghanistan then become events that need to be placed in the long history of warfare that has generated the concepts of social science—concepts that cannot be divested of their political plenitude even as we recognize that the technologies of war have changed considerably.

Are there other discussions on war that are not quite within the discursive fields that dominate the post–September 11 scenario and the notion of Islamic terrorism? I find it salutary to think that other theoretical discussions are taking place that are outside this frame of reference. For instance, the prolonged civil war in Sri Lanka, in which both Sinhala soldiers and Tamil militants engaged in killing, has led to discussions on the relation between Buddhism and violence and whether there are strains of Buddhism, especially within the Mahayana school, that make room for the exercise of violence. Interestingly, the issues here are not those of justifying warfare but rather of dealing with the anxieties about bad karma generated by the acts of violence.

A sustained analysis of what enabled such developments as samurai Zen, or soldier Zen, to appear in Japan or how it is that Buddhism could find a home within kingdoms as diverse as the Indians, the Mongols, the Chinese, and the Thai deepens our understanding of violence and nonviolence precisely because it has the potential to change the angle of our vision.2 Similar discussions from within other traditions, both religious and secular, would help to break the monopoly of concepts (biopolitics, state of exception, homo sacer) that are now routinely used to understand the world. This hope is not an expression of sheer nostalgia for non-Western concepts but a plea to cultivate some attentiveness to those discourses that are (or could be) part of the history of our disciplines. Scholarly discourse cannot simply mirror the ephemeral character of media stories—even when a particular kind of violence disappears, the institutions that were put in place for dealing with it continue to have lives of their own. The braiding of what is new and what is enduring might then define how we come to pose questions that are not simply corollaries of the common sense of our times.


Veena Das is Krieger-Eisenhower Professor of Anthropology and professor of humanities at the Johns Hopkins University. Her most recent books are Life and Words: Violence and the Descent into the Ordinary and Sociology and Anthropology of Economic Life: The Moral Embedding of Economic Action (ed., with R. K. Das).

  1. Norman M. Naimark, Fires of Hatred: Ethnic Cleansing in Twentieth-Century Europe (Cambridge, MA: Harvard University Press, 2001).
  2. See Michael K. Jerryson and Mark Juergensmeyer, eds., Buddhist Warfare (New York: Oxford University Press, 2010).

Shooting the messenger (The Miami Herald)

Environment
Posted on Monday, 08.29.11
BY ANDREW DESSLER

Texas Gov. Rick Perry stirred up controversy on the campaign trail recently when he dismissed the problem of climate change and accused scientists of basically making up the problem.

As a born-and-bred Texan, it’s especially disturbing to hear this now, when our state is getting absolutely hammered by heat and drought. I’ve got to wonder how any resident of Texas – and particularly the governor who not so long ago was asking us to pray for rain – can be so cavalier about climate change.

As a climate scientist at Texas A&M University, I can also tell you from the data that the current heat wave and drought in Texas is so bad that calling it “extreme weather” does not do it justice. July was the single hottest month in the observational record, and the 12 months that ended in July were drier than any corresponding period in the record. I know that climate change does not cause any specific weather event. But I also know that humans have warmed the climate over the last century, and that this warming has almost certainly made the heat wave and drought more extreme than it would have otherwise been.

I am not alone in these views. There are dozens of atmospheric scientists at Texas institutions like Rice, the University of Texas, and Texas A&M, and none of them dispute the mainstream scientific view of climate change. This is not surprising, since there are only a handful of atmospheric scientists in the entire world who dispute the essential facts – and their ranks are not increasing, as Gov. Perry claimed.

And I can assure Gov. Perry that scientists are not just another special interest looking to line their own pockets. I left a job as an investment banker on Wall Street in 1988 to go to graduate school in chemistry. I certainly didn’t make that choice to get rich, and I didn’t do it to exert influence in the international arena either.

I went into science because I wanted to devote my life to the search for scientific knowledge and to make the world a better place. That’s the same noble goal that motivates most scientists. The ultimate dream is to make a discovery so profound and revolutionary that it catapults one into the pantheon of the greatest scientific minds of history: Newton, Einstein, Maxwell, Planck, etc.

This is just one of the many reasons it is inconceivable for an entire scientific community to conspire en masse to mislead the public. In fact, if climate scientists truly wanted to maximize funding, we would be claiming that we had no idea why the climate is changing – a position that would certainly attract bipartisan support for increased research.

The economic costs of the Texas heat wave and drought are enormous. The cost to Texas alone will be many billions of dollars (hundreds of dollars for every resident), and these costs will ripple through the economy so that everyone will eventually pay for them. Gov. Perry needs to squarely face the choice confronting us: either we pay to reduce emissions of greenhouse gases, or we pay for the impacts of a changing climate. There is no free lunch.

Economists have looked at this problem repeatedly over the last two decades, and virtually every mainstream economist has concluded that the costs of reducing emissions are less than the costs of unchecked climate change. The only disagreement is on the optimal level of emissions reductions.

I suppose it should not be surprising when politicians like Gov. Perry choose to shoot the messenger rather than face this hard choice. He may view this as a legitimate policy on climate change, but it’s not one that the facts support.


In the Land of Denial (N.Y. Times)

NY Times editorial
September 6, 2011

The Republican presidential contenders regard global warming as a hoax or, at best, underplay its importance. The most vocal denier is Rick Perry, the Texas governor and longtime friend of the oil industry, who insists that climate change is an unproven theory created by “a substantial number of scientists who have manipulated data so that they will have dollars rolling into their projects.”

Never mind that nearly all the world’s scientists regard global warming as a serious threat to the planet, with human activities like the burning of fossil fuels a major cause. Never mind that multiple investigations have found no evidence of scientific manipulation. Never mind that America needs a national policy. Mr. Perry has a big soapbox, and what he says, however fallacious, reaches a bigger audience than any scientist can command.

With one exception — make that one-and-one-half — the rest of the Republican presidential field also rejects the scientific consensus. The exception is Jon Huntsman Jr., a former ambassador to China and former governor of Utah, who recently wrote on Twitter: “I believe in evolution and trust scientists on global warming. Call me crazy.” The one-half exception is Mitt Romney, who accepted the science when he was governor of Massachusetts and argued for reducing emissions. Lately, he’s retreated into mush: “Do I think the world’s getting hotter? Yeah, I don’t know that, but I think that it is.” As for the human contribution: “It could be a little. It could be a lot.”

The others flatly repudiate the science. Ron Paul of Texas calls global warming “the greatest hoax I think that has been around for many, many years.” Michele Bachmann of Minnesota once said that carbon dioxide was nothing to fear because it is a “natural byproduct of nature” and has complained of “manufactured science.” Rick Santorum, a former senator from Pennsylvania, has called climate change “a beautifully concocted scheme” that is “just an excuse for more government control of your life.”

Newt Gingrich’s full record on climate change has been a series of epic flip-flops. In 2008, he appeared on television with Nancy Pelosi, the former House speaker, to say that “our country must take action to address climate change.” He now says the appearance was a mistake.

None of the candidates endorse a mandatory limit on emissions or, for that matter, a truly robust clean energy program. This includes Mr. Huntsman. In 2007, as Utah governor, he joined with Arnold Schwarzenegger, then the governor of California, in creating the Western Climate Initiative, a market-based cap-and-trade program aimed at reducing emissions in Western states. Cap-and-trade has since acquired a toxic political reputation, especially among Republicans, and Mr. Huntsman has backed away.

The economic downturn has made addressing climate change less urgent for voters. But the issue is not going away. The nation badly needs a candidate with a coherent, disciplined national strategy. So far, there is no Republican who fits that description.

The telenovela has missed the boat of history (Fapesp)

HUMANITIES | COMMUNICATION

The genre’s status as a privileged arena for discussing national questions is in decline
Carlos Haag
Print Edition – August 2011
© PHOTOGRAPHS TADEU VILANI

In 1981, during a grave political crisis in the Figueiredo administration, the all-powerful Golbery do Couto e Silva resigned from the government. To journalists he explained: “Don’t ask me anything. I have just come from Sucupira.” The reference to the fictional town of the telenovela O bem-amado (1973) and the miniseries of the same name (1980–1984), by Dias Gomes, at such a delicate moment reveals the power telenovelas then held as representations of national reality, and how Brazilians recognized themselves in those representations. “Drawing on conflicts of gender, generation, class, and religion, the telenovela produced chronicles of everyday life that turned it into a privileged stage for interpreting Brazil. The country, which was modernizing in a context centered on consumption rather than on the affirmation of citizenship, recognized itself on the TV screen in a white, glamorous universe,” explains Esther Hamburger, professor in the Department of Film, Radio and Television at the Universidade de São Paulo (USP) and author of the study O Brasil antenado (Jorge Zahar Editor). She analyzed the genre’s new directions in the research project Formação do campo intelectual e da indústria cultural no Brasil contemporâneo, supported by FAPESP and coordinated by USP sociologist Sérgio Miceli. Besides Esther, the project brings together researchers from a range of fields and topics.

“In a democratizing Brazil, the telenovela was the first to address issues that would set the political agenda of the following decade. Today, however, it has lost its privileged status as a forum for national questions. It can no longer mobilize public opinion, it is no longer fully national, nor is it the country’s showcase. It is probably no longer capable of synthesizing the country,” the researcher warns. “After all, that centralized country, amenable to a hegemonic representation, no longer exists. New media such as cable TV and the internet have stripped the telenovela of its role as an arena of debate. Society has changed and there is great diversification. Literacy has risen and TV is no longer the only place to find information,” she observes. For Esther, in today’s Brazil it is no longer possible for a telenovela to speak to the whole nation. “There is no longer one Brazil on TV, but several,” she concludes.

Decline – “The telenovela remains strategic in revenue and in the competition among broadcasters, but its capacity to polarize national audiences is falling. The genre overuses socially minded messages while losing its aesthetic distinctiveness and its polemical force. The nation is no longer the central theme, because themes now cross borders. There are fewer and fewer references to current, controversial issues. The choice is for politically correct campaigns, often to the detriment of the drama, tying down the authors’ creativity,” says Esther. According to the researcher, the structure of melodramatic conflicts that sustains the narrative persists, but in stories that once again confine themselves to spaces imagined as feminine, the original audience of the early national telenovela, and of lesser cultural value. The genre also no longer attracts as much creative talent, with weak scripts and repetitive plots that cling to old clichés and conventions that succeeded in the past. “Even so, one cannot deny that the telenovela could regain its former political and cultural impact, influencing behavior and fashion. It is still a place where something can be learned, especially by the new predominant audience, below classes A and B,” she says.

From the genre’s apogee to the recent audience slump was a long road. In the beginning the “fantasy” style prevailed, full of sentimentality, in 1960s productions such as the exotic Sheik de Agadir, a paradigm broken by the realism of Beto Rockfeller, a portrait of the contemporary life of the emerging middle classes. In the 1970s the limits of melodrama were breached, but telenovelas became showcases of modern living: fashion and behavior. “During the dictatorship, Globo adopted the official discourse, but it understood that in the telenovelas, rather than hiding the country’s problems, it was better to incorporate them into the plots, as it did in O bem-amado. That was the beginning of a growing critique of the modernization process,” recalls Mauro Porto, professor at Tulane University and author of the study Telenovelas and national identity in Brazil. Realism took over the genre: a 1988 survey found that 58% of respondents wanted to see “reality” in the telenovelas and 60% wanted the plots to address politics. “The authors, from a left-wing generation, saw themselves as responsible for a national project and for popular consciousness,” Porto notes. “The telenovelas recorded the dramas of urbanization, social inequality, family fragmentation, the liberalization of marital relations, and consumption patterns. They reached their apex when they addressed the problems of modernization, as in Vale tudo (1988) and Roque Santeiro (1985),” says Esther. But TV Manchete offered an alternative reading of the country with Pantanal, brimming with the exotic and the erotic, which broke the political cycle of the telenovelas, including at Globo, which felt obliged to emulate the new concept. “The ‘Pantanal effect,’ however, left no heirs and is forgotten today.”

Intimacy – “Along that path, the telenovela created a common repertoire through which people of different social classes, generations, sexes, races, and regions recognized one another: an ‘imagined community’ for debating Brazil, an intimacy with social problems, the ideal vehicle for building citizenship, a narrative of the nation,” analyzes Maria Immacolata Lopes, professor at the Escola de Comunicações e Artes (ECA-USP) and coordinator of the Núcleo de Pesquisa de Telenovelas. The model wore out and the country changed. “Between 1970 and 1980 there was a magic between audience and telenovela. In Vale tudo, corruption was seen for the first time in a non-political public space, and the telenovelas were in the vanguard,” notes Esther. “Today corruption is banal, no longer controversial; it brings only the tedium of repetition. In 1988 it was news; in 2011 it is stale.” The telenovelas are no longer in tune with the country. “Even contemporary foreign academic literature on television no longer discusses the Brazilian telenovela, and the Brazilian ‘case’ has lost ground at home and abroad in the face of a renewal of international television fiction, especially the American series, which are gaining space on national channels, a new flow of imported programming that the telenovelas had displaced in previous decades,” she explains. Today’s sitcoms, unlike those of the past, which were “closed works” without improvisation, are open to audience indicators and can change course while on the air, alluding to political and cultural elements of American reality and problematizing the United States.

“We no longer have the same national audience across all classes and places. Everything has become more popular, and the telenovelas serve this viewing public with social merchandising, sex, plots that shift constantly, action, murders,” she analyzes. For the researcher, this break in the dramaturgy narrows the audience even further by eroding the interest of a large share of viewers. Esther cites new alternatives such as Cordel encantado, which harks back to the fantasy telenovelas. There is also the search for new writers and directors, or remakes of old hits such as O astro, to recover successful formulas of the past, but even adapted they retain the flavor of “something old.” “We do not know whether Brazilians still want realism, but it is certain that they have tired of urban telenovelas set on the Rio-São Paulo axis. They would like to discover new realities and the regional dimension previously scorned or caricatured.” Renewal is not easy, as the failure of experiments such as Cidade de Deus or Antonia shows. “One solution would be to show the violence of the cities and of the drug trade, but that is still taboo in the telenovelas. Cinema has proved more attuned, portraying the parallel powers of the urban peripheries, as in Tropa de elite. Or Dois filhos de Francisco, a film that shows a Brazil where the humble succeed.” For the first time, the telenovela has missed the boat of history. In a recent scandal, a political columnist did not reach for a telenovela quotation, as Golbery did, but for the catchphrase of the film Tropa de elite: “Palocci, pede pra sair!” (“Palocci, ask to leave!”).

The anti-science party (JC, O Globo)

JC e-mail 4333, August 30, 2011.

Article by Paul Krugman published in today’s (Aug. 30) O Globo.

Jon Huntsman Jr., former governor of Utah and ambassador to China, is not a strong contender for the Republican Party’s presidential nomination. And that is too bad, because Huntsman is willing to say the unsayable about the party: namely, that it is becoming the “anti-science party.” This is enormously important. And it should terrify us.

To understand what Huntsman stands for, consider recent statements by the two strongest contenders for the Republican nomination: Rick Perry and Mitt Romney.

Perry, the governor of Texas, recently made headlines by dismissing human evolution as “just a theory” with “some gaps in it,” a remark that would come as news to the vast majority of biologists. But what drew the most attention was what he said about climate change: “I think there are a substantial number of scientists who have manipulated data so that they will have dollars rolling into their projects. And I think we are seeing, almost weekly or even daily, scientists questioning the original idea that man-made global warming is what is causing the climate to change.” It is an extraordinary statement, or perhaps the right adjective is “vile.”

The second part of Perry’s statement is false: the scientific consensus on human-caused global warming, which according to the National Academy of Sciences includes 97 to 98 percent of researchers in the field, is growing stronger as the evidence of climate change mounts.

Indeed, if you follow climate science you know that the main story in recent years has been growing concern that projections of the future climate are underestimating the likely rise in temperature. Warnings that we may face civilization-threatening climate change by the end of the century, once considered fringe, now come from leading research groups.

But never mind, Perry suggests; the scientists are just in it for the money, “manipulating data” to create a false threat. In his book “Fed Up,” he dismisses climate science as “a contrived, phony mess that is falling apart.”

I could point out that Perry is peddling a truly crazy conspiracy theory, one that claims thousands of scientists around the world are on the take, with not one willing to break the code of silence. I could note that multiple investigations into charges of intellectual dishonesty on the part of climate scientists have ended with the researchers cleared of all charges. But don’t bother: Perry and those who think like him know what they want to believe, and their response to anyone who contradicts them is to launch a witch hunt.

So how has Romney, the other strong contender for the Republican nomination, responded to Perry’s challenge? By running away from it. In the past, Romney, the former governor of Massachusetts, strongly endorsed the notion that man-made climate change is a genuine concern. But last week he softened that, saying he thinks the world really is getting warmer, but “I don’t know that” and “I don’t know if it’s mostly caused by humans.” What moral courage!

Of course, we know what is driving Romney’s sudden lack of conviction. According to Public Policy Polling, only 21 percent of Republican voters in Iowa believe in global warming (and only 35 percent believe in evolution). Within the Republican Party, willful ignorance has become a litmus test for candidates, one that Romney is determined to pass at any cost.

So it is now highly likely that the presidential nominee of one of our two major political parties will be either a man who believes what he wants to believe, or a man who pretends to believe whatever he thinks the party’s base wants him to believe.

And the increasingly anti-intellectual character of the right, both inside and outside the Republican Party, extends beyond the issue of climate change.

Lately, for example, the editorial pages of the “Wall Street Journal” have moved from their old preference for the economic ideas of “charlatans and cranks” (in the famous phrase of one of former president George W. Bush’s top economic advisers) to a general disdain for hard thinking about economic questions. Pay no attention to “fancy theories” that conflict with “common sense,” the “Journal” tells us. Why should anyone imagine that you need more than a gut feeling to analyze things like financial crises and recessions?

Now, no one knows who will win next year’s presidential election. But the odds are that, sooner or later, the world’s greatest nation will be governed by a party that is aggressively anti-science, indeed anti-knowledge. And in an era of great challenges (environmental, economic and others) that is a terrifying prospect.

Paul Krugman is a columnist for The New York Times.

Climate Change Sparks Battles in Classroom (Science)

Science 5 August 2011: Vol. 333 no. 6043 pp. 688-689 DOI: 10.1126/science.333.6043.688

SCIENCE EDUCATION
Sara Reardon

The U.S. political debate over climate change is seeping into K-12 science classrooms, and teachers are feeling the heat.

Growth potential. Students gather acorns for a middle school science project. CREDIT: JEFF CASALE/AP IMAGES

This spring, when the science department of Los Alamitos High School in southern California proposed an advanced class in environmental science, members of the elected school board for the small district in Orange County thought the course was a great idea. Then they read the syllabus and saw a mention of climate change.

The topic, the board decided, is a “controversial issue.” Its next step was a new policy requiring teachers to explain to the school board how they are handling such topics in class in a “balanced” fashion. And the new environmental science course, which starts this fall, will be the first affected.

Local teachers immediately deplored the board’s actions. “It’s very difficult when we, as science teachers, are just trying to present scientific facts,” says Kathryn Currie, head of the high school’s science department. And science educators around the country say such attacks are becoming all too familiar. They see climate science now joining evolution as an inviting target for those who accuse “liberal” teachers of forcing their “beliefs” upon a captive audience of impressionable children.

“Evolution is still the big one, but climate change is catching up,” says Roberta Johnson, executive director of the National Earth Science Teachers Association (NESTA) in Boulder, Colorado. An informal survey this spring of 800 NESTA members (see word cloud) found that climate change was second only to evolution in triggering protests from parents and school administrators. One teacher reported being told by school administrators not to teach climate change after a parent threatened to come to class and make a scene. Online message boards for science teachers tell similar tales.

Hot topic. Teachers can bone up on climate science in workshops and classes. SOURCE: ROBERTA KILLEEN JOHNSON, NATIONAL EARTH SCIENCE TEACHERS ASSOCIATION

Unlike those biology teachers who have borne the brunt of the century-long assault on evolution, however, today’s earth science teachers won’t have the protection of the First Amendment’s language about religion if climate change deniers decide to take their cause to court. But the teachers feel their arguments are equally compelling: Science courses should reflect the best scientific knowledge of the day, and offering opposing views amounts to teaching poor science.

Most science teachers don’t relish having to engage this latest threat to their profession. “They want to teach the science,” says Susan Buhr, education director at the Cooperative Institute for Research in Environmental Sciences (CIRES) in Boulder. “They’re struggling to be on top of the science in the first place.”

CIRES and NESTA offer workshops and online resources for educators seeking more information on climate change. But teachers also say that they resent devoting any of their precious classroom time to a discussion of an alleged “controversy.” And they believe that politics has no place in a science classroom.

Even so, some are being dragged against their will into a conflict they fear could turn ugly. “There seems to be a lynch-mob hate against any teacher trying to teach climate change,” says Andrew Milbauer, an environmental sciences teacher at Conserve School, a private boarding school in Land O’Lakes, Wisconsin.

Milbauer felt that wrath after receiving an invitation to participate in a public debate about climate change. The event, put on last year by Tea Party activists, proposed to pit high school teachers against professors and climate change deniers David Legates and Willie Soon in front of students from 200 high schools. Organizers said the format was designed “to expand knowledge of the global warming debate to the youth of our state.” When Milbauer and his colleagues declined to participate, organizer Kim Simac complained to the local papers about their “suspicious” behavior. Milbauer corresponded for a time on the organization’s blog until Simac wrote that Milbauer, “in his role as science teacher, is passing on to our youth this monstrous hoax as being the gospel truth.”

Milbauer regards the episode as an unfortunate but telling example of misguided science and uses it in class discussions. “I explain this is the trap the [other side] is building,” he says.

Some teachers would disagree, however. In comments in the NESTA survey, a handful of teachers called climate change “just a theory like evolution” or said they firmly believed that opposing views should be presented with equal weight.

Sowing confusion

Given the ongoing and noisy national debate over climate change, it’s not surprising that those disagreements are seeping into K-12 schools, too. Science educators are scrambling to figure out how to deliver top-quality instruction without being sucked into the maelstrom. The issue is acute in Louisiana, which enacted a law in 2008 that lists climate change along with evolution as “controversial” subjects that teachers and students alike can challenge in the classroom without fear of reprisal.

A hotter climate? The phrase “climate change” came up often when NESTA asked its teacher members what classroom concepts trigger outside concerns. SOURCE: ROBERTA KILLEEN JOHNSON, NATIONAL EARTH SCIENCE TEACHERS ASSOCIATION

When a state law suggests that established scientific theories are controversial, says Ian Binns, a science education researcher at Louisiana State University in Baton Rouge, “it tells our students and teachers that there are problems where there aren’t.” That ambiguity, he and others fear, can distort a student’s understanding of the nature of scientific inquiry. “Science is not about providing balance to every viewpoint that’s out there,” says Joshua Rosenau of the National Center for Science Education, a nonprofit organization in Oakland, California, that has begun to monitor controversies regarding climate change in addition to battles over evolution. To Rosenau, staging debates over science in schools or on the floors of Congress “is madness.”

In Los Alamitos, the course will follow the curriculum laid out by the nonprofit College Board for its Advanced Placement (AP) course in environmental science, which presents the scientific evidence for climate change. This curriculum, which prepares students to take an end-of-year test for college credit, is what irritated Jeffrey Barke, a Los Alamitos school board member and physician who led the push to revise the district’s policies after learning about the course. Barke has spoken publicly about his concern that “liberal faculty” members would use the course to present global warming as “dogma.”

Science department head Currie criticizes the board’s new policy and feels that it may confuse students when they answer multiple-choice questions relating to climate change on the final AP exam. “When a kid comes across that on the AP test, what are they supposed to bubble?” she asks. “The fact, or [Barke’s] belief that it’s not a fact?” The school board, however, has said that the new policy is simply a way to prevent political bias from entering the classroom.

Currie and her colleagues are spending the summer working up a lesson plan for the new course, but she isn’t sure what will satisfy the board. “I’m going to fight for scientific facts being presented in the classroom,” she says. “I want to keep politics out.”

Arming for battle

The extent to which politics is affecting geoscience courses around the country is hard to measure, Rosenau says: “Just like with evolution, it’s difficult to know what a given teacher in a given classroom is teaching.”

To improve the quality of that instruction, both CIRES and NESTA are trying to put up-to-date, data-rich climate science materials into the hands of teachers and students to supplement textbooks. They’re not the only ones; even government agencies such as the National Oceanic and Atmospheric Administration, spurred by language in the 2007 America COMPETES Act about their role in improving science education, have beefed up their teacher training programs.

But it’s not enough to say that “you just need to teach people more,” Rosenau says. Teachers also have to learn how to defend themselves against parents or administrators wearing “ideological blinders,” he says. CIRES has analyzed the strategies that teachers used in the creationism debates and repurposed them for discussions about climate change. That includes citing state science standards—30 states include climate science in their description of what should be taught—and enlisting the support of administrators before tackling the subject in class.

Those who have taught geoscience or environmental science may feel more confident than colleagues who teach general physical science in managing a classroom discussion. Parents and students trying to poke holes in what they are being taught often “can’t articulate what the opposing view even is,” says Karen Lionberger, director of curriculum and content development for AP Environmental Science in Duluth, Georgia.

Of course, some attacks on climate change come from well-heeled sources. In 2009, the Heartland Institute, which has received significant funding from ExxonMobil, expanded its audience beyond teachers and students with a pamphlet, called The Skeptic’s Handbook, mailed to the presidents of the country’s 14,000 public school boards.

Heartland Institute senior fellow James Taylor, who sent out the pamphlet, says the underlying message is that educators need “to understand that there is quite a bit that remains to be learned” about climate change. Taylor also applauds the actions of the Los Alamitos school board, saying that “if the science is unsettled on any topic, of course you should present all points of view.”

The AP course itself doesn’t take a position on the issue, Lionberger says. The handful of multiple-choice questions on the final exam relating to climate change are not “slanted in any way,” she says, and none explicitly asks whether climate change is occurring. But because AP courses can be taken for college credit, she says, “we’re going to follow what colleges and universities are doing” by teaching students about the factors that contribute to climate change and its effects on the planet. Although researchers are always adding to that pool of knowledge, she says “for now, we will fall on the side of consensus science.”

Slowly but surely (FSP)

JC e-mail 4317, August 8, 2011.

The “Slow Science” movement defends scientists’ right to opt out of the race to publish in bulk and to put research quality first.

A movement that began in Germany is slowly gaining ground in academic corridors. The cause is a noble one: more time for scientists to do research. Leading the idea is the organization “Slow Science” (http://slow-science.org), created by distinguished German scientists.

Joining the movement means refusing to surrender to the unbridled production of papers in specialized journals, which counts for many points in the systems that evaluate scientific output. Today, researchers who publish in journals that are widely read and cited by other scientists secure more funding for their research.

As a result, scientists end up centering their work on outputs (publications). “We are a guerrilla band of neuroscientists fighting to get the media-driven model of scientific production revised,” neuroscientist Jonas Obleser of the Max Planck Institute, one of the creators of “Slow Science,” told Folha. Late last year the group even produced a manifesto proclaiming: “We are scientists. We don’t blog. We don’t twitter. We take our time.”

“Slow science has always existed, down the centuries. Now it needs protection.” The document hangs on the refrigerator door in the laboratory of Brazilian physician Rachid Karam, a postdoctoral researcher at the University of California, San Diego.

“The manifesto makes sense. We have to check the data before jumping to hasty conclusions,” he says. “Slow Science would give us time to analyze a hypothesis in depth and reach sound conclusions.”

According to Obleser, the number of scientists sympathetic to the movement is growing, “especially in Latin America.” “But there is no need to join formally. Just print out the manifesto and stand guard in your department,” he says.

Slow Science is an offshoot of the well-known “Slow Food” movement, which advocates slower, healthier eating, in both the preparation and the consumption of food. In science, the idea is to champion research that is not guided solely by quick results.

Skepticism – “It is unlikely that the pace of research will slow down through a worldwide agreement in which every scientist pledges to decelerate his or her work,” says Rogério Meneghini, a specialist in scientometrics (the measurement of scientific output). He is the scientific coordinator of the SciELO Project, which brings together open-access publications from Latin America.

For Meneghini, “Slow Science” is an “anemic” movement in a context in which the rapid flow of ideas and information speeds up discovery. “It looks like the demands of an old movement in new clothing. It is certainly the feeling of someone who no longer has the legs for the race,” he concludes.
(Folha de São Paulo)

Let’s Take Back the Sky! (City Pulse)

by Brian McKenna
November 7, 2001

Saturday, Nov. 3, just hours before the planned confrontation with the enemy, our intelligence assessed its radar, consulted U.S. satellite imagery and identified the front. It would be a good day for bombing, “sunny with a high of 58 degrees.”

The U.S. war on Afghanistan?

No, Stormtracker 6’s weather prediction efforts for the Spartan/Wolverine football game, Michigan’s civil war.

Historically, weather forecasting came of age with the D-Day assault on Normandy Beach in 1944. A half-century later weather-work still retains its militant glow.

Consider the language. Channel 10’s “Sky Team” and “Stormtracker 6” punctuate their TV reports with alerts, watches, warnings, outbreaks, damage, hazards and threats. Like the Joint Chiefs, they monitor the scene with satellites, radar, chase vans, web cams and computerized maps. On occasion, they’ll interrupt our TV viewing with dire warnings of impending disaster, using the shrill three-note cry of the Emergency Broadcast System, originally intended for nuclear alerts. Channel 6’s WLNS will even e-mail you the warnings upon request.

The shift in terminology from the innocuous “weather report” to the ominous “Stormtracker 6” serves notice of a perennial threat.

There are rarely serious storm-related casualties in Lansing, yet Channels 6 and 10 have three full-time weathercasters apiece (yet not a single full-time environmental reporter).

What’s going on? According to several media critics, the latent function of the weather forecast is to reassure you that our boys (the “Sky Team”) are patrolling the heavens and carefully tracking any potential invaders. It’s 11 o’clock prayers, a psychological tonic. All is right with the world as you lay your head upon the pillow.

It’s no mistake that TV weather borrows the metaphorical ammunition of football and war. For, at its heart, U.S. culture is awash in fear and aggression. Has been for decades. And the “cultural cops,” be they Marines, Spartans or middle-aged weathermen with video map-clickers, are on guard, making us safe from “The Other.” Be they terrorists, a football rival or a storm.

Weather has become “the discourse of reactionary time,” says Alex Cockburn, social critic. Weather is supposed to be about our ability to “undergo or endure the action of the elements” in the open air. But weather reporters usually restrict analysis of those elements to the “natural” ones like H2O, lightning and tornadoes. Missing is coverage of human-made elements or compounds like sulfur dioxide, nitrogen dioxide, mercury and hundreds of toxic chemicals spewing from General Motors car assembly plants, the Lansing Board of Water and Light’s coal-fired utility or our car exhausts. These airborne elements, totaling hundreds of thousands of tons per year, account for untold levels of Lansing-area disease from cancer, hypertension and asthma.

There is some positive political movement around the edges of weather reporting. The cultural pressure on weathermen to report allergy alerts, ozone action days and high ultraviolet radiation days has highlighted the fact that, like it or not, weathermen are influential educators about nature and the environment.

Ironically, some local weathermen yearn to be seen as environmentalists. Channel 6 meteorologists highlight their relationship with the Ebersole Environmental Center, where once a month they escort a class of Lansing’s public school students for a nature study. Their Web site even has several links to interesting “Science and Astronomy” sites. Sadly, these fact-filled portals into the ecological and astronomical worlds are marginal to the TV show, where a de-politicized rhetoric of temperatures, clouds and the obvious abound.

Let’s imagine that TV weatherfolk really covered “the elements” in all their ecological diversity. Let’s fantasize about weathermen who enlighten, not just put us to sleep. Here are two items that I would have reported on last week:

October 2001 was the fourth wettest on record. It rained 5.69 inches. That equates to 123.5 million gallons of raw sewage that overflowed into the Grand River last month, a record amount for October.

On Thursday, Nov. 1, an environmental group named PEER (Public Employees for Environmental Responsibility) released the third report that was suppressed by the Ingham County Health Department (the others are on water and food). It found an asthma epidemic among African American youth and particularly high asthma rates in the 48915 ZIP code. (See the report at: http://www.peer.org/michigan/Ingham_air.pdf)

I’d include stories or guest spots by naturewatch folk in every broadcast. Wouldn’t it be nice to know that the red salamander had just come out of hibernation that day? Or that the full moon was rising on the “Give Peace a Chance” concert next Saturday night?

http://www.lansingcitypulse.com/lansing/archives/011107/health/index.html

She’s Alive… Beautiful… Finite… Hurting… Worth Dying for.

This is a non-commercial attempt to highlight the fact that world leaders, irresponsible corporations and mindless ‘consumers’ are combining to destroy life on earth. It is dedicated to all who died fighting for the planet and those whose lives are on the line today. The cut was put together by Vivek Chauhan, a young filmmaker, together with naturalists working with the Sanctuary Asia network (www.sanctuaryasia.com).

Why Global Warming Slowed in the 2000’s: Another Possible Explanation (Climate Central)

Published: July 21st, 2011
By Michael D. Lemonick

The world is getting progressively warmer, and the vast majority of evidence points to greenhouse gases spewed into the atmosphere by humans — carbon dioxide (CO2), especially — as the main culprit. But while the buildup of greenhouse gases has been steadily increasing, the warming goes in fits and starts. From one year to the next it might get a little warmer or a lot warmer, or even cooler.

That’s because greenhouse gases aren’t the whole story. Natural variations in sunlight and ocean currents; concentrations of particles in the air, manmade and otherwise; and even plain old weather variations can speed the warming up or slow it down, even as the underlying temperature trend continues upward. And while none of those factors is likely to change that trend over the long haul, scientists really want to understand how they affect projections of where our climate is heading.

The latest attempt to do so just appeared in Science Express, the online counterpart of the journal Science, where a team of climate scientists is reporting on their investigations of airborne particles, or aerosols, in the stratosphere. It’s well known, says co-author John Daniel, of the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo., that these particles have a cooling effect, since they reflect sunlight that would otherwise warm the planet.

Mt. Pinatubo’s eruption in the Philippines, in 1991. Credit: USGS.

It’s also well known that major volcanic eruptions, like Mt. Pinatubo’s in the Philippines in 1991, can pump lots of aerosols into the stratosphere — and indeed, Pinatubo alone temporarily cooled the planet for about two years. The explosion of Mt. Tambora in 1815 had even more catastrophic effects, which you can imagine given that 1816 came to be known as “the year without a summer.” But what lots of people thought, says Daniel, “is that since there haven’t been any eruptions on that scale recently, aerosols have become relatively unimportant for climate.”

That, says the study, is not true: even without major eruptions, aerosols in the stratosphere increased by about 7 percent per year from 2000 to 2010. Plug that figure into climate models, and they predict a reduction in the warming you’d otherwise expect from the rise in greenhouse gases by up to 20 percent.
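A 7-percent-per-year rise compounds quickly over a decade. A minimal back-of-the-envelope sketch (illustrative arithmetic only; the growth rate is the figure reported in the article, and the code is not from the study itself):

```python
# Compound growth of the stratospheric aerosol load at ~7% per year,
# the 2000-2010 rate reported in the study. Illustrative arithmetic only.
annual_growth = 0.07
years = 10

factor = (1 + annual_growth) ** years
print(f"Aerosol load after {years} years: {factor:.2f}x the starting level")
```

Compounding at 7 percent per year over ten years works out to roughly a doubling (about 1.97x), which helps explain why even eruption-free “background” aerosols can meaningfully offset greenhouse warming.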

In the real world, as it happens, the rise in temperature slowed during that same decade. “That,” says Daniel, “was the motivation for doing this research. It could have just been natural climate variability, but we wondered if it could be something else.” Some climate scientists attribute the slowdown to heat being temporarily stored in the deep oceans, but stratospheric aerosols could clearly be part of the answer as well.

Whether these aerosols are natural or manmade, however, is something the scientists didn’t address. Just last week, a paper in Proceedings of the National Academy of Sciences (PNAS) suggested the cause was a construction boom of coal-fired power plants in China over the same decade. The new study doesn’t necessarily contradict that. “Human emissions could play a role,” says Daniel, although the PNAS study was talking about aerosols in the lower atmosphere, not the stratosphere. “But even in the absence of colossal volcanic eruptions,” he says, “smaller eruptions could still add up.”

The other difference between the two studies is that the one from last week looked at the relatively slow temperature rise over the most recent decade and tried to tease out what might have changed since the previous decades, when the warming was faster. The new one took actual observations of aerosols and tried to predict what the temperature rise should be. That sort of approach tends to produce more credible results, since an incorrect prediction would stick out like a sore thumb.

Where the two studies emphatically agree is that if the level of aerosols goes down — due to a lull in eruptions, or a reduction in coal-plant pollution, or both — the pace of warming would likely pick up. That would mean that current projections for up to a 4.5°C increase in global average surface temperatures by the end of the century might turn out to be an underestimate. And if aerosol levels increase, the temperature in 2100 could be lower than everyone expects.

80 Percent of World Climate Data Are Not Computerized and Readily Available (Science Daily)

Science News

ScienceDaily (July 20, 2011) — In order to gain a better knowledge of climate variations, such as those caused by global warming, and be able to tackle them, we need to understand what happened in the recent past. This is the conclusion of a research study led by the Rovira i Virgili University (URV), which shows that the scientific community today is only able to access and analyse 20% of the recorded climate information held. The remaining data are not accessible in digital format.

Some climate data in Europe go back to the 17th Century, but “not even 20% of the information recorded in the past is available to the scientific community,” Manola Brunet, lead author of the study and a researcher at the URV’s Centre for Climate Change, said.

This situation is even worse in continents such as Africa and South America, where weather observations did not begin until the middle of the 19th Century. These are the results of a study published in Climate Research, which highlights the need to urgently recover all the information recorded in perishable formats.

“Failure to decipher the messages in the climate records of the past will result in socioeconomic problems, because we will be unable to deal with the current and future impacts of climate change and a hotter world,” says Brunet.

Spain, along with the USA, Canada, Holland and Norway, is one of a small number of countries which allows partial access to its historic climate data. The rest of the world does not make these data available to the scientific community or the general public, despite recommendations to this effect by the World Meteorological Organization (WMO).

In order to overcome the political and legal hurdles posed by this currently poor access, “governments should adopt a resolution within the United Nations on opening up their historical climate data,” the researcher suggests.

Predicting heat waves

Weather services in all countries are faced with the overwhelming job of converting all their paper-based historical climate information, which is stored in archives, libraries and research centres, into digital format. The wide range of forms in which the information is held makes access harder, as do the purposes for which the meteorological service itself was actually created.

“The main objective is to provide a weather service to the public, who want to know what the weather will be like the next day,” explains Brunet. This has led to climate science (which studies the range of atmospheric conditions characterising a region rather than focusing on weather forecasting) becoming the great ‘victim’, receiving fewer funds with which to digitise, develop and standardise data.

However, climate services do play a significant role in some European countries, the United States and Canada. It was these services that were able to explain last summer’s heat wave in Eastern Europe and put it into context, as well as the high temperatures recorded on the Old Continent in 2003.

“If we had access to all the historical data recorded, we would be able to evaluate the frequency with which these phenomena are likely to occur in the future with a higher degree of certainty,” the expert explains.

This kind of information is of scientific, social and economic interest, with insurance companies setting their premiums according to expected climate changes, for example. City councils and governments also “want to understand climate conditions and how these will change in future in order to improve land zoning and prevent urban development from taking place in areas likely to be affected by flooding,” concludes Brunet.

Science and truth have been cast aside by our desire for controversy (Guardian)

Last week’s report into media science coverage highlighted an over-reliance on pointless dispute

Robin McKie
The Observer, Sunday 24 July 2011

Thomas Huxley, the British biologist who so vociferously, and effectively, defended Darwin’s theory of natural selection in the 19th century, had a basic view of science. “It is simply common sense at its best – rigidly accurate in observation and merciless to fallacy in logic.”

It is as neat a description as you can get and well worth remembering when considering how science is treated by the UK media and by the BBC in particular. Last week, a study, written by geneticist Steve Jones, warned that far too often the corporation had failed to appreciate the nature of science and to make a distinction “between well-established fact and opinion”. In doing so, the corporation had given free publicity to marginal belief, he said.

Jones was referring to climate change deniers, anti-MMR activists, GM crop opponents and other fringe groups who have benefited from wide coverage despite the paucity of evidence that supports their beliefs. By contrast, scientists, as purveyors of common sense, have found themselves sidelined because producers wanted to create controversy and so skewed discussions to hide researchers’ near unanimity of views in these fields. In this way, the British public has been misled into thinking there is a basic division among scientists over global warming or MMR.

It is a problem that can be blamed on the media that believe, with some justification, that adversarial dispute is the best way to cover democracy in action. It serves us well with politics and legal affairs, but falls down badly when it comes to science because its basic processes, which rely heavily on internal criticism and disproof, are so widely misunderstood.

Yet there is nothing complicated about the business, says Robert May, the former UK government science adviser. “In the early stages of research, ideas are like hillocks on a landscape. So you design experiments to discriminate among them. Most hillocks shrink and disappear until, in the end, you are left with a single towering pinnacle of virtual certitude.”

The case of manmade climate change is a good example, adds May. “A hundred years ago, scientists realised carbon dioxide emissions could affect climate. Twenty years ago, we thought they were now having an impact. Today, after taking more and more measurements, we can see there is no other explanation for the behaviour of the climate. Humans are changing it. Of course, deniers disagree, but that’s because they hold fixed positions that have nothing to do with science.”

It is the scientist, not the denier, who is the real sceptic, adds Paul Nurse, president of the Royal Society. “When you carry out research, you cannot afford to cherry-pick data or ignore inconvenient facts. You have to be brutal. You also have to be sceptical about your own ideas and attack them. If you don’t, others will.”

When an idea reaches the stage where it’s almost ready to become a paper, it has therefore been subjected to savage scrutiny by its own authors and by their colleagues – and that is before writing has started. Afterwards, the paper goes to peer review where there is a further round of critical appraisal by a separate group of researchers. What emerges is a piece of work that has already been robustly tested – a point that is again lost in the media.

Over the centuries, this process has been honed to near perfection. By proposing and then attacking ideas and by making observations to test them, humanity has built up a remarkable understanding of the universe. The accuracy of Einstein’s theories of relativity, Crick and Watson’s double helix structure of DNA and plate tectonics were all revealed this way, though no scientist would admit these discoveries are the last word, as the palaeontologist Stephen Jay Gould once pointed out: “In science, ‘fact’ can only mean ‘confirmed to such a degree that it would be perverse to withhold provisional assent’,” he admitted.

Certainly, things can go wrong, as Huxley acknowledged. Science may be organised common sense but all too often a beautiful theory created this way has been skewered by “a single ugly fact”, as he put it. Think of Fred Hoyle’s elegant concept of a steady state universe that is gently expanding and eternal. The idea was at one time considered to be philosophically superior to its rival, the big bang theory that proposed the cosmos erupted into existence billions of years ago. The latter idea explained the expansion of the universe by recourse to a vast explosion. The former accounted for this expansion in more delicate, intriguing terms.

The steady state theory continued to hold its own until, in 1964, radio-astronomers Arno Penzias and Robert Woodrow Wilson noted interference on their radio telescope at the Bell Labs in New Jersey and tried to eliminate it. The pair went as far as shovelling out the pigeon droppings in the telescope and had the guilty pigeons shot (each blamed the other for giving the order). Yet the noise persisted. Only later did the two scientists realise what they were observing. The static hiss they were picking up was caused by a microwave radiation echo that had been set off when the universe erupted into existence after its big bang birth.

That very ugly fact certainly ruined Hoyle’s beautiful theory and, no doubt, his breakfast when he read about it in his newspaper. But then the pursuit of truth has always been a tricky and cruel business. “It is true that some things come along like that to throw scientists into a tizz but it doesn’t happen very often,” adds Jones. “The trouble is, the BBC thinks it happens every day.”

And this takes us to the nub of the issue: how should science be reported and recorded? How can you take a topic such as climate change, about which there is virtual unanimity of views among scientists, and keep it in the public eye? The dangers of rising greenhouse gas emissions have dramatic implications, after all. But simply reporting every tiny shrinkage in polar ice sheets or rise in sea levels will only alienate readers or viewers, a point acknowledged by May. “Newspapers, radio and TV have a duty to engage and there is no point in doing a lot of excellent reporting on a scientific issue if it is boring or trivial. The alternative is to trivialise or distort, thus subordinating substance in the name of attraction. It is a paradox for which I can see no answer.”

Jones agrees. “What we don’t want to do is go back to the days when fawning reporters asked great figures to declaim on scientific issues – or political ones, for that matter. On the other hand, we cannot continue to distort views in the name of balance.” It is a tricky business, but as former Times editor Charlie Wilson once told a member of staff upset at a task’s complexity: “Of course, it’s hard. If it was easy we would get an orang-utan to do it.”

Jones, in highlighting a specific problem for the BBC, has opened up a far wider, far more important issue – the need to find ways to understand how science works and to appreciate its insights and complexities. It certainly won’t be easy.