Tag archive: ciência (science)

Here comes the story of the Dylan fans (ki.se)

Updated on 2014-09-25. Published on 2014-09-25

Dylan fans: Jonas Frisén, Konstantinos Meletis, Jon Lundberg, Kenneth Chien and Eddie Weitzberg. Photo: Gustav Mårtensson

An internal contest has been under way among a small band of researchers at Karolinska Institutet: whoever succeeds in quoting Bob Dylan in the most scientific articles before going into retirement is the winner.

The story begins 17 years ago. Jon Lundberg and Eddie Weitzberg, today both professors at the Department of Physiology and Pharmacology at KI, had an article published in Nature Medicine with the title: ‘Nitric Oxide and Inflammation: The answer is blowing in the wind’.

“We both really like Bob Dylan so when we set about writing an article concerning the measurement of nitric oxide gas in both the respiratory tracts and the intestine, with the purpose of detecting inflammation, the title came up and it fitted there perfectly,” says Eddie Weitzberg.

Some years later they saw an article written by Jonas Frisén, Professor at the Department of Cell and Molecular Biology, together with Konstantinos Meletis, Research Assistant at the Department of Neuroscience. The subject of the article was whether blood cells can change and become nerve cells.

“The title was ‘Blood on the tracks: a simple twist of fate’ – the first part is the name of a Bob Dylan album and the second is one of its songs – and the article contained additional Dylan references,” points out Eddie Weitzberg.

Jon Lundberg and Eddie Weitzberg then succeeded in introducing ‘The times they are a-changin’ into the title of another article and, at the same time, sent an email to Jonas Frisén announcing the launch of an internal competition.

“The one who has written the most articles with Dylan quotes before going into retirement wins a lunch at the Solna restaurant Jöns Jacob,” explains Jon Lundberg.

Jonas Frisén and a colleague responded with the article ‘Eph receptors tangled up in two’ in Cell Cycle the same year, 2010, the title of which is inspired by Bob Dylan’s song ‘Tangled up in blue’. The following year, Jon Lundberg and Eddie Weitzberg countered with ‘Dietary nitrate – a slow train coming’ in The Journal of Physiology.

“This article also concluded with a paraphrase of Dylan: ‘We know something is happening, but we don’t know what it is – Do we, Dr Jones?’ where we jokingly addressed a British colleague with the same surname,” says Jon Lundberg.

Moreover, Kenneth Chien, Professor of Cardiovascular Research at the Department of Cell and Molecular Biology and the Department of Medicine, Huddinge, has also been quoting Bob Dylan but – until very recently – was completely unaware of the others’ articles. ‘Tangled up in blue: Molecular cardiology in the postmolecular era’ was published in Circulation in 1997, the same year that Jon Lundberg’s and Eddie Weitzberg’s first article with a Dylan quote was published.

When the five researchers met up in August to have their photo taken for this article, Bob Dylan was the obvious subject of conversation. They discussed eagerly who had read the Bob Dylan autobiography entitled Chronicles and enquired about the internal competition.

“The contest is open to everyone,” says Jon Lundberg. He goes on to explain that they usually draw attention to one another’s new articles via email.

The researchers also point out that it is primarily in review articles and commentaries that quotes can be used, since these articles are often slightly lighter in tone than other types.

“But it’s important that the quote is linked to the scientific content, that it reinforces the message and raises the quality of the article as such, not the reverse,” says Jonas Frisén.

What then is so special about Bob Dylan? Eddie Weitzberg thinks he merits a Nobel prize for Literature while Kenneth Chien compares him to a modern Shakespeare, though in music. But the researchers also draw parallels between Bob Dylan’s music and the world of research.

“A musician who merely continues down the same highway for 30 years is not one who many want to listen to. Good music is innovative, like Bob Dylan’s. And the same thing applies to good research. A researcher must also try to find new and different paths,” says Konstantinos Meletis.

Text: Lisa Reimegård

In the photo:

Jonas Frisén, Professor of Stem Cell Research at the Department of Cell and Molecular Biology. Member of the Nobel Assembly at Karolinska Institutet.

Konstantinos Meletis, Research Associate at the Department of Neuroscience, Karolinska Institutet.

Jon Lundberg, Professor of Nitric Oxide Pharmacologics at the Department of Physiology and Pharmacology, Karolinska Institutet.

Kenneth Chien, Professor at the Department of Cell and Molecular Biology and the Department of Medicine, Karolinska Institutet. Before being recruited to Karolinska Institutet, Dr. Kenneth Chien was a professor in the Department of Stem Cell and Regenerative Biology at Harvard University in Cambridge.

Eddie Weitzberg, Professor of Anesthesiology and Intensive Care Medicine at the Department of Physiology and Pharmacology, Karolinska Institutet.

For the IPCC, avoiding 2°C of warming is still possible (Folha de S.Paulo)

Since Monday, country representatives have been meeting in Copenhagen to debate the final wording of the IPCC’s 4th Synthesis Report

The last document to be produced by the UN climate panel this year is expected to adopt a less pessimistic tone about the planet’s chances of avoiding a temperature rise of more than 2°C this century, a limit considered dangerous.

Read the full story: http://www1.folha.uol.com.br/fsp/cienciasaude/193066-para-ipcc-evitar-aquecimento-de-2c-ainda-e-possivel.shtml

(Folha de S.Paulo)

*   *   *

2014 may be the hottest year in the planet’s history

Every month of 2014 except February recorded the highest temperatures since 1880, when the National Oceanic and Atmospheric Administration’s records began

Anyone who endured the Senegal-like heat of January, the sham winter of July and a summer-like October felt first-hand the symptoms of a warmer planet. It was not mere perception: there is in fact a strong chance that 2014 will unseat 2010 and become the hottest year on record, both on land and at the ocean surface.

The estimate comes from the National Oceanic and Atmospheric Administration (NOAA), the American meteorological agency that has compiled climate data since 1880. With the exception of February, every month of the year so far has been the hottest of its kind on record.

The full story is available at: http://zh.clicrbs.com.br/rs/noticias/noticia/2014/10/ano-de-2014-pode-ser-o-mais-quente-da-historia-do-planeta-4631023.html

(Zero Hora)

Anthropocene, Capitalocene, Cthulhucene: what characterizes a new epoch? (ClimaCom)

28/10/2014

The proposal to formalize a new epoch of the Earth raises questions about utility, responsibility and alternative ways of narrating the history of the world we live in

By Daniela Klebis

The impacts of human actions on the planet over the last 200 years have been so profound that they may justify defining a new epoch for the Earth: the Anthropocene. On 17 October, the International Commission on Stratigraphy (ICS) met in Berlin to continue discussions on formalizing this new epoch, with the final decision to be voted on only in 2016. Bureaucratic processes aside, the term has already been informally adopted by philosophers, archaeologists, historians, environmentalists and climate scientists, and in those circles the debate goes beyond the gathering of physical evidence to the question of the term’s usefulness: are we ready to embrace the epoch of humans?

The history of the Earth is divided into geological time scales, which are defined by the ICS, headquartered in Paris, France. These scales begin with vast spans of time called eons, which are divided into eras (such as the Mesozoic), then into periods (Jurassic, Neogene), epochs and, finally, ages. The first to signal the need for a new epoch, based on the indelible impact of human actions on the terrestrial landscape, was the atmospheric chemist Paul J. Crutzen, winner of the Nobel Prize in Chemistry in 1995. Crutzen suggested the term Anthropocene during a meeting of the International Geosphere-Biosphere Programme (IGBP) in Mexico in 2000. The event’s purpose was to discuss the problems of the Holocene, the epoch in which we have lived for about 11,700 years, since the end of the last ice age.

The hypothesis sustained by advocates of the new denomination rests on observations of the human-driven changes to the environment since 1800, whose geological evidence will have a long-term impact on the Earth’s history. And what evidence could justify adopting the term Anthropocene? “What we humans have mostly done in these two centuries is create things that did not exist in the 4.5 billion years of Earth’s history,” said the geologist Jan Zalasiewicz, chair of the ICS working group on the Anthropocene, at a colloquium in Sydney, Australia, in March this year.


Synthetic minerals, carbon fibers, plastics and concrete are some examples of new materials created by humans. Concrete, a material produced by mixing cement, sand, stone and water, has been spreading across the planet’s surface at a rate of 2 billion kilometers per year, according to the geologist. Below the surface, excavations for ores and oil have already opened more than 50 million kilometers of underground boreholes.

Beyond the physical changes, the excessive emission of carbon dioxide and other greenhouse gases resulting from human activity drives chemical changes in the atmosphere, with effects such as global warming, the melting of the polar ice caps and ocean acidification. The biosphere is also under scrutiny, since changes resulting from habitat loss, predatory activities and species invasions likewise alter the chemical and physical composition of environments.

The evidence of human impact, consistently documented in climate studies, was reinforced by the 5th Report of the Intergovernmental Panel on Climate Change (IPCC), published early this year with the consensus of 97% of scientists. More recently, on 30 September, a report published by the WWF (World Wildlife Fund), in partnership with the Zoological Society of London, showed that 52% of the Earth’s vertebrate animal population has disappeared over the last 40 years. Over the same period, the human population has doubled. “We are pushing the biosphere towards its 6th mass extinction,” warns Hans-Otto Pörtner of the Alfred Wegener Institute for Marine and Polar Research in Bremerhaven, Germany, a co-author of the ecosystems chapter of this year’s IPCC report. Pörtner is referring to the five great mass extinctions recorded in the last 540 million years, characterized by paleontologists as periods in which more than 75% of the planet’s species went extinct within a short geological interval.

“Two hundred years ago, things started to change enough to visibly impact the planet: population grew, and so did CO2 emissions,” notes Zalasiewicz. According to him, energy use grew 90-fold between 1800 and 2010, and we have already burned through roughly 200 million years’ worth of fossil deposits in coal, oil and gas. “Humans account for a third of all the Earth’s vertebrates. But this unprecedented domination over all other living beings is what makes this the human era,” he concludes.

Eileen Crist, a researcher in the Department of Science and Technology in Society at Virginia Tech, in the USA, challenges the choice of term, arguing that Anthropocene discourse fails to question human sovereignty and instead proposes technological approaches that could make human dominion sustainable. “By affirming the centrality of man (both as a causal force and as an object of concern) the Anthropocene shrinks the discursive space for challenging the domination of the biosphere, offering instead a techno-scientific field for its rationalization and a pragmatic appeal to resign ourselves to its actuality,” the researcher argued in an article published in 2013.

The Anthropocene discourse thus weaves together a series of themes: accelerating population growth that will surpass 10 billion inhabitants; economic growth and consumer culture as the dominant social model; technology as both inescapable destiny and salvation of human life on Earth; and the presupposition that human impact is a natural contingency of our condition as beings endowed with superior intelligence. Crist argues that this discourse masks a choice to rationalize the totalitarian regime of humans on the planet. “As a cohesive discourse, it blocks alternative forms of human life on Earth,” she notes.


Relationality

Donna Haraway, professor emerita at the University of California, Santa Cruz, USA, commented in September, during her participation in the colloquium The Thousand Names of Gaia, that this discussion is one of the “ways of searching for words that sound very big and yet are not big enough to grasp the continuity and precariousness of living and dying on this Earth.” Haraway is also one of the critics of the term Anthropocene. According to her, the Anthropocene implies an individual man who develops, and who develops a new world landscape, estranged from all other forms of life: a mistaken perception of a being that could exist without relating to the rest of the planet. “We must understand that in order to be one, we must be many. We become with other beings,” she comments.

For Haraway, this perception must be problematized, and the responsibility for the changes must be addressed; it lies precisely in the capitalist system we have created, which is what has driven human exploitation of the Earth: “The whole story could be the Capitalocene, not the Anthropocene,” she says. Such a perception, according to the philosopher, allows us to resist the sense of inescapability present in this discourse, as Crist noted above. “We are surrounded by the danger of assuming that everything is over, that nothing can happen,” she says.

Haraway points out, however, that it is necessary to evoke a sense of ongoingness, drawing on other narrative and conceptual possibilities. One of them would be the Cthulhucene, the philosopher’s own coinage. The expression comes from an H.P. Lovecraft short story, The Call of Cthulhu, about humans whose minds deteriorate when, in rituals to the god Cthulhu (a mixture of man, dragon and octopus that lies dormant beneath the South Pacific), they glimpse a reality different from the one they knew. At the beginning of the story, the American author writes: “The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents.” From this context, Donna Haraway explains that it is necessary to “destabilize worlds of thought with worlds of thought.” The Cthulhucene is not about adopting a transcendence, an idea of life or death: “it is about embracing the sinuous ongoingness of the earthly world, in its past, present and future. Yet such ongoingness means acknowledging that there is a very big problem and that it must be confronted. We must mourn what has happened, because it should not have occurred. But we do not have to continue down the same path,” she suggests.

Some Fear Ebola Outbreak Could Make Nation Turn to Science (The New Yorker)

Borowitz Report
OCTOBER 16, 2014
BY ANDY BOROWITZ

Photograph by William Thomas Cain/Getty

NEW YORK (The Borowitz Report)—There is a deep-seated fear among some Americans that an Ebola outbreak could make the country turn to science.

In interviews conducted across the nation, leading anti-science activists expressed their concern that the American people, wracked with anxiety over the possible spread of the virus, might desperately look to science to save the day.

“It’s a very human reaction,” said Harland Dorrinson, a prominent anti-science activist from Springfield, Missouri. “If you put them under enough stress, perfectly rational people will panic and start believing in science.”

Additionally, he worries about a “slippery slope” situation, “in which a belief in science leads to a belief in math, which in turn fosters a dangerous dependence on facts.”

At the end of the day, though, Dorrinson hopes that such a doomsday scenario will not come to pass. “Time and time again through history, Americans have been exposed to science and refused to accept it,” he said. “I pray that this time will be no different.”

Water crisis affronts Brazilian science (Mundo Sustentável)

23/10/2014 – 03:29

by André Trigueiro*

Photo: the Cantareira System reached zero volume in June 2014

It was not for lack of warning.

Beyond its immeasurable natural capital, Brazil has built up over time a robust stock of scientific knowledge about its biomes, ecosystems and river basins.

People of the caliber of José Lutzenberger, Augusto Ruschi and Aziz Ab’Saber (among many others who opened important new horizons of scientific investigation) revealed that nature behaves as a sophisticated interconnected system, in which certain kinds of intervention, apparently harmless, can cause enormous damage.

Were it not for their genius and the respect their scientific work commanded, they would have been crushed by the powerful of their day.

More than a few unscrupulous politicians and greedy businessmen tried at all costs to “deconstruct” (to use a fashionable word) their reputations.

They left a legacy of recognized importance that should inspire a new ethics in the development model, above all greater care in how certain public policies are conceived and applied.

It is curious, then, to imagine what Lutz, Ruschi and Ab’Saber would say today about this water crisis, unprecedented in Brazil’s history.

In 1980, in a book with the suggestive title “O Fim do Futuro?” (“The End of the Future?”), José Lutzenberger warned that “the loss of the protective vegetation cover, besides meaning the disappearance of habitats essential to the survival of fauna and of the most specialized and precious plant species, causes the hydric imbalance of water bodies (…) We are preparing for our country the same fate as that of the sub-Saharan belt.”

One of the first to foresee water scarcity worldwide, Augusto Ruschi issued repeated alerts, as in this 1986 text, about the impacts of deforestation on river flows, especially in the Amazon:

“Thirty-five years ago, I wrote that we were on the way to building the world’s second-largest desert in the Amazon. Today, that prediction is coming true. In the first year after clearing, it is a marvel: the soil remains fertile and yields are high. But then the organic matter is leached into the depths of the soil, and no plant reaches down there to retrieve it. Cerrado forms, then caatinga, and finally desert.”

One of Brazil’s most respected scientists, Aziz Ab’Saber openly denounced the absurdity of the new Forest Code, approved almost three years ago in the National Congress without scientific backing. And he predicted tragic consequences for water resources.

“It reveals a saddening ignorance of the sheer scale of the river networks of Brazil’s intertropical territory (…) Given the vastness of the territory, the real situation of its macro-biomes (the Brazilian Amazon, Tropical Atlantic Brazil, the Cerrados of Central Brazil, the Araucaria Plateau and the Mixed Grasslands of Subtropical Brazil) and its numerous mini-biomes, transition belts and relict ecosystems, any attempt to change the Forest Code must be conducted by competent, bioethically sensitive people.”

As we know, that is not what happened. The interests of the ruralist caucus prevailed.

For the record: deforestation in the Amazon between August and September rose 191%, according to data from the Imazon institute.

And the presidential candidates, what do they say?

Well, with each new day of the election campaign, Brazil has less water and less forest. And the priorities remain elsewhere.

But the legacy of Lutz, Ruschi and Ab’Saber keeps nagging. Until someone decides to pay attention and avert an even greater catastrophe.

Listen to the commentary on this subject on Rádio CBN.

* André Trigueiro is a journalist with a postgraduate degree in Environmental Management from Coppe-UFRJ, where he now teaches environmental geopolitics; professor and creator of the Environmental Journalism course at PUC-RJ; author of the book Mundo Sustentável – Abrindo Espaço na Mídia para um Planeta em Transformação; and editorial coordinator and co-author of the books Meio Ambiente no Século XXI and Espiritismo e Ecologia, the latter launched at the International Book Biennial in Rio de Janeiro by Editora FEB in 2009. He anchors Jornal das Dez and is editor-in-chief of the program Cidades e Soluções on Globo News. He is also a commentator for Rádio CBN and a volunteer contributor to Rádio Rio de Janeiro.

** Originally published on the Mundo Sustentável website.

(Mundo Sustentável)

International guidelines on animal use available in Portuguese (Jornal da Ciência)

One of the guidelines’ aims is to improve the reporting of research carried out with animals

The National Council for the Control of Animal Experimentation (CONCEA) has released, in a Portuguese version, the guidelines drawn up by the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) on how to report, in scientific articles, relevant data on the use of animals for scientific purposes, in line with international standards.

The ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) were developed as part of an NC3Rs initiative to improve the design, analysis and reporting of animal research, maximizing the information published and minimizing unnecessary studies. The guidelines were published in the journal PLOS Biology in June 2010 and are currently endorsed by scientific journals, funding agencies and learned societies.

One of the aims is to improve the reporting of research carried out with animals. Another is to improve the communication of scientific observations to the scientific community as a whole.

Access the ARRIVE guidelines in Portuguese.

(Jornal da Ciência)

Planting forests is a strategy for confronting global warming (Fapesp)

8 October 2014

By Karina Toledo

Agência FAPESP – In an opinion piece published in The New York Times on 19 September, Nadine Unger, a professor at Yale University, claimed that the scientific evidence for the benefits of reforestation and reduced deforestation in mitigating climate change is weak.

The piece provoked a strong reaction in the scientific community. On 22 September, a group of 31 researchers, several of them members of the United Nations’ Intergovernmental Panel on Climate Change (IPCC), released an open letter vehemently disagreeing with Unger’s statements.

A condensed version of the letter ran in The New York Times opinion section on 23 September, the same day the United Nations Climate Summit opened in New York.

In the reply letter, the scientists contest Unger’s claim that the “conventional wisdom” that planting trees helps fight global warming is incorrect. In her assessment, the measure could even aggravate the climate problem.

According to the scientists, forests have a climate-cooling effect because they store vast amounts of carbon in trunks, branches and leaves, and they can keep that carbon out of the atmosphere as long as they remain intact and healthy.

The group notes that forests also cool the atmosphere by converting solar energy into water vapor, which increases the reflection of solar radiation through cloud formation, a fact neglected in Unger’s work. They agree, in part, with the Yale professor of atmospheric chemistry’s statement that “the dark colors of trees absorb more solar energy and raise the temperature of the Earth’s surface.”

Unger argued that planting trees in the tropics could promote cooling, but that in colder regions it would cause warming.

“She (Unger) correctly points out that forests reflect less solar energy than snow, rock, pasture or bare soil, but she ignores the effect forests have of increasing the reflectivity of the sky above the land through clouds. This effect is greatest in the tropics,” the scientists wrote.

Unger said there is no scientific consensus on the impacts of the land-use change driven by agricultural expansion, nor on whether the resulting deforestation has helped cool or warm the planet.

“We cannot predict with certainty that large-scale reforestation would help control rising temperatures,” she said. The scientist had presented similar arguments in an article published in August in Nature Climate Change.

Unger also argued that the volatile organic compounds (VOCs) emitted by trees in response to environmental stressors interact with pollutants from the burning of fossil fuels, increasing the production of greenhouse gases such as methane and ozone.

Finally, the Yale scientist claimed that the carbon sequestered by trees during their growth returns to the atmosphere when they die, and that the oxygen produced during photosynthesis is consumed by the vegetation during nighttime respiration. “The Amazon is a closed system that consumes its own carbon and oxygen,” she argued.

Indisputable benefits

The scientists’ reply letter stresses that Unger’s own studies showed that any potential cooling effect from the reduction in volatile organic compound emissions that follows tree-cutting would be outweighed by the warming effect of the carbon emissions caused by deforestation.

“This week, the United Nations climate negotiations address the importance of continuing efforts to curb the degradation of tropical forests, which are an essential and inexpensive contribution to climate-change mitigation. The scientific basis for this important piece of the climate solution is solid. We strongly disagree with Professor Unger’s core message. We do agree, however, with her statement that forests offer indisputable benefits for biodiversity,” the scientists conclude.

The group of authors is led by Daniel Nepstad, executive director of the Earth Innovation Institute in the United States, a founder of the Amazon Environmental Research Institute (Ipam) and one of the authors of the IPCC’s fifth report.

The group also includes Reynaldo Victoria, professor at the University of São Paulo (USP) and member of the steering committee of the FAPESP Research Program on Global Climate Change, and Paulo Artaxo, professor at USP and one of the authors of the IPCC’s fifth report.

“Unger’s article in Nature Climate Change contains elementary errors and fails to take into account fundamental aspects, such as the role of tropical forests in cloud formation, which alters surface reflectivity and also helps regulate the hydrological cycle,” Artaxo told Agência FAPESP.

“This episode shows how science, when it neglects important aspects, can be very harmful from a public-policy standpoint. Reforestation and reduced deforestation are among the best strategies for reducing the effects of global warming,” he said.

Can Big Data Tell Us What Clinical Trials Don’t? (New York Times)

Illustration by Christopher Brand

When a helicopter rushed a 13-year-old girl showing symptoms suggestive of kidney failure to Stanford’s Packard Children’s Hospital, Jennifer Frankovich was the rheumatologist on call. She and a team of other doctors quickly diagnosed lupus, an autoimmune disease. But as they hurried to treat the girl, Frankovich thought that something about the patient’s particular combination of lupus symptoms — kidney problems, inflamed pancreas and blood vessels — rang a bell. In the past, she’d seen lupus patients with these symptoms develop life-threatening blood clots. Her colleagues in other specialties didn’t think there was cause to give the girl anti-clotting drugs, so Frankovich deferred to them. But she retained her suspicions. “I could not forget these cases,” she says.

Back in her office, she found that the scientific literature had no studies on patients like this to guide her. So she did something unusual: She searched a database of all the lupus patients the hospital had seen over the previous five years, singling out those whose symptoms matched her patient’s, and ran an analysis to see whether they had developed blood clots. “I did some very simple statistics and brought the data to everybody that I had met with that morning,” she says. The change in attitude was striking. “It was very clear, based on the database, that she could be at an increased risk for a clot.”
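
The article does not show what Frankovich’s analysis looked like, but the workflow it describes (pull a historical cohort, filter on matching symptoms, compare outcome rates) is easy to picture. Below is a minimal sketch in Python; the file name and column names are hypothetical stand-ins rather than any real hospital schema, and actual record data would need de-identification and clinical review.

    import pandas as pd
    from scipy.stats import fisher_exact

    # Hypothetical extract: one row per lupus patient seen in the past five
    # years, with 0/1 columns flagging symptoms and outcomes. Every name
    # here is illustrative, not a real schema.
    patients = pd.read_csv("lupus_patients_5yr.csv")
    for col in ["nephritis", "pancreatitis", "vasculitis", "thrombosis"]:
        patients[col] = patients[col].astype(bool)

    # Historical patients whose presentation matches the index case:
    # kidney involvement plus an inflamed pancreas and blood vessels.
    matched = patients[
        patients["nephritis"] & patients["pancreatitis"] & patients["vasculitis"]
    ]
    others = patients.drop(matched.index)

    # "Very simple statistics": compare clot incidence in the matched cohort
    # against the remaining lupus patients.
    table = [
        [int(matched["thrombosis"].sum()), int((~matched["thrombosis"]).sum())],
        [int(others["thrombosis"].sum()), int((~others["thrombosis"]).sum())],
    ]
    odds_ratio, p_value = fisher_exact(table)
    print(f"Clot rate, matched cohort: {matched['thrombosis'].mean():.1%}")
    print(f"Clot rate, other patients: {others['thrombosis'].mean():.1%}")
    print(f"Odds ratio {odds_ratio:.2f}, p = {p_value:.3f}")

The arithmetic is trivial; the point is that the evidence comes from records the hospital already holds.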

The girl was given the drug, and she did not develop a clot. “At the end of the day, we don’t know whether it was the right decision,” says Chris Longhurst, a pediatrician and the chief medical information officer at Stanford Children’s Health, who is a colleague of Frankovich’s. But they felt that it was the best they could do with the limited information they had.

A large, costly and time-consuming clinical trial with proper controls might someday prove Frankovich’s hypothesis correct. But large, costly and time-consuming clinical trials are rarely carried out for uncommon complications of this sort. In the absence of such focused research, doctors and scientists are increasingly dipping into enormous troves of data that already exist — namely the aggregated medical records of thousands or even millions of patients to uncover patterns that might help steer care.

The Tatonetti Laboratory at Columbia University is a nexus in this search for signal in the noise. There, Nicholas Tatonetti, an assistant professor of biomedical informatics — an interdisciplinary field that combines computer science and medicine — develops algorithms to trawl medical databases and turn up correlations. For his doctoral thesis, he mined the F.D.A.’s records of adverse drug reactions to identify pairs of medications that seemed to cause problems when taken together. He found an interaction between two very commonly prescribed drugs: The antidepressant paroxetine (marketed as Paxil) and the cholesterol-lowering medication pravastatin were connected to higher blood-sugar levels. Taken individually, the drugs didn’t affect glucose levels. But taken together, the side-effect was impossible to ignore. “Nobody had ever thought to look for it,” Tatonetti says, “and so nobody had ever found it.”
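
Tatonetti’s published methods are considerably more sophisticated than this (they correct for the biases that plague spontaneous reporting), but the core screening idea, comparing how often an outcome is reported when two drugs appear together versus across all reports, can be illustrated with a toy disproportionality check. The file and column names below are invented for the example and do not reflect the FDA’s actual schema.

    import pandas as pd

    # Hypothetical flat extract of adverse-event reports: one row per
    # report, a semicolon-separated drug list, and a 0/1 flag for the
    # outcome of interest. All names are invented for illustration.
    reports = pd.read_csv("adverse_event_reports.csv")
    reports["drugs"] = reports["drugs"].str.lower().str.split(";")

    has_pair = reports["drugs"].apply(
        lambda drugs: "paroxetine" in drugs and "pravastatin" in drugs
    )

    # Toy disproportionality check: is the outcome reported more often when
    # the two drugs appear together than across the database as a whole?
    rate_pair = reports.loc[has_pair, "hyperglycemia"].mean()
    rate_base = reports["hyperglycemia"].mean()
    print(f"Reporting rate with both drugs: {rate_pair:.1%}")
    print(f"Background reporting rate:      {rate_base:.1%}")
    print(f"Reporting ratio: {rate_pair / rate_base:.1f}x")

Screening every drug pair this way is just a loop over the drug vocabulary; the hard part, which the real algorithms address, is separating genuine signals from reporting bias and confounding.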

The potential for this practice extends far beyond drug interactions. In the past, researchers noticed that being born in certain months or seasons appears to be linked to a higher risk of some diseases. In the Northern Hemisphere, people with multiple sclerosis tend to be born in the spring, while in the Southern Hemisphere they tend to be born in November; people with schizophrenia tend to have been born during the winter. There are numerous correlations like this, and the reasons for them are still foggy — a problem Tatonetti and a graduate assistant, Mary Boland, hope to solve by parsing the data on a vast array of outside factors. Tatonetti describes it as a quest to figure out “how these diseases could be dependent on birth month in a way that’s not just astrology.” Other researchers think data-mining might also be particularly beneficial for cancer patients, because so few types of cancer are represented in clinical trials.

As with so much network-enabled data-tinkering, this research is freighted with serious privacy concerns. If these analyses are considered part of treatment, hospitals may allow them on the grounds of doing what is best for a patient. But if they are considered medical research, then everyone whose records are being used must give permission. In practice, the distinction can be fuzzy and often depends on the culture of the institution. After Frankovich wrote about her experience in The New England Journal of Medicine in 2011, her hospital warned her not to conduct such analyses again until a proper framework for using patient information was in place.

In the lab, ensuring that data-mining conclusions hold water can also be tricky. By definition, a medical-records database contains information only on sick people who sought help, so it is inherently incomplete. Such databases also lack the controls of a clinical study and are full of confounding factors that might trip up unwary researchers. Daniel Rubin, a professor of bioinformatics at Stanford, also warns that there have been no studies of data-driven medicine to determine whether it leads to positive outcomes more often than not. Because historical evidence is of “inferior quality,” he says, it has the potential to lead care astray.

Yet despite the pitfalls, developing a “learning health system” — one that can incorporate lessons from its own activities in real time — remains tantalizing to researchers. Stefan Thurner, a professor of complexity studies at the Medical University of Vienna, and his researcher, Peter Klimek, are working with a database of millions of people’s health-insurance claims, building networks of relationships among diseases. As they fill in the network with known connections and new ones mined from the data, Thurner and Klimek hope to be able to predict the health of individuals or of a population over time. On the clinical side, Longhurst has been advocating for a button in electronic medical-record software that would allow doctors to run automated searches for patients like theirs when no other sources of information are available.
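
Thurner and Klimek’s models go well beyond this, but their starting object, a comorbidity network whose edge weights count how often two diagnoses co-occur in the same patients, is simple to assemble. Here is a sketch against a hypothetical claims extract; the file name, column names and the 50-patient threshold are all assumptions for illustration.

    from itertools import combinations

    import networkx as nx
    import pandas as pd

    # Hypothetical claims extract: one row per (patient_id, diagnosis_code).
    # Real insurance data would require de-identification and careful
    # adjustment for age, sex and other confounders.
    claims = pd.read_csv("claims.csv")
    codes_by_patient = claims.groupby("patient_id")["diagnosis_code"].apply(set)

    # Edge weight = number of patients in whom the two diagnoses co-occur.
    graph = nx.Graph()
    for codes in codes_by_patient:
        for a, b in combinations(sorted(codes), 2):
            if graph.has_edge(a, b):
                graph[a][b]["weight"] += 1
            else:
                graph.add_edge(a, b, weight=1)

    # Keep only well-supported links, then list the most connected diagnoses.
    strong = nx.Graph()
    strong.add_edges_from(
        (u, v, d) for u, v, d in graph.edges(data=True) if d["weight"] >= 50
    )
    print(sorted(strong.degree, key=lambda nd: nd[1], reverse=True)[:10])

From a network like this one can begin to ask the group’s questions: which conditions cluster together, and which future diagnoses a patient’s current ones make more likely.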

With time, and with some crucial refinements, this kind of medicine may eventually become mainstream. Frankovich recalls a conversation with an older colleague. “She told me, ‘Research this decade benefits the next decade,’ ” Frankovich says. “That was how it was. But I feel like it doesn’t have to be that way anymore.”

Futures of the Past – The Appendix


“Futures of the Past” is an issue about how past generations have reckoned their collective futures. But it’s also about how the razor’s edge of the present comes up against the haziness of futurity, and what happens when that hazy future becomes inscribed, remembered, and—eventually—forgotten. We’re interested here in the work that the future does in shaping history—as a utopian dream, a set of collective anxieties, or simply as a story that we tell about where we come from and where we hope to end up.


Chapter 1: Bad Predictions


Chapter 2: Futures Past


Chapter 3: The Politics of the Future

The cultural side of science communication (Northwestern University)

30-Sep-2014

Hilary Hurd Anyaso

New research explores how culture affects our conceptions of nature

EVANSTON, Ill. — Do we think of nature as something that we enjoy when we visit a national park and something we need to “preserve”? Or do we think of ourselves as a part of nature? A bird’s nest is a part of nature, but what about a house?

The answers to these questions reflect different cultural orientations. They are also reflected in our actions, our speech and in cultural artifacts.

A new Northwestern University study, in partnership with the University of Washington, the American Indian Center of Chicago and the Menominee tribe of Wisconsin, focuses on science communication and how that discipline necessarily involves language and other media-related artifacts such as illustrations. The challenge is to identify effective ways of communicating information to culturally diverse groups in a way that avoids cultural polarization, say the authors.

“We suggest that trying to present science in a culturally neutral way is like trying to paint a picture without taking a perspective,” said Douglas Medin, lead author of the study and professor of psychology in the Weinberg College of Arts and Sciences and the School of Education and Social Policy at Northwestern.

This research builds on the broader research on cultural differences in the understanding of and engagement with science.

“We argue that science communication — for example, words, photographs and illustrations — necessarily makes use of artifacts, both physical and conceptual, and these artifacts commonly reflect the cultural orientations and assumptions of their creators,” write the authors.

“These cultural artifacts both reflect and reinforce ways of seeing the world and are correlated with cultural differences in ways of thinking about nature. Therefore, science communication must pay attention to culture and the corresponding different ways of looking at the world.”

Medin said their previous work reveals that Native Americans traditionally see themselves as a part of nature and tend to focus on ecological relationships. In contrast, European-Americans tend to see humans as apart from nature and focus more on taxonomic relationships.

“We show that these cultural differences are also reflected in media, such as children’s picture books,” said Medin, who co-authored the study with Megan Bang of the University of Washington. “Books authored and illustrated by Native Americans are more likely to have illustrations of scenes that are close-up, and the text is more likely to mention the plants, trees and other geographic features and relationships that are present compared with popular children’s books not done by Native Americans.

“The European-American cultural assumption that humans are not part of ecosystems is readily apparent in illustrations,” he said.

The authors went to Google Images and entered “ecosystems,” and 98 percent of the images did not include humans. A fair number of the remaining 2 percent showed children outside the ecosystem, observing it through a magnifying glass and saying, “I spy an ecosystem.”

“These results suggest that formal and informal science communications are not culturally neutral but rather embody particular cultural assumptions that exclude people from nature,” Medin said.

Medin and his research team have developed a series of “urban ecology” programs at the American Indian Center of Chicago, and these programs suggest that children can learn about the rest of nature in urban settings and come to see humans as active players in the world ecosystems.

Concea opens public consultation on animal-use guide (MCTI)

Society may suggest changes to proposed manuals for research and teaching with primates and for clinical studies conducted outside conventional facilities.

The National Council for the Control of Animal Experimentation (Concea) opened on Thursday (the 25th), with publication in the Federal Official Gazette (DOU), a 21-day public consultation on two chapters of the Brazilian Guide for the Production and Use of Animals in Teaching or Scientific Research Activities.

Approved in stages, the guide under preparation covers topics for birds, dogs, cats, lagomorphs (such as rabbits and hares) and rodents, among other taxonomic groups.

The chapters under consultation deal with “non-human primates” and “field clinical studies”. Suggested changes to the texts must be detailed and justified on the forms available on the council’s website and then sent to the email address consultapubl.concea@mcti.gov.br.

“This participation by society is important because the guide will form the basis for defining the requirements for licensing research and teaching activities with animals, without which the use of a given species will not be permitted, as established in the Arouca Law,” stresses Concea coordinator José Mauro Granjeiro.

The two chapters are expected to incorporate society’s input before Concea’s 26th Ordinary Meeting, on 26 and 27 November, when the collegiate body plans to assess the content and approve the final documents, to be published in the DOU. In the following months, other parts of the guide are scheduled to go to public consultation, covering further taxonomic groups such as fish, ruminants, horses, pigs, reptiles and amphibians.

Also on Thursday, a list of 17 methods to replace or reduce the use of animals in toxicological testing was published. Divided into seven groups, the techniques measure the potential for skin and eye irritation and corrosion, phototoxicity, skin absorption and sensitization, acute toxicity and genotoxicity.

Primates – At 73 pages, the chapter on non-human primates addresses the relevance of this group of animals to analyses of viral diseases and to biomedical research. The text links their “close phylogenetic relationship with humans” to their use in comparative studies of human diseases.

The guide details minimum requirements for facilities, from the physical structure of housing to breeding and experimentation areas, including environmental conditions, as well as handling procedures such as adequate feeding, hygiene of cages and objects, methods of physical restraint, environmental enrichment and preventive medicine. Experimental methods, veterinary care and animal-welfare principles also make up the chapter on primates.

“In general, regardless of the purpose of primate breeding, housing should consist of a complex and stimulating enclosure that promotes good health and psychological well-being and that provides full opportunity for social interaction, exercise and the expression of the variety of behaviors and abilities inherent to the species,” the text states. “A satisfactory enclosure must give the animals enough space to maintain their normal habits of locomotion and behavior.”

Field studies – The other document under public consultation is intended to guide researchers and define the minimum requirements for conducting “field clinical studies” (those carried out outside animal-use facilities) with regard to the ethical aspects of species handling and welfare.

“Considering that one of Concea’s missions is to ensure that animals used in any type of scientific research have their integrity and welfare preserved, studies conducted outside the controlled environments of facilities for the use of animals in teaching or research activities must comply with the applicable rules,” the guide states.

Created in 2008, Concea is a multidisciplinary collegiate body with normative, consultative, deliberative and appellate functions. Its responsibilities include, besides accrediting institutions that carry out activities in the sector, formulating rules on the humane use of animals for teaching and scientific research, and establishing procedures for the installation and operation of breeding centers, animal facilities and animal experimentation laboratories.

(MCTI)

What’s next for climate science beyond the IPCC? (Sci Dev Net)

23/09/14


In the lead-up to December’s 20th UN Conference of the Parties on climate change, scientists and policymakers are reflecting on the future of climate science. Many are questioning whether the existing mechanisms that feed scientific evidence into international politics are working well enough.

In this interview Ilan Kelman argues that, despite its important work, the Intergovernmental Panel on Climate Change, with its consensus-based approach, is no longer suited to the new challenges posed by climate change.

Indian scientists significantly more religious than UK scientists (Science Daily)

Date: September 24, 2014

Source: Rice University

Summary: Indian scientists are significantly more religious than United Kingdom scientists, according to the first cross-national study of religion and spirituality among scientists.



The U.K. and India results from the Religion Among Scientists in International Context (RASIC) study were presented at the Policies and Perspectives: Implications From the Religion Among Scientists in International Context Study conference held today in London. Rice University’s Religion and Public Life Program and Baker Institute for Public Policy sponsored the conference. The U.K. results were also presented at the Uses and Abuses of Biology conference Sept. 22 at Cambridge University’s Faraday Institute in Cambridge, England.

The surveys and in-depth interviews with scientists revealed that while 65 percent of U.K. scientists identify as nonreligious, only 6 percent of Indian scientists identify as nonreligious. In addition, while only 12 percent of scientists in the U.K. attend religious services on a regular basis — once a month or more — 32 percent of scientists in India do.

Elaine Howard Ecklund, Rice’s Autrey Professor of Sociology and the study’s principal investigator, said the U.K. and India data are being released simultaneously because of the history between the U.K. and India. She noted that their differences are quite interesting to compare.

“India and the U.K. are at the same time deeply intertwined historically while deeply different religiously,” Ecklund said. “There is a vastly different character of religion among scientists in the U.K. than in India — potentially overturning the view that scientists are universal carriers of secularization.”

Despite the number of U.K. scientists identifying themselves as nonreligious, 49 percent of U.K. survey respondents acknowledged that there are basic truths in many religions. In addition, 11 percent of U.K. survey respondents said they do believe in God without any doubt, and another 8 percent said they believe in a higher power of some kind.

Ecklund noted that although the U.K. is known for its secularism, scientists in particular are significantly more likely to identify as not belonging to a religion than members of the general population.

“According to available data, only 50 percent of the general U.K. population responded that they did not belong to a religion, compared with 65 percent of U.K. scientists in the survey,” Ecklund said. “In addition, 47 percent of the U.K. population report never attending religious services compared with 68 percent of scientists.”

According to the India survey, 73 percent of scientists responded that there are basic truths in many religions, 27 percent said they believe in God and 38 percent expressed belief in a higher power of some kind. However, while only 4 percent of the general Indian population said they never attend religious services, 19 percent of Indian scientists said they never attend.

“Despite the high level of religiosity evident among Indian scientists when it comes to religious affiliation, we can see here that when we look at religious practices, Indian scientists are significantly more likely than the Indian general population to never participate in a religious service or ritual, even at home,” Ecklund said.

Although there appear to be striking differences in the religious views of U.K. and Indian scientists, less than half of both groups (38 percent of U.K. scientists and 18 percent of Indian scientists) perceived conflict between religion and science.

“When we interviewed Indian scientists in their offices and laboratories, many quickly made it clear that there is no reason for religion and science to be in conflict; for some Indian scientists, religious beliefs actually lead to a deeper sense of doing justice through their work as scientists,” Ecklund said. “And even many U.K. scientists who are themselves not personally religious still do not think there needs to be a conflict between religion and science.”

The U.K. survey included 1,581 scientists, representing a 50 percent response rate. The India survey included 1,763 scientists from 159 universities and/or research institutions. Both surveys also utilized population data from the World Values Survey to make comparisons with the general public. In addition, the researchers conducted nearly 200 in-depth interviews with U.K. and Indian scientists, many of these in person.

The complete study will include a survey of 22,000 biologists and physicists at different points in their careers at top universities and research institutes in the U.S., U.K., Turkey, Italy, France, India, Hong Kong and Taiwan — nations that have very different approaches to the relationship between religious and state institutions, different levels of religiosity and different levels of scientific infrastructure. Respondents were randomly selected from a sampling frame of nearly 50,000 scientists and compiled by undergraduate and graduate students at Rice University through an innovative sampling process. The study will also include qualitative interviews with 700 scientists. The entire RASIC study will be completed by the end of 2015.

When their research has social implications, how should climate scientists get involved? (The Guardian)

Scientists prefer to stick to research, but sometimes further involvement is warranted

Thursday 4 September 2014 14.00 BST

Laboratory technician in a lab; the natural habitat of scientists. Photograph: David Burton/Alamy

First, at the end of this post is a question to my readers wherein I ask for feedback. So, please read to the end.

Most scientists go into their studies because they want to understand the world. They want to know why things happen; also how to describe phenomena, both mathematically and logically. But, as scientists carry out their research, often their findings have large social implications. What do they do when that happens?

Well, traditionally, scientists just “stick to the facts” and report. They try to avoid making recommendations, policy or otherwise, based on their findings. But as the social implications of various issues grow larger (environmental, energy, medical, etc.), it becomes harder for scientists to sit out of the more public discussions about what should be done. In fact, researchers who have a clear handle on an issue and the pros and cons of different choices can offer society very valuable perspectives.

But what does involvement look like? For some scientists, it may be helping reporters gather information for stories that may appear online, in print, radio, or television. In another manifestation, it might be writing for themselves (like my blog here at the Guardian). Others may write books, meet with legislators, or partake in public demonstrations.

Each of these levels of engagement has professional risks. We scientists need to protect our professional reputations. That reputation requires that we are completely objective in our science. As a scientist becomes more engaged in advocacy, they risk being viewed by their colleagues as non-objective in their science.

Of course, this isn’t true. It is possible (and easy) to convey the science and also convey the importance of taking action. I do this on a daily basis. But I will go further here: it is essential for scientists to speak out. Since we have the expertise needed for informed decisions, speaking out is our obligation to society. Of course, each scientist has to decide how to become engaged. We get few kudos for engagement; it takes time and money away from our research; you will never get tenure for having a more public presence; and you will likely receive poorly written hate mail. But it is still needed for informed decision-making.

One very public activity some scientists engage in is public events and demonstrations. A large such event is going to occur this September in New York (September 21 – the People’s Climate March). Just a few days before the UN Climate Summit, the Climate March hopes to bring thousands of people from the faith, business, health, agriculture, and science communities together. Scientists will certainly be there – and those scientists should be lauded. I am encouraging my colleagues to participate in events like this.

Okay, so now the poll (sort of). I have been writing this blog for over a year – something like 60 posts. Approximately half of those posts are on actual science: new studies that shed light on our ever-expanding understanding of the Earth’s climate. Another sizeable number of posts are reviews of books, movies, projects, and the like. A third category deals with how climate impacts different locations around the globe. In this group, I’ve written about climate change in Uganda, Kenya, and Cameroon – climate change effects that I’ve witnessed with my own eyes. A fourth category that I just started focuses on specific scientists telling how they got into climate change. Finally, I write a few posts debunking bad science and misguided public statements.

In truth, I prefer the harder science, but frankly those posts do not get as many page views as the debunking ones. So I am asking here for suggested topics for future posts. I have a few in the queue, but I am always looking for engaging topics and angles. You can send them to me at my university email address, jpabraham@stthomas.edu. Also, feel free to comment on the current mix of stories. Is 50% hard science the right mix? Is it too much? Too little? Is my writing too technical? Not technical enough? Let me hear your thoughts.

Wittgenstein’s forgotten lesson (Prospect Magazine)

Wittgenstein’s philosophy is at odds with the scientism which dominates our times. Ray Monk explains why his thought is still relevant.

by Ray Monk / July 20, 1999

Published in July 1999 issue of Prospect Magazine

Ludwig Wittgenstein is regarded by many, including myself, as the greatest philosopher of this century. His two great works, Tractatus Logico-Philosophicus (1921) and Philosophical Investigations (published posthumously in 1953) have done much to shape subsequent developments in philosophy, especially in the analytic tradition. His charismatic personality has fascinated artists, playwrights, poets, novelists, musicians and even movie-makers, so that his fame has spread far beyond the confines of academic life.

And yet in a sense Wittgenstein’s thought has made very little impression on the intellectual life of this century. As he himself realised, his style of thinking is at odds with the style that dominates our present era. His work is opposed, as he once put it, to “the spirit which informs the vast stream of European and American civilisation in which all of us stand.” Nearly 50 years after his death, we can see, more clearly than ever, that the feeling that he was swimming against the tide was justified. If we wanted a label to describe this tide, we might call it “scientism,” the view that every intelligible question has either a scientific solution or no solution at all. It is against this view that Wittgenstein set his face.

Scientism takes many forms. In the humanities, it takes the form of pretending that philosophy, literature, history, music and art can be studied as if they were sciences, with “researchers” compelled to spell out their “methodologies”—a pretence which has led to huge quantities of bad academic writing, characterised by bogus theorising, spurious specialisation and the development of pseudo-technical vocabularies. Wittgenstein would have looked upon these developments and wept.

There are many questions to which we do not have scientific answers, not because they are deep, impenetrable mysteries, but simply because they are not scientific questions. These include questions about love, art, history, culture, music: all questions, in fact, that relate to the attempt to understand ourselves better. There is a widespread feeling today that the great scandal of our times is that we lack a scientific theory of consciousness. And so there is a great interdisciplinary effort, involving physicists, computer scientists, cognitive psychologists and philosophers, to come up with tenable scientific answers to the questions: what is consciousness? What is the self? One of the leading competitors in this crowded field is the theory advanced by the mathematician Roger Penrose, that a stream of consciousness is an orchestrated sequence of quantum physical events taking place in the brain. Penrose’s theory is that a moment of consciousness is produced by a sub-protein in the brain called a tubulin. The theory is, on Penrose’s own admission, speculative, and it strikes many as being bizarrely implausible. But suppose we discovered that Penrose’s theory was correct, would we, as a result, understand ourselves any better? Is a scientific theory the only kind of understanding?

Well, you might ask, what other kind is there? Wittgenstein’s answer to that, I think, is his greatest, and most neglected, achievement. Although Wittgenstein’s thought underwent changes between his early and his later work, his opposition to scientism was constant. Philosophy, he writes, “is not a theory but an activity.” It strives, not after scientific truth, but after conceptual clarity. In the Tractatus, this clarity is achieved through a correct understanding of the logical form of language, which, once achieved, was destined to remain inexpressible, leading Wittgenstein to compare his own philosophical propositions with a ladder, which is thrown away once it has been used to climb up on.

In his later work, Wittgenstein abandoned the idea of logical form and with it the notion of ineffable truths. The difference between science and philosophy, he now believed, is between two distinct forms of understanding: the theoretical and the non-theoretical. Scientific understanding is given through the construction and testing of hypotheses and theories; philosophical understanding, on the other hand, is resolutely non-theoretical. What we are after in philosophy is “the understanding that consists in seeing connections.”

Non-theoretical understanding is the kind of understanding we have when we say that we understand a poem, a piece of music, a person or even a sentence. Take the case of a child learning her native language. When she begins to understand what is said to her, is it because she has formulated a theory? We can say that if we like—and many linguists and psychologists have said just that—but it is a misleading way of describing what is going on. The criterion we use for saying that a child understands what is said to her is that she behaves appropriately: she shows that she understands the phrase “put this piece of paper in the bin,” for example, by obeying the instruction.

Another example close to Wittgenstein’s heart is that of understanding music. How does one demonstrate an understanding of a piece of music? Well, perhaps by playing it expressively, or by using the right sort of metaphors to describe it. And how does one explain what “expressive playing” is? What is needed, Wittgenstein says, is “a culture”: “If someone is brought up in a particular culture, and then reacts to music in such-and-such a way, you can teach him the use of the phrase ‘expressive playing.’” What is required for this kind of understanding is a form of life, a set of communally shared practices, together with the ability to hear and see the connections made by the practitioners of this form of life.

What is true of music is also true of ordinary language. “Understanding a sentence,” Wittgenstein says in Philosophical Investigations, “is more akin to understanding a theme in music than one may think.” Understanding a sentence, too, requires participation in the form of life, the “language-game,” to which it belongs. The reason computers have no understanding of the sentences they process is not that they lack sufficient neuronal complexity, but that they are not, and cannot be, participants in the culture to which the sentences belong. A sentence does not acquire meaning through the correlation, one to one, of its words with objects in the world; it acquires meaning through the use that is made of it in the communal life of human beings.

All this may sound trivially true. Wittgenstein himself described his work as a “synopsis of trivialities.” But when we are thinking philosophically we are apt to forget these trivialities and thus end up in confusion, imagining, for example, that we will understand ourselves better if we study the quantum behaviour of the sub-atomic particles inside our brains, a belief analogous to the conviction that a study of acoustics will help us understand Beethoven’s music. Why do we need reminding of trivialities? Because we are bewitched into thinking that if we lack a scientific theory of something, we lack any understanding of it.

One of the crucial differences between the method of science and the non-theoretical understanding that is exemplified in music, art, philosophy and ordinary life, is that science aims at a level of generality which necessarily eludes these other forms of understanding. This is why the understanding of people can never be a science. To understand a person is to be able to tell, for example, whether he means what he says or not, whether his expressions of feeling are genuine or feigned. And how does one acquire this sort of understanding? Wittgenstein raises this question at the end of Philosophical Investigations. “Is there,” he asks, “such a thing as ‘expert judgment’ about the genuineness of expressions of feeling?” Yes, he answers, there is.

But the evidence upon which such expert judgments about people are based is “imponderable,” resistant to the general formulation characteristic of science. “Imponderable evidence,” Wittgenstein writes, “includes subtleties of glance, of gesture, of tone. I may recognise a genuine loving look, distinguish it from a pretended one… But I may be quite incapable of describing the difference… If I were a very talented painter I might conceivably represent the genuine and simulated glance in pictures.”

But the fact that we are dealing with imponderables should not mislead us into believing that all claims to understand people are spurious. When Wittgenstein was once discussing his favourite novel, The Brothers Karamazov, with Maurice Drury, Drury said that he found the character of Father Zossima impressive. Of Zossima, Dostoevsky writes: “It was said that… he had absorbed so many secrets, sorrows, and avowals into his soul that in the end he had acquired so fine a perception that he could tell at the first glance from the face of a stranger what he had come for, what he wanted and what kind of torment racked his conscience.” “Yes,” said Wittgenstein, “there really have been people like that, who could see directly into the souls of other people and advise them.”

“An inner process stands in need of outward criteria,” runs one of the most often quoted aphorisms of Philosophical Investigations. It is less often realised what emphasis Wittgenstein placed on the need for sensitive perception of those “outward criteria” in all their imponderability. And where does one find such acute sensitivity? Not, typically, in the works of psychologists, but in those of the great artists, musicians and novelists. “People nowadays,” Wittgenstein writes in Culture and Value, “think that scientists exist to instruct them, poets, musicians, etc. to give them pleasure. The idea that these have something to teach them – that does not occur to them.”

At a time like this, when the humanities are institutionally obliged to pretend to be sciences, we need more than ever the lessons about understanding that Wittgenstein—and the arts—have to teach us.

Survey confirms scientific consensus on human-caused global warming (Skeptical Science)

Posted on 11 August 2014 by Bart Verheggen

This is a repost from Bart Verheggen’s blog.

  • A survey among more than 1800 climate scientists confirms that there is widespread agreement that global warming is predominantly caused by human greenhouse gases.
  • This consensus strengthens with increased expertise, as defined by the number of self-reported articles in the peer-reviewed literature.
  • The main attribution statement in IPCC AR4 may lead to an underestimate of the greenhouse gas contribution to warming, because it implicitly includes the lesser-known masking effect of cooling aerosols.
  • Self-reported media exposure is higher for those who are skeptical of a significant human influence on climate.

In 2012, while temporarily based at the Netherlands Environmental Assessment Agency (PBL), my colleagues and I conducted a detailed survey about climate science. More than 1800 international scientists studying various aspects of climate change, including e.g. climate physics, climate impacts and mitigation, responded to the questionnaire. The main results of the survey have now been published in Environmental Science and Technology (doi: 10.1021/es501998e).

Level of consensus regarding attribution

The answers to the survey showed a wide variety of opinions, but it was clear that a large majority of climate scientists agree that anthropogenic greenhouse gases are the dominant cause of global warming. Consistent with other research, we found that the consensus is strongest for scientists with more relevant expertise and for scientists with more peer-reviewed publications. 90% of respondents with more than 10 climate-related peer-reviewed publications (about half of all respondents) agreed that anthropogenic greenhouse gases (GHG) are the dominant driver of recent global warming. This is based on two different questions, one of which was phrased in similar terms as the quintessential attribution statement in IPCC AR4 (stating that more than half of the observed warming since the 1950s is very likely caused by GHG).

[Verheggen et al., Figure 1: GHG contribution to global warming]

Literature analyses (e.g. Cook et al., 2013; Oreskes et al., 2004) generally find a stronger consensus than opinion surveys such as ours. This is related to the stronger consensus among highly published – and arguably the most expert – climate scientists. The strength of literature surveys lies in the fact that they sample the prime locus of scientific evidence and thus they provide the most direct measure of the consilience of evidence. On the other hand, opinion surveys such as ours can achieve much more specificity about what exactly is agreed upon and where the disagreement lies. As such, these two methods for quantifying scientific consensus are complementary. Our questions possibly set a higher bar for what’s considered the consensus position than some other studies. Furthermore, contrarian viewpoints were likely overrepresented in our study compared with others.

No matter how you slice it, scientists overwhelmingly agree that recent global warming is to a great extent human caused.

 

[Verheggen et al., Figure 3: consensus results]

The concept of ‘consensus’ has been discussed a lot lately. Whereas the presence of widespread agreement is obviously not proof of a theory being correct, it can’t be dismissed as irrelevant either: As the evidence accumulates and keeps pointing in the same general direction, the experts’ opinion will converge to reflect that, i.e. a consensus emerges. A theory either rises to the level of consensus or it is abandoned, though it may take considerable time for the scientific community to accept a theory, and even longer for the public at large.

Greenhouse warming versus aerosol cooling

By phrasing Question 1 analogously to the well-known attribution statement of AR4, we found something peculiar: respondents who were more aware of the cooling effect of aerosols more often assessed the greenhouse gas contribution to recent warming as being larger than the observed warming (consistent with the IPCC assessments). We concluded that the AR4 attribution statement may lead people to underestimate the isolated greenhouse gas contribution. The comparable AR5 statement is an improvement in this respect.

Media exposure

Respondents were also asked about the frequency of being featured in the media regarding their views on climate change. Respondents who thought climate sensitivity was low (less than 1.75 degrees C per doubling of CO2) reported the most frequent media coverage. Likewise, those who thought greenhouse gases had only made an insignificant contribution to observed warming reported the most frequent media coverage. This shows that contrarian opinions are amplified in the media relative to their prevalence in the scientific community. This is related to what is sometimes referred to as “false balance” in media reporting and may partly explain the divergence between public and scientific opinion regarding climate change (the so-called “consensus gap”).

[Verheggen et al., Figure S13c: media exposure vs. ECS estimate]

Survey respondents

Respondents were selected on the basis of a few criteria: having authored articles with the key words ‘global warming’ and/or ‘global climate change’ in the Web of Science over the 1991–2011 period. This is the same database used by Cook et al. in their recent ERL study (John Cook is a co-author of the present study as well). Respondents were also selected based on inclusion in the climate scientist database assembled by Jim Prall, as well as by surveying the recent climate science literature. Prall’s database includes signatories of public statements disapproving of mainstream climate science. They were included in our survey to ensure that the main criticisms of climate science would be included. This last group amounts to less than 5% of the total number of respondents, about half of whom have published on climate change only in the gray literature.

Survey questions

Detailed questions were posed about a variety of physical climate science issues that feature in the public debate about climate change. Answer options reflected a variety of viewpoints, all of which were phrased as specifically and neutrally as possible. Before the survey was conducted, the questions and answers (pdf) were reviewed by physical and social scientists and by public commentators on climate change with a wide range of opinions (see the acknowledgements for a list of names), to minimize the chance of bias.

Comments on the survey by respondents varied: some said it was slanted towards the ‘alarmist’ side (“Obviously these questions were posed by warmists”), but more respondents commented that they thought it was slanted towards the ‘skeptical’ side (“I suspect this survey comes from the denial lobby”).

——————-

Reference: Bart Verheggen, Bart Strengers, John Cook, Rob van Dorland, Kees Vringer, Jeroen Peters, Hans Visser, and Leo Meyer, Scientists’ Views about Attribution of Global Warming, Environmental Science and Technology, 2014. DOI: 10.1021/es501998e. Supporting Information available here. The article is open access.

An FAQ for this article is here.

Science in the service of the exploitation of nature and workers (Portal do Meio Ambiente)

PUBLISHED 30 JULY 2014.

Panel: Does destruction have a price? Can the guarantees of Science be trusted? Oil extraction (from Yasuni to Coari / Juruá); mining (from Carajás to Madre de Dios). Lindomar Padilha (CIMI); Barbara Silva (community communication activist in the Pan-Amazon); Raimundo G. Neto (CEPASP/Movimento dos Atingidos por Mineração); Simeon Velarde (Vanguardia Amazónica-Peru); Ana Patrícia (COMIN)

On the morning of 24 July, a panel was held on the theme “Does destruction have a price? Can the guarantees of Science be trusted? Oil extraction (from Yasuni to Coari / Juruá); mining (from Carajás to Madre de Dios).”

Barbara Silva, a community communication activist in the Pan-Amazon, highlighted Petrobrás’s operations in the Ecuadorian Amazon and their impacts on the forest and on Ecuadorian communities: “Petrobrás acts differently in other countries. In Ecuador, Bolivia and Colombia it does what it does not do in Brazil: it invades indigenous lands, falsifies technical reports and contaminates water and soil, harming the health and the economy of entire populations.”

Barbara Silva (community communication activist in the Pan-Amazon)

Silva also calls on us to think about the relationship between humankind and nature through a term that goes beyond the idea of caring for nature: “Austerity imposes a way of acting on the care that nature needs. To think about what we want for the Amazon region is to think about the way we live. Consuming less is an individual action that reflects our care for nature,” she concluded.

“What we need to advance is the ‘loss of innocence’: the Brazilian State is not on the side of the Brazilian working people, neither yesterday nor today,” says Raimundo Neto (CEPASP/Movimento dos Atingidos por Mineração), after giving an overview of mining policies and projects in Pará.

Lindomar Padilha (CIMI); Simeon Velarde (Vanguardia Amazónica-Peru), Ana Patrícia (COMIN)

Simeon Velarde, of Vanguardia Amazónica-Peru, says that the oil company Pluspetrol contaminates the rivers of the Peruvian Amazon while claiming to do so responsibly. “Peru is rich in raw materials, in oil, gas and ore, and this produces economic growth that looks attractive for the country, but that growth is not redistributed socially. They say they will build schools and youth-inclusion programmes, but it does not happen. The president goes on the media to defend these companies, saying that with them the country will have more development, and he keeps lying to the population.”

Photos: Talita Oliveira

Source: ADUFAC.

Animals: science in the service of life (O Globo)

JC e-mail 4993, 21 July 2014

Article by Paulo Gadelha and Wilson Savino published in O Globo

Public perception of the sciences, and the capacity to influence policies for their development, are essential conditions of citizenship in the contemporary world. It is in the field of ethical implications, above all, that this challenge becomes imperative. Animal experimentation is, in this sense, an exemplary case.

In recent years, some segments of society have rejected the use of animals in science, and these movements often find resonance in the legal sphere as well. There are great hopes for a world in which the use of animals for scientific experimentation is no longer necessary. The scientific community shares this wish. Nevertheless, much misinformation still circulates among the arguments in play. Clarifying what is true and what is myth is essential if society is to take an informed position on the subject.

At the current stage of world science, and in the field of human health in particular, the use of animals remains indispensable for elucidating biological processes and for discovering new drugs, vaccines and treatments for disease. The increase in life expectancy and the improvement in quality of life that we see in the population are owed, to a great extent, to medical innovations that depended, and still largely depend, on the use of animals.

Looking to the future, it is impossible, without the use of animals, to elucidate how the brain works and the mechanisms of neurodegenerative diseases such as Alzheimer’s, or to guarantee the efficacy and safety of new treatments for conditions that will become ever more prevalent as the population ages. The same applies to a multitude of cases, among them Ebola and other emerging diseases.

One very common myth is the idea that all research could do without animals. Despite great efforts in this direction, the claim is not true. Science has invested in the development of alternative methods, such as cell and tissue culture and virtual models that use bioinformatics to predict how organisms will react.

However, we are still far from a solution that precisely reproduces the complex interactions of a whole organism: these methods are applicable only at certain stages of research and in specific situations. Brazilian science is part of this effort. One example is the creation of the Brazilian Centre for the Validation of Alternative Methods (BraCVAM), led by Fiocruz in partnership with the National Health Surveillance Agency (Anvisa).

Another common myth is the idea that scientists use animals indiscriminately. Beyond the ethical imperative, responsible use and a focus on animal welfare are legal requirements. Science is subject to multiple levels of regulation and to rigorous oversight of research activities. Reducing suffering through anaesthetics and analgesics, choosing appropriate techniques and mandatory monitoring by veterinarians are obligatory protocols. Guided by the replacement-reduction-refinement triad, animal use is permitted only when no known alternative exists, only the smallest number of animals needed for valid results is authorised, and the refinement of techniques and procedures is pursued, whenever possible, for more precise results.

Society plays a fundamental role in demanding that scientific institutions ground their work in the ethical use of animals, and it is healthy for democracy that this vigilance be exercised. However, halting animal experimentation in research today would mean a setback for science and a loss for public health and for the field of veterinary medicine itself. It falls to researchers and institutions to uphold their commitment to responsibility and ethics towards animals, firm in the purpose of benefiting society.

Paulo Gadelha is president of Fiocruz and Wilson Savino is director of the Instituto Oswaldo Cruz.

(O Globo)

The Turning Point: New Hope for the Climate (Rolling Stone)

It’s time to accelerate the shift toward a low-carbon future

JUNE 18, 2014

In the struggle to solve the climate crisis, a powerful, largely unnoticed shift is taking place. The forward journey for human civilization will be difficult and dangerous, but it is now clear that we will ultimately prevail. The only question is how quickly we can accelerate and complete the transition to a low-carbon civilization. There will be many times in the decades ahead when we will have to take care to guard against despair, lest it become another form of denial, paralyzing action. It is true that we have waited too long to avoid some serious damage to the planetary ecosystem – some of it, unfortunately, irreversible. Yet the truly catastrophic damages that have the potential for ending civilization as we know it can still – almost certainly – be avoided. Moreover, the pace of the changes already set in motion can still be moderated significantly.

There is surprising – even shocking – good news: Our ability to convert sunshine into usable energy has become much cheaper far more rapidly than anyone had predicted. The cost of electricity from photovoltaic, or PV, solar cells is now equal to or less than the cost of electricity from other sources powering electric grids in at least 79 countries. By 2020 – as the scale of deployments grows and the costs continue to decline – more than 80 percent of the world’s people will live in regions where solar will be competitive with electricity from other sources.

No matter what the large carbon polluters and their ideological allies say or do, in markets there is a huge difference between “more expensive than” and “cheaper than.” Not unlike the difference between 32 degrees and 33 degrees Fahrenheit. It’s not just a difference of a degree, it’s the difference between a market that’s frozen up and one that’s liquid. As a result, all over the world, the executives of companies selling electricity generated from the burning of carbon-based fuels (primarily from coal) are openly discussing their growing fears of a “utility death spiral.”

Germany, Europe’s industrial powerhouse, where renewable subsidies have been especially high, now generates 37 percent of its daily electricity from wind and solar; and analysts predict that number will rise to 50 percent by 2020. (Indeed, one day this year, renewables created 74 percent of the nation’s electricity!)

What’s more, Germany’s two largest coal-burning utilities have lost 56 percent of their value over the past four years, and the losses have continued into the first half of 2014. And it’s not just Germany. Last year, the top 20 utilities throughout Europe reported losing half of their value since 2008. According to the Swiss bank UBS, nine out of 10 European coal and gas plants are now losing money.

In the United States, where up to 49 percent of the new generating capacity came from renewables in 2012, 166 coal-fired electricity-generating plants have either closed or have announced they are closing in the past four and a half years. An additional 183 proposed new coal plants have been canceled since 2005.

To be sure, some of these closings have been due to the substitution of gas for coal, but the transition under way in both the American and global energy markets is far more significant than one fossil fuel replacing another. We are witnessing the beginning of a massive shift to a new energy-distribution model – from the “central station” utility-grid model that goes back to the 1880s to a “widely distributed” model with rooftop solar cells, on-site and grid battery storage, and microgrids.

The principal trade group representing U.S. electric utilities, the Edison Electric Institute, has identified distributed generation as the “largest near-term threat to the utility model.” Last May, Barclays downgraded the entirety of the U.S. electric sector, warning that “a confluence of declining cost trends in distributed solar-photovoltaic-power generation and residential-scale power storage is likely to disrupt the status quo” and make utility investments less attractive.

This year, Citigroup reported that the widespread belief that natural gas – the supply of which has ballooned in the U.S. with the fracking of shale gas – will continue to be the chosen alternative to coal is mistaken, because it too will fall victim to the continuing decline in the cost of solar and wind electricity. Significantly, the cost of battery storage, long considered a barrier to the new electricity system, has also been declining steadily – even before the introduction of disruptive new battery technologies that are now in advanced development. Along with the impressive gains of clean-energy programs in the past decade, there have been similar improvements in our ability to do more with less. Since 1980, the U.S. has reduced total energy intensity by 49 percent.

It is worth remembering this key fact about the supply of the basic “fuel”: Enough raw energy reaches the Earth from the sun in one hour to equal all of the energy used by the entire world in a full year.
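
A rough calculation (mine, not the article’s) shows why that claim is plausible. It assumes the standard solar constant of about 1361 W/m² and a round figure of roughly 5.5×10^20 joules for the world’s annual primary energy use; both inputs are textbook-style assumptions, not figures from the article:

    # Sunlight intercepted by Earth in one hour vs. world annual energy use.
    # Both input figures are rounded assumptions for illustration.
    import math

    solar_constant = 1361                 # W/m^2 at the top of the atmosphere
    earth_radius = 6.371e6                # m
    intercepted = solar_constant * math.pi * earth_radius**2  # W on Earth's disc
    one_hour_of_sun = intercepted * 3600                      # J, about 6e20

    world_annual_use = 5.5e20             # J/yr, rough global primary energy use
    print(f"ratio: {one_hour_of_sun / world_annual_use:.1f}")  # -> about 1.1

On those round numbers, one hour of sunshine indeed carries about as much energy as humanity uses in a year.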

In poorer countries, where most of the world’s people live and most of the growth in energy use is occurring, photovoltaic electricity is not so much displacing carbon-based energy as leapfrogging it altogether. In its first days in office, the government of the newly elected prime minister of India, Narendra Modi (who has authored an e-book on global warming), announced a stunning plan to rely principally upon photovoltaic energy to provide electricity to the 400 million Indians who currently do not have it. One of Modi’s supporters, S.L. Rao, the former utility regulator of India, added that the industry he once oversaw “has reached a stage where either we change the whole system quickly, or it will collapse.”

Nor is India an outlier. Neighboring Bangladesh is installing nearly two new rooftop PV systems every minute — making it the most rapidly growing market for PVs in the world. In West and East Africa, solar-electric cells are beginning what is widely predicted to be a period of explosive growth.

At the turn of the 21st century, some scoffed at projections that the world would be installing one gigawatt of new solar electricity per year by 2010. That goal was exceeded 17 times over; last year it was exceeded 39 times over; and this year the world is on pace to exceed that benchmark as much as 55 times over (see the rough calculation below). In May, China announced that by 2017, it would have the capacity to generate 70 gigawatts of photovoltaic electricity.
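
Taken at face value, those multiples imply a remarkably steep growth curve. A quick sketch (my arithmetic, using only the figures quoted above) converts them into an average annual growth rate:

    # Implied growth of annual PV installations from the multiples above:
    # the 1 GW/yr benchmark, exceeded 17x in 2010 and ~55x in 2014.
    gw_2010, gw_2014 = 17, 55             # GW of new PV installed per year
    cagr = (gw_2014 / gw_2010) ** (1 / 4) - 1
    print(f"average growth 2010-2014: {cagr:.0%} per year")   # -> ~34%

At roughly 34 percent a year, annual deployments double about every two and a half years.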

The cost of wind energy is also plummeting, having dropped 43 percent in the United States since 2009 – making it now cheaper than coal for new generating capacity. Though the downward cost curve is not quite as steep as that for solar, the projections in 2000 for annual worldwide wind deployments by the end of that decade were exceeded seven times over, and are now more than 10 times that figure. In the United States alone, nearly one-third of all new electricity-generating capacity in the past five years has come from wind, and installed wind capacity in the U.S. has increased more than fivefold since 2006. The state with by far the biggest amount of wind energy is Texas, not historically known for its progressive energy policies.

For consumers, this good news may soon get even better. While the cost of carbon-based energy continues to increase, the cost of solar electricity has dropped by an average of 20 percent per year since 2010. Some energy economists, including those who produced an authoritative report this past spring for Bernstein Research, are now predicting energy-price deflation as soon as the next decade.

For those (including me) who are surprised at the speed with which this impending transition has been accelerating, there are precedents that help explain it. Remember the first mobile-telephone handsets? I do; as an inveterate “early adopter” of new technologies, I thought those first huge, clunky cellphones were fun to use and looked cool (they look silly now, of course). In 1980, a few years before I bought one of the early models, AT&T conducted a global market study and came to the conclusion that by the year 2000 there would be a market for 900,000 subscribers. They were not only wrong, they were way wrong: 109 million contracts were active in 2000. Barely a decade and a half later, there are 6.8 billion globally.

These parallels have certainly caught the attention of the fossil-fuel industry and its investors: Eighteen months ago, the Edison Electric Institute described the floundering state of the once-proud landline-telephone companies as a grim predictor of what may soon be their fate.

 

The utilities are fighting back, of course, by using their wealth and the entrenched political power they have built up over the past century. In the United States, brothers Charles and David Koch, who run Koch Industries, the second-largest privately owned corporation in the U.S., have secretively donated at least $70 million to a number of opaque political organizations tasked with spreading disinformation about the climate crisis and intimidating political candidates who dare to support renewable energy or the pricing of carbon pollution.

They regularly repeat shopworn complaints about the inadequate, intermittent and inconsistent subsidies that some governments have used in an effort to speed up the deployment of renewables, while ignoring the fact that global subsidies for carbon-based energy are 25 times larger than global subsidies for renewables.

One of the most effective of the groups financed by the Koch brothers and other carbon polluters is the American Legislative Exchange Council, or ALEC, which grooms conservative state legislators throughout the country to act as their agents in introducing legislation written by utilities and carbon-fuel lobbyists in a desperate effort to slow, if not stop, the transition to renewable energy.

The Kochs claim to act on principles of low taxation and minimal regulation, but in their attempts to choke the development of alternative energy, they have induced the recipients of their generous campaign contributions to contradict these supposedly bedrock values, pushing legislative and regulatory measures in 34 states to discourage solar, or encourage carbon energy, or both. The most controversial of their initiatives is focused on persuading state legislatures and public-utility commissions to tax homeowners who install PV solar panels on their roofs, and to manipulate the byzantine utility laws and regulations to penalize renewable energy in a variety of novel schemes.

The chief battleground in this war between the energy systems of the past and future is our electrical grid. For more than a century, the grid – along with the regulatory and legal framework governing it – has been dominated by electric utilities and their centralized, fossil-fuel-powered electricity-generation plants. But the rise of distributed alternative energy sources allows consumers to participate in the production of electricity through a policy called net metering. In 43 states, homeowners who install solar PV systems on their rooftops are permitted to sell electricity back into the grid when they generate more than they need.

These policies have been crucial to the growth of solar power. But net metering represents an existential threat to the future of electric utilities, the so-called utility death spiral: As more consumers install solar panels on their roofs, utilities will have to raise prices on their remaining customers to recover the lost revenues. Those higher rates will, in turn, drive more consumers to leave the utility system, and so on.
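
As a rough, hypothetical illustration of that feedback loop (not from the article), the Python sketch below spreads a utility’s fixed grid costs over a shrinking customer base; the customer count, cost figure, baseline solar adoption rate and price sensitivity are all invented assumptions:

    # Toy model of the "utility death spiral": defections shrink the rate
    # base, rates rise to recover fixed costs, and higher rates drive
    # further defections. Every number here is an illustrative assumption.
    customers = 1_000_000      # households on the grid
    fixed_costs = 600e6        # $/yr of grid costs the utility must recover
    usage = 10_000             # kWh per household per year
    base_defection = 0.02      # 2%/yr adopt solar regardless of price
    price_sensitivity = 1.0    # extra defection per unit of rate increase

    rate = fixed_costs / (customers * usage)       # break-even $/kWh
    for year in range(1, 6):
        prev_rate = rate
        customers *= 1 - base_defection            # baseline solar adoption
        rate = fixed_costs / (customers * usage)   # re-spread fixed costs
        extra = price_sensitivity * (rate / prev_rate - 1)
        customers *= 1 - extra                     # price-driven defection
        print(f"year {year}: ${rate:.4f}/kWh, {customers:,.0f} customers")

Even with mild assumptions, the loop compounds: each round of defections raises rates, which accelerates the next round.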

But here is more good news: The Koch brothers are losing rather badly. In Kansas, their home state, a poll by North Star Opinion Research reported that 91 percent of registered voters support solar and wind. Three-quarters supported stronger policy encouragement of renewable energy, even if such policies raised their electricity bills.

In Georgia, the Atlanta Tea Party joined forces with the Sierra Club to form a new organization called – wait for it – the Green Tea Coalition, which promptly defeated a Koch-funded scheme to tax rooftop solar panels.

Meanwhile, in Arizona, after the state’s largest utility, an ALEC member, asked the public-utility commission for a tax of up to $150 per month for solar households, the opposition was fierce and well-organized. A compromise was worked out – those households would be charged just $5 per month – but Barry Goldwater Jr., the leader of a newly formed organization called TUSK (Tell Utilities Solar won’t be Killed), is fighting a new attempt to discourage rooftop solar in Arizona. Characteristically, the Koch brothers and their allies have been using secretive and deceptive funding in Arizona to run television advertisements attacking “greedy” owners of rooftop solar panels – but their effort has thus far backfired, as local journalists have exposed the funding scam.

Even though the Koch-funded forces recently scored a partial (and almost certainly temporary) victory in Ohio, where the legislature voted to put a hold on the state’s renewable-portfolio standard and study the issue for two years, it’s clear that the attack on solar energy is too little, too late. Last year, the Edison Electric Institute warned the utility industry that it had waited too long to respond to the sharp cost declines and growing popularity of solar: “At the point when utility investors become focused on these new risks and start to witness significant customer- and earnings-erosion trends, they will respond to these challenges. But, by then, it may be too late to repair the utility business model.”

The most seductive argument deployed by the Koch brothers and their allies is that those who use rooftop solar electricity and benefit from the net-metering policies are “free riders” – that is, they are allegedly not paying their share of the maintenance costs for the infrastructure of the old utility model, including the grid itself. This deceptive message, especially when coupled with campaign contributions, has persuaded some legislators to support the proposed new taxes on solar panels.

But the argument ignores two important realities facing the electric utilities: First, most of the excess solar electricity is supplied by owners of solar cells during peak-load hours of the day, when the grid’s capacity is most stressed – thereby alleviating the pressure to add expensive new coal- or gas-fired generating capacity. But here’s the rub: what saves money for the utilities’ customers cuts into the growth of the utilities’ profits and depresses their stock prices. As is often the case, the real conflict is between the public interest and the special interest.

The second reality ignored by the Koch brothers is the one they least like to discuss, the one they spend so much money trying to obfuscate with their hired “merchants of doubt.” You want to talk about the uncompensated use of infrastructure? What about sewage infrastructure for 98 million tons per day of gaseous, heat-trapping waste that is daily released into our skies, threatening the future of human civilization? Is it acceptable to use the thin shell of atmosphere surrounding our planet as an open sewer? Free of charge? Really?

 

This, after all, is the reason the climate crisis has become an existential threat to the future of human civilization. Last April, the average CO2 concentrations in the Earth’s atmosphere exceeded 400 parts-per-million on a sustained basis for the first time in at least 800,000 years and probably for the first time in at least 4.5 million years (a period that was considerably warmer than at present).

According to a cautious analysis by the influential climate scientist James Hansen, the accumulated man-made global-warming pollution already built up in the Earth’s atmosphere now traps as much extra heat energy every day as would be released by the explosion of 400,000 Hiroshima-class nuclear bombs. It’s a big planet, but that’s a lot of energy.
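
A back-of-envelope check (my arithmetic, not Hansen’s published calculation) reproduces the order of magnitude, assuming a planetary energy imbalance of about 0.6 W/m² and roughly 6.3×10^13 joules per Hiroshima-class bomb; both inputs are rounded assumptions:

    # Extra heat trapped per day vs. the energy of a Hiroshima-class bomb.
    # The imbalance and bomb yield are assumed round values.
    import math

    imbalance = 0.6                             # W/m^2, planetary energy imbalance
    earth_radius = 6.371e6                      # m
    surface = 4 * math.pi * earth_radius**2     # ~5.1e14 m^2, full sphere
    heat_per_day = imbalance * surface * 86_400 # J trapped per day

    hiroshima = 6.3e13                          # J, ~15 kilotons of TNT
    print(f"{heat_per_day / hiroshima:,.0f} bombs per day")  # -> ~420,000

That lands within shouting distance of the 400,000 figure quoted above.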

And it is that heat energy that is giving the Earth a fever. Denialists hate the “fever” metaphor, but as the American Association for the Advancement of Science (AAAS) pointed out this year, “Just as a 1.4-degree fever change would be seen as significant in a child’s body, a similar change in our Earth’s temperature is also a concern for human society.”

Thirteen of the 14 hottest years ever measured with instruments have occurred in this century. This is the 37th year in a row that has been hotter than the 20th-century average. April was the 350th month in a row hotter than the average in the preceding century. The past decade was by far the warmest decade ever measured.

Many scientists expect the coming year could break all of these records by a fair margin because of the extra boost from the anticipated El Niño now gathering in the waters of the eastern Pacific. (The effects of periodic El Niño events are likely to become stronger because of global warming, and this one is projected by many scientists to be stronger than average, perhaps on the scale of the epic El Niño of 1997 to 1998.)

The fast-growing number of extreme-weather events, connected to the climate crisis, has already had a powerful impact on public attitudes toward global warming. A clear majority of Americans now acknowledge that man-made pollution is responsible. As the storms, floods, mudslides, droughts, fires and other catastrophes become ever more destructive, the arcane discussions over how much of their extra-destructive force should be attributed to global warming have become largely irrelevant. The public at large feels it viscerally now. As Bob Dylan sang, “You don’t need a weatherman to know which way the wind blows.”

Besides, there is a simple difference between linear cause and effect and systemic cause and effect. As one of the world’s most-respected atmospheric scientists, Kevin Trenberth, has said, “The environment in which all storms form has changed owing to human activities.”

For example, when Supertyphoon Haiyan crossed the Pacific toward the Philippines last fall, the storm gained strength across seas that were 5.4 degrees Fahrenheit warmer than they used to be because of greenhouse-gas pollution. As a result, Haiyan went from being merely strong to being the most powerful and destructive ocean-based storm on record to make landfall. Four million people were displaced (more than twice as many as by the Indian Ocean tsunami of 10 years ago), and there are still more than 2 million Haiyan refugees desperately trying to rebuild their lives.

When Superstorm Sandy traversed the areas of the Atlantic Ocean windward of New York and New Jersey in 2012, the water temperature was nine degrees Fahrenheit warmer than normal. The extra convection energy in those waters fed the storm and made the winds stronger than they would otherwise have been. Moreover, the sea level was higher than it used to be, elevated by the melting of ice in the frozen regions of the Earth and the expanded volume of warmer ocean waters.

Five years earlier, denialists accused me of demagogic exaggeration in an animated scene in my documentary An Inconvenient Truth that showed the waters of the Atlantic Ocean flooding into the 9/11 Ground Zero Memorial site. But in Sandy’s wake, the Atlantic did in fact flood Ground Zero – many years before scientists had expected that to occur.

Similarly, the inundation of Miami Beach by rising sea levels has now begun, and freshwater aquifers in low-lying areas from South Florida to the Nile Delta to Bangladesh to Indochina are being invaded by saltwater pushed upward by rising oceans. And of course, many low-lying islands – not least in the Bay of Bengal – are in danger of disappearing altogether. Where will the climate refugees go? Similarly, the continued melting of mountain glaciers and snowpacks is, according to the best scientists, already “affecting water supplies for as many as a billion people around the world.”

Just as the extreme-weather events we are now experiencing are exactly the kind that were predicted by scientists decades ago, the scientific community is now projecting far worse extreme-weather events in the years to come. Eighty percent of the warming in the past 150 years (since the burning of carbon-based fuels gained momentum) has occurred in the past few decades. And it is worth noting that the previous scientific projections consistently low-balled the extent of the global-warming consequences that later took place – for a variety of reasons rooted in the culture of science that favor conservative estimates of future effects.

In an effort to avoid these cultural biases, the AAAS noted this year that not only are the impacts of the climate crisis “very likely to become worse over the next 10 to 20 years and beyond,” but “there is a possibility that temperatures will rise much higher and impacts will be much worse than expected. Moreover, as global temperature rises, the risk increases that one or more important parts of the Earth’s climate system will experience changes that may be abrupt, unpredictable and potentially irreversible, causing large damages and high costs.”

Just weeks after that report, there was shock and, for some, a temptation to despair when the startling news was released in May by scientists at both NASA and the University of Washington that the long-feared “collapse” of a portion of the West Antarctic ice sheet is not only under way but is also now “irreversible.” Even as some labored to understand what the word “collapse” implied about the suddenness with which this catastrophe will ultimately unfold, it was the word “irreversible” that had a deeper impact on the collective psyche.

Just as scientists 200 years ago could not comprehend the idea that species had once lived on Earth and had subsequently become extinct, and just as some people still find it hard to accept the fact that human beings have become a sufficiently powerful force of nature to reshape the ecological system of our planet, many – including some who had long since accepted the truth about global warming – had difficulty coming to grips with the stark new reality that one of the long-feared “tipping points” had been crossed. And that, as a result, no matter what we do, sea levels will rise by at least an additional three feet.

The uncertainty about how long the process will take (some of the best ice scientists warn that a rise of 10 feet in this century cannot be ruled out) did not change the irreversibility of the forces that we have set in motion. But as Eric Rignot, the lead author of the NASA study, pointed out in The Guardian, it’s still imperative that we take action: “Controlling climate warming may ultimately make a difference not only about how fast West Antarctic ice will melt to sea, but also whether other parts of Antarctica will take their turn.”

The news about the irreversible collapse in West Antarctica caused some to almost forget that only two months earlier, a similar startling announcement had been made about the Greenland ice sheet. Scientists found that the northeastern part of Greenland – long thought to be resistant to melting – has in fact been losing more than 10 billion tons of ice per year for the past decade, making 100 percent of Greenland unstable and likely, as with West Antarctica, to contribute to significantly more sea-level rise than scientists had previously thought.

 

The heating of the oceans not only melts the ice and makes hurricanes, cyclones and typhoons more intense, it also adds around 2 trillion gallons of additional water vapor to the skies above the U.S. The warmer air holds more of this water vapor and carries it over the landmasses, where it is funneled into land-based storms that are releasing record downpours all over the world.

For example, an “April shower” came to Pensacola, Florida, this spring, but it was a freak – another rainstorm on steroids: two feet of rain in 26 hours. It broke all the records in the region, but as usual, virtually no media outlets made the connection to global warming. Similar “once in a thousand years” storms have been occurring regularly in recent years all over the world, including in my hometown of Nashville in May 2010.

All-time record flooding swamped large portions of England this winter, submerging thousands of homes for more than six weeks. Massive downpours hit Serbia and Bosnia this spring, causing flooding of “biblical proportions” (a phrase now used so frequently in the Western world that it has become almost a cliché) and thousands of landslides. Torrential rains in Afghanistan in April triggered mudslides that killed thousands of people – almost as many, according to relief organizations, as all of the Afghans killed in the war there the previous year.

In March, persistent rains triggered an unusually large mudslide in Oso, Washington, killing more than 40 people. There are literally hundreds of other examples of extreme rainfall occurring in recent years in the Americas, Europe, Asia, Africa and Oceania.

In the planet’s drier regions, the same extra heat trapped in the atmosphere by man-made global-warming pollution has also been driving faster evaporation of soil moisture and causing record-breaking droughts. As of this writing, 100 percent of California is in “severe,” “extreme” or “exceptional” drought. Record fires are ravaging the desiccated landscape. Experts now project that an increase of one degree Celsius over pre-industrial temperatures will lead to as much as a 600-percent increase in the median area burned by forest fires in some areas of the American West – including large portions of Colorado. The National Research Council has reported that fire season is two and a half months longer than it was 30 years ago, and in California, firefighters are saying that the season is now effectively year-round.

Drought has been intensifying in many other dry regions around the world this year: Brazil, Indonesia, central and northwest Africa and Madagascar, central and western Europe, the Middle East up to the Caspian Sea and north of the Black Sea, Southeast Asia, Northeast Asia, Western Australia and New Zealand.

Syria is one of the countries that has been in the bull’s-eye of climate change. From 2006 to 2010, a historic drought destroyed 60 percent of the country’s farms and 80 percent of its livestock – driving a million refugees from rural agricultural areas into cities already crowded with the million refugees who had taken shelter there from the Iraq War. As early as 2008, U.S. State Department cables quoted Syrian government officials warning that the social and economic impacts of the drought were “beyond our capacity as a country to deal with.” Though the hellish and ongoing civil war in Syria has multiple causes – including the perfidy of the Assad government and the brutality on all sides – the climate-related drought may have been the biggest underlying trigger for the horror.

The U.S. military has taken notice of the strategic dangers inherent in the climate crisis. Last March, a Pentagon advisory committee described the climate crisis as a “catalyst for conflict” that may well cause failures of governance and societal collapse. “In the past, the thinking was that climate change multiplied the significance of a situation,” said retired Air Force Gen. Charles F. Wald. “Now we’re saying it’s going to be a direct cause of instability.”

Pentagon spokesman Mark Wright told the press, “For DOD, this is a mission reality, not a political debate. The scientific forecast is for more Arctic ice melt, more sea-level rise, more intense storms, more flooding from storm surge and more drought.” And in yet another forecast difficult for congressional climate denialists to rebut, climate experts advising the military have also warned that the world’s largest naval base, in Norfolk, Virginia, is likely to be inundated by rising sea levels in the future.

And how did the Republican-dominated House of Representatives respond to these grim warnings? By passing legislation seeking to prohibit the Department of Defense from taking any action to prepare for the effects of climate disruption.

There are so many knock-on consequences of the climate crisis that listing them can be depressing – diseases spreading, crop yields declining, more heat waves affecting vulnerable and elderly populations, the disappearance of summer-ice cover in the Arctic Ocean, the potential extinction of up to half of all the living species, and so much more. And that in itself is a growing problem too, because when you add it all up, it’s no wonder that many feel a new inclination to despair.

So, clearly, we will just have to gird ourselves for the difficult challenges ahead. There is indeed, literally, light at the end of the tunnel, but there is a tunnel, and we are well into it.

In November 1936, Winston Churchill stood before the United Kingdom’s House of Commons and placed a period at the end of the misguided debate over the nature of the “gathering storm” on the other side of the English Channel: “Owing to past neglect, in the face of the plainest warnings, we have entered upon a period of danger. . . . The era of procrastination, of half measures, of soothing and baffling expedience of delays is coming to its close. In its place, we are entering a period of consequences. . . . We cannot avoid this period; we are in it now.”

Our civilization is confronting this existential challenge at a moment in our historical development when our dominant global ideology – democratic capitalism – has been failing us in important respects.

Democracy is accepted in theory by more people than ever before as the best form of political organization, but it has been “hacked” by large corporations (defined as “persons” by the Supreme Court) and special interests corrupting the political system with obscene amounts of money (defined as “speech” by the same court).

Capitalism, for its part, is accepted by more people than ever before as a superior form of economic organization, but is – in its current form – failing to measure and include the categories of “value” that are most relevant to the solutions we need in order to respond to this threatening crisis (clean air and water, safe food, a benign climate balance, public goods like education and a greener infrastructure, etc.).

Pressure for meaningful reform in democratic capitalism is beginning to build powerfully. The progressive introduction of Internet-based communication – social media, blogs, digital journalism – is laying the foundation for the renewal of individual participation in democracy, and the re-elevation of reason over wealth and power as the basis for collective decision-making. And the growing levels of inequality worldwide, combined with growing structural unemployment and more frequent market disruptions (like the Great Recession), are building support for reforms in capitalism.

Both waves of reform are still at an early stage, but once again, Churchill’s words inspire: “If you’re going through hell, keep going.” And that is why it is all the more important to fully appreciate the incredible opportunity for salvation that is now within our grasp. As the satirical newspaper The Onion recently noted in one of its trademark headlines: “Scientists Politely Remind World That Clean Energy Technology Ready to Go Whenever.”

We have the policy tools that can dramatically accelerate the transition to clean energy that market forces will eventually produce at a slower pace. The most important has long since been identified: We have to put a price on carbon in our markets, and we need to eliminate the massive subsidies that fuel the profligate emissions of global-warming pollution.

We need to establish “green banks” that provide access to capital investment necessary to develop renewable energy, sustainable agriculture and forestry, an electrified transportation fleet, the retrofitting of buildings to reduce wasteful energy consumption, and the full integration of sustainability in the design and architecture of cities and towns. While the burning of fossil fuels is the largest cause of the climate crisis, deforestation and “factory farming” also play an important role. Financial and technological approaches to addressing these challenges are emerging, but we must continue to make progress in converting to sustainable forestry and agriculture.

In order to accomplish these policy shifts, we must not only put a price on carbon in markets, but also find a way to put a price on climate denial in our politics. We already know the reforms that are needed – and the political will to enact them is a renewable resource. Yet the necessary renewal can only come from an awakened citizenry empowered by a sense of urgency and emboldened with the courage to reject despair and become active. Most importantly, now is the time to support candidates who accept the reality of the climate crisis and are genuinely working hard to solve it – and to bluntly tell candidates who are not on board how much this issue matters to you. If you are willing to summon the resolve to communicate that blunt message forcefully – with dignity and absolute sincerity – you will be amazed at the political power an individual can still wield in America’s diminished democracy.

Something else is also new this summer. Three years ago, in these pages, I criticized the seeming diffidence of President Obama toward the great task of solving the climate crisis; this summer, it is abundantly evident that he has taken hold of the challenge with determination and seriousness of purpose.

He has empowered his Environmental Protection Agency to enforce limits on CO2 emissions for both new and, as of this June, existing sources of CO2. He has enforced bold new standards for the fuel economy of the U.S. transportation fleet. He has signaled that he is likely to reject the absurdly reckless Keystone XL pipeline proposal for the transport of oil from carbon-intensive tar sands to be taken to market through the United States on its way to China, thus effectively limiting their exploitation. And he is even now preparing to impose new limits on the release of methane pollution.

All of these welcome steps forward have to be seen, of course, in the context of Obama’s continued advocacy of a so-called all-of-the-above energy policy – which is the prevailing code for aggressively pushing more drilling and fracking for oil and gas. And to put the good news in perspective, it is important to remember that U.S. emissions – after declining for five years during the slow recovery from the Great Recession – actually increased by 2.4 percent in 2013.

 

Nevertheless, the president is clearly changing his overall policy emphasis to make CO2 reductions a much higher priority now and has made a series of inspiring speeches about the challenges posed by climate change and the exciting opportunities available as we solve it. As a result, Obama will go to the United Nations this fall and to Paris at the end of 2015 with the credibility and moral authority that he lacked during the disastrous meeting in Copenhagen four and a half years ago.

The international treaty process has been so fraught with seemingly intractable disagreements that some parties have all but given up on the possibility of ever reaching a meaningful treaty.

Ultimately, there must be one if we are to succeed. And there are signs that a way forward may be opening up. In May, I attended a preparatory session in Abu Dhabi, UAE, organized by United Nations Secretary General Ban Ki-moon to bolster commitments from governments, businesses and nongovernmental organizations ahead of this September’s U.N. Climate Summit. The two-day meeting was different from many of the others I have attended. There were welcome changes in rhetoric, and it was clear that the reality of the climate crisis is now weighing on almost every nation. Moreover, there were encouraging reports from around the world that many of the policy changes necessary to solve the crisis are being adopted piecemeal by a growing number of regional, state and city governments.

For these and other reasons, I believe there is a realistic hope that momentum toward a global agreement will continue to build in September and carry through to the Paris negotiations in late 2015.

The American poet Wallace Stevens once wrote, “After the final ‘no’ there comes a ‘yes’/And on that ‘yes’ the future world depends.” There were many no’s before the emergence of a global consensus to abolish chattel slavery, before the consensus that women must have the right to vote, before the fever of the nuclear-arms race was broken, before the quickening global recognition of gay and lesbian equality, and indeed before every forward advance toward social progress. Though a great many obstacles remain in the path of this essential agreement, I am among the growing number of people who are allowing themselves to become more optimistic than ever that a bold and comprehensive pact may well emerge from the Paris negotiations late next year, which many regard as the last chance to avoid civilizational catastrophe while there is still time.

It will be essential for the United States and other major historical emitters to commit to strong action. The U.S. is, finally, now beginning to shift its stance. And the European Union has announced its commitment to achieve a 40-percent reduction in CO2 emissions by 2030. Some individual European nations are acting even more aggressively, including Finland’s pledge to reduce emissions 80 percent by 2050.

It will also be crucial for the larger developing and emerging nations – particularly China and India – to play a strong leadership role. Fortunately, there are encouraging signs. China’s new president, Xi Jinping, has launched a pilot cap-and-trade system in two cities and five provinces as a model for a nationwide cap-and-trade program in the next few years. He has banned all new coal burning in several cities and required the reporting of CO2 emissions by all major industrial sources. China and the U.S. have jointly reached an important agreement to limit another potent source of global-warming pollution – the chemical compounds known as hydrofluorocarbons, or HFCs. And the new prime minister of India, as noted earlier, has launched the world’s most ambitious plan to accelerate the transition to solar electricity.

Underlying this new breaking of logjams in international politics, there are momentous changes in the marketplace that are exercising enormous influence on the perceptions by political leaders of the new possibilities for historic breakthroughs. More and more, investors are diversifying their portfolios to include significant investments in renewables. In June, Warren Buffett announced he was ready to double Berkshire Hathaway’s existing $15 billion investment in wind and solar energy.

A growing number of large investors – including pension funds, university endowments (Stanford announced its decision in May), family offices and others – have announced decisions to divest themselves from carbon-intensive assets. Activist and “impact” investors are pushing for divestment from carbon-rich assets and new investments in renewable and sustainable assets.

Several large banks and asset managers around the world (full disclosure: Generation Investment Management, which I co-founded with David Blood and for which I serve as chairman, is in this group) have advised their clients of the danger that carbon assets will become “stranded.” A “stranded asset” is one whose price is vulnerable to a sudden decline when markets belatedly recognize the truth about its underlying value – just as the infamous “subprime mortgages” suddenly lost their value in 2007-08 once investors came to grips with the fact that the borrowers had absolutely no ability to pay off their mortgages.
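
To make the stranding mechanism concrete, here is a toy discounted-cash-flow sketch of a hypothetical coal plant valued with and without a future carbon price. Every number below is an invented illustration, not a figure from the article; the point is only that the write-down arrives all at once when the market changes its assumption, which is what “stranded” means.

```python
# Toy DCF illustration of asset "stranding" (all figures hypothetical).
def present_value(annual_cash_flow, years, discount_rate):
    """Discounted value of a constant annual cash flow."""
    return sum(annual_cash_flow / (1 + discount_rate) ** t
               for t in range(1, years + 1))

CASH_FLOW = 100.0    # annual cash flow ($M) while carbon is free to emit
CARBON_COST = 70.0   # annual cost ($M) once carbon pollution is priced
YEARS = 30           # assumed remaining plant lifetime
RATE = 0.08          # assumed discount rate

value_unpriced = present_value(CASH_FLOW, YEARS, RATE)
value_priced = present_value(CASH_FLOW - CARBON_COST, YEARS, RATE)

print(f"Valued as if carbon stays free: ${value_unpriced:,.0f}M")
print(f"Repriced once carbon is costed: ${value_priced:,.0f}M")
print(f"Implied write-down: {1 - value_priced / value_unpriced:.0%}")
```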

Shareholder activists and public campaigners have pressed carbon-dependent corporations to deal with these growing concerns. But the biggest ones are still behaving as if they are in denial. In May 2013, ExxonMobil CEO Rex Tillerson responded to those pointing out the need to stop using the Earth’s atmosphere as a sewer by asking, “What good is it to save the planet if humanity suffers?”

I don’t even know where to start in responding to that statement, but here is a clue: Pope Francis said in May, “If we destroy creation, creation will destroy us. Never forget this.”

ExxonMobil, Shell and many other holders of carbon-intensive assets have argued, in essence, that they simply do not believe that elected national leaders around the world will ever reach an agreement to put a price on carbon pollution.

But a prospective global treaty (however likely or unlikely you think that might be) is only one of several routes to overturning the fossil-fuel economy. Rapid technological advances in renewable energy are stranding carbon investments; grassroots movements are building opposition to the holding of such assets; and new legal restrictions on collateral flows of pollution – like particulate air pollution in China and mercury pollution in the U.S. – are further reducing the value of coal, tar sands, and oil and gas assets.

In its series of reports to energy investors this spring, Citigroup questioned the feasibility of new coal plants not only in Europe and North America, but in China as well. Although there is clearly a political struggle under way in China between regional governments closely linked to carbon-energy generators, suppliers and users and the central government in Beijing – which is under growing pressure from citizens angry about pollution – the nation’s new leadership appears to be determined to engineer a transition toward renewable energy. Only time will tell how successful they will be.

The stock exchanges in Johannesburg and São Paulo have decided to require the full integration of sustainability from all listed companies. Standard & Poor’s announced this spring that some nations vulnerable to the impacts of the climate crisis may soon have their bonds downgraded because of the enhanced risk to holders of those assets.

A growing number of businesses around the world are implementing sustainability plans, as more and more consumers demand a more responsible approach from businesses they patronize. Significantly, many have been pleasantly surprised to find that adopting efficient, low-carbon approaches can lead to major cost savings.

And all the while, the surprising, relentless decline in the cost of renewable energy and efficiency improvements is driving the transition to a low-carbon economy.

Is there enough time? Yes. Damage has been done, and the period of consequences will continue for some time to come, but there is still time to avoid the catastrophes that most threaten our future. Each of the trends described above – in technology, business, economics and politics – represents a break from the past. Taken together, they add up to genuine and realistic hope that we are finally putting ourselves on a path to solve the climate crisis.

How long will it take? When Martin Luther King Jr. was asked that question during some of the bleakest hours of the U.S. civil rights revolution, he responded, “How long? Not long. Because no lie can live forever. . . . How long? Not long. Because the arc of the moral universe is long, but it bends toward justice.”

And so it is today: How long? Not long.

This story is from the July 3rd-17th, 2014 issue of Rolling Stone.

http://www.rollingstone.com/politics/news/the-turning-point-new-hope-for-the-climate-20140618

Scientists call for limits on the creation of deadly viruses in the laboratory (O Globo)

JC e-mail 4991, July 17, 2014

Failures at American facilities raise the risk of outbreaks

A multidisciplinary group of scientists from leading universities in several countries published a warning yesterday about the handling, in U.S. laboratories, of viruses that could spread and infect humans and other mammals. The concern follows a string of news reports about failures involving potentially dangerous microorganisms.

“Recent incidents involving smallpox, anthrax and avian flu at some of the leading laboratories in the U.S. remind us of the fallibility of even the most secure facilities, reinforcing the urgent need for a thorough reassessment of biosafety,” wrote the self-styled “Cambridge Working Group,” made up of researchers from the universities of Harvard, Yale and Ottawa, among others.

In the warning, they report that incidents with pathogens have been increasing, occurring on average twice a week in the country’s private and public laboratories. The figure comes from a 2012 study in the journal “Applied Biosafety”.

“When a case turns up in the press, it gives the impression that these are rare episodes, but they are not,” commented Amir Attaran of the University of Ottawa, one of the scientists who signed the document. “We are worried about the dangerous experiments being conducted to engineer the most infectious and deadly viruses of influenza and of severe acute respiratory syndrome (SARS). We believe this reckless, senseless science could injure or kill a large number of people.” Last week the Centers for Disease Control and Prevention admitted that high-security laboratories had lost control of some samples.

In the most recent case, vials of smallpox were found by chance in a disused storage room at a federal laboratory in Washington. They are estimated to have been there for more than 50 years.

Attaran compared the group’s statement to the one scientists made in 1943, before the bombing of Hiroshima in the Second World War. He said the risks of these experiments outweigh their potential benefits, citing the in vitro re-creation of the 1918 Spanish flu virus, which killed 40 million people; scientists carried out that experiment in an American laboratory in 2006.

“This is no cause for alarm,” said Volnei Garrafa, coordinator of the Graduate Program in Bioethics at UnB and a member of Unesco’s International Bioethics Committee. “But their concern is valid – there are always risks, and the government would need to take a position.”

READ THE DOCUMENT IN FULL:
Recent incidents involving smallpox, anthrax and avian flu at some of the leading laboratories in the United States remind us of the fallibility of even the most secure facilities, reinforcing the urgent need for a thorough reassessment of biosafety. Such incidents have been increasing, occurring on average twice a week with regulated pathogens in the country’s private and public laboratories. An accidental infection with any pathogen is concerning. But the risk of an accident with the newly created “potential pandemic pathogens” raises grave new concerns.

The laboratory creation of new strains of dangerous, highly transmissible viruses – especially, but not only, of influenza – poses substantially greater risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control. Historically, new influenza strains, once they begin transmitting in the human population, have infected a quarter or more of the world’s population within two years.
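
The “quarter or more of the world’s population within two years” claim follows from standard epidemic arithmetic. A minimal discrete-time SIR sketch, with purely illustrative parameter values (the working group’s statement cites no specific model):

```python
# Minimal discrete-time SIR model; all parameter values are illustrative.
def sir_final_attack_rate(r0, days=730, infectious_days=5.0):
    """Fraction of the population ever infected after `days` days."""
    s, i, r = 0.9999, 0.0001, 0.0   # susceptible/infected/recovered fractions
    gamma = 1.0 / infectious_days    # daily recovery rate
    beta = r0 * gamma                # daily transmission rate
    for _ in range(days):
        new_inf = beta * s * i
        new_rec = gamma * i
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return r + i

# Even a modestly transmissible strain reaches flu-pandemic attack rates.
for r0 in (1.3, 1.5, 2.0):
    print(f"R0 = {r0}: ~{sir_final_attack_rate(r0):.0%} infected within two years")
```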

For any experiment, the expected benefits should outweigh the risks. Experiments involving the creation of potential pandemic pathogens should be curtailed until there is a quantitative, objective and credible assessment of the potential benefits and of the opportunities for risk mitigation, together with a comparison against safer experimental approaches.

A modern version of the Asilomar process, which set the rules for recombinant-DNA research, could be a starting point for identifying the best measures for achieving the global public-health goals of fighting pandemic disease while ensuring the highest levels of safety. Wherever possible, safety should take priority over actions that carry a risk of accidental pandemic.

(Flávia Milhorance / O Globo)
http://oglobo.globo.com/sociedade/saude/cientistas-pedem-limite-criacao-de-virus-mortais-em-laboratorio-13281731

The fight to reform Econ 101 (Al Jazeera)

Economics is a dismal nonscience, but it need not remain that way

July 16, 2014 6:00AM ET

by Philip Pilkington

During the last weekend of June, hundreds of students, university lecturers, professors and interested members of the public descended on the halls of University College London to attend the Rethinking Economics conference. They all shared a similar belief: that economics education in most universities had become narrow, insular and detached from the real world.

For a brief period after the financial crisis of 2008, the shortcomings of the economics profession and the way it is taught were recognized. Many economists offered up mea culpas of various kinds and conceded that since they did not foresee the biggest economic event since the Great Depression, there was probably something seriously wrong with the discipline. But as time passed and many economies began to experience gradual, somewhat muted recoveries, the profession regained its confidence.

When I was completing my master’s degree at Kingston University last year, I experienced this firsthand from the more mainstream faculty there. Lecturers offered potted explanations of the crisis using old analytical tools such as supply and demand graphs, which cannot incorporate expectations and so cannot explain asset-price bubbles. The same economists who, just a few years ago, told us that financial markets were the conduits of perfect information began to introduce doublethink phrases into the media such as “rational bubble” (in which investors allegedly act rationally by bidding up asset prices in full knowledge that prices are heavily inflated, believing they can bail out of the market before prices fall) to explain the events of the past few years. There is nothing rational about acting this way: investors cannot know when the herd movement they are part of will end, so they cannot time their exit from the market, and any attempt to ride the wave is just as irrational as the behavior of people unaware of the bubble. The entire exercise appeared to be an ad hoc attempt to reinterpret the facts to fit the pet theory — economic agents aware of relevant information act rationally — rather than to alter the theory in light of the facts.
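
The no-timing argument can be made concrete with a toy Monte Carlo: if the bubble bursts at a random, unobservable time, no fixed “ride the wave, then sell” rule is safe. The growth rate, burst probability and crash size below are arbitrary assumptions for illustration, not parameters from any study.

```python
# Illustrative Monte Carlo: riding a bubble with an unknowable burst date.
import random

def ride_bubble(exit_day, growth=0.01, burst_prob=0.02, crash=0.6,
                trials=20_000):
    """Average payoff of buying at 1.0 and selling on `exit_day`,
    when the bubble can burst each day with probability `burst_prob`
    and a burst knocks the fraction `crash` off the price."""
    total = 0.0
    for _ in range(trials):
        price, burst = 1.0, False
        for _ in range(exit_day):
            if random.random() < burst_prob:
                burst = True
                break
            price *= 1 + growth
        total += price * (1 - crash) if burst else price
    return total / trials

# Under these assumed parameters, riding longer only lowers the average
# payoff -- there is no fixed exit rule that beats the random burst.
for exit_day in (10, 30, 60, 120):
    print(f"sell after {exit_day:>3} days: average payoff "
          f"{ride_bubble(exit_day):.3f}")
```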

It was difficult not to sense the Soviet-style revisionism that had occurred within the halls of learning: The party had tossed history down the memory hole and introduced a strange, seemingly self-contradictory language that they were busy foisting upon an unwitting public. One Chicago school economist, Ray Ball, argues that the now notorious efficient market hypothesis (EMH), which states that financial markets price in all relevant information, is actually supported by the recent crisis. He argues that the capital flight that led to the bank meltdowns lends support to the EMH because it shows how rapidly financial markets react to new information. But as many will remember, investigations clearly showed that information was not being processed efficiently by market participants in the run-up to the crisis. The most colorful example of this was the Standard & Poor’s employee who, responding to a colleague who said that they should not be rating a mortgage-backed security deal because the estimations of risk were incorrect, said that cows could be estimating the risk of a product and S&P would still rate it.

Shine a light

Despite such attempts to shore up the orthodoxy, students have sensed that something is wrong: Over the past two years, they have been organizing across more than 60 countries with the aim of forcing the vampire that is the economics profession into the light of day. While the students in the movement have a diversity of opinions on various issues, they have all come to believe that the best way to reform economics is to demand that a plurality of approaches be taught. They have rightly identified the key fault with contemporary economics teaching: the monoculture it engenders. Currently only one approach to economics is taught in the vast majority of departments in the U.S. and Europe: what is usually called neoclassical or marginalist economics, epitomized by Harvard’s Gregory Mankiw — a former chairman of the Council of Economic Advisers under President George W. Bush — and Chicago’s Gary Becker, a Nobel laureate. This is the economics of the rational, atomized individual purged of all social context, whose only goal is to maximize a mysterious, effervescent quantity called utility. In this view, the economy tends toward an equilibrium end point, at which everyone has a job and wages and profits are set in line with what each individual contributes to society.
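
For readers who never took the course in question, the canonical marginalist exercise fits in a few lines: posit a utility function, impose a budget constraint, solve for the optimum. A minimal sketch using the standard Cobb-Douglas textbook example (my choice of illustration, not one named in the article; all values arbitrary):

```python
# The canonical marginalist exercise: maximize Cobb-Douglas utility
# U(x, y) = x**a * y**(1 - a) subject to the budget px*x + py*y = m.
# The closed-form demands are x* = a*m/px and y* = (1 - a)*m/py.
def cobb_douglas_demand(a, px, py, m):
    return a * m / px, (1 - a) * m / py

a, px, py, m = 0.3, 2.0, 5.0, 100.0   # arbitrary illustrative values
x_star, y_star = cobb_douglas_demand(a, px, py, m)
print(f"optimal bundle: x = {x_star:.2f}, y = {y_star:.2f}")

# Sanity check: small budget-preserving deviations never increase utility.
u = lambda x, y: x**a * y**(1 - a)
for dx in (-0.1, 0.1):
    x = x_star + dx
    y = (m - px * x) / py
    assert u(x, y) <= u(x_star, y_star)
```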

When I spoke with the students, they were struck by how even those who dissented from contemporary economic policies like austerity shared this overarching vision. Paul Krugman, for example, to whom many turned after the crisis to provide context — including many of the students I met — also accepted the orthodox view (although he has not embraced some of the worst excesses echoed by his peers).

True dissenters

The students at the June conference also heard from genuine dissenters in the discipline, for whom economics is a highly contested field. Cambridge University’s Ha-Joon Chang pointed out that there are any number of schools of economic thought, each with its own approaches and insights. Their positions range from the Austrians, who believe that government interference in the economy leads to wasted resources, to post-Keynesians, who believe that capitalist economies are inherently unstable and require government intervention to stave off collapse and stagnation, to Marxians, institutionalists, Schumpeterians, neo-Ricardians and so on. Chang argued that none of these schools is inherently right or wrong; each has insights into the workings of the economy, and each has a right to be taught to students as a competing point of view. It was up to the students, he said, to decide what they found interesting, useful and credible.

One of the conference speakers pointed out that this is required in all the other disciplines that study people and society. He told an anecdote about being in the psychology department of his university when an inspector from a psychological association turned up to ensure that there was an adequately pluralist approach being undertaken. The speaker quipped that it would be far more likely that an inspector from an economics association would turn up to ensure that the current doctrine was being firmly adhered to.

But what, exactly, constitutes this dogmatic thinking? For starters, the firm belief that economics is a science on par with physics and chemistry. After all, these economists say, only a crank would demand that a plurality of approaches to physics and chemistry should be taught in universities. But the truth of the matter is that economics is not a science on par with physics and chemistry and it never will be. Donald Gillies, a former president of the British Society for the Philosophy of Science, told a stunned audience that he had examined three well-known Nobel Prize–winning papers in economics and could find nothing in them that he could call scientific. Rather, he said, they utilized sophisticated mathematics to hide the fact that they were not saying anything remotely relevant about the real world that could be proved or disproved.

The dirty little secret about economics is that, unlike the other sciences, it cannot undertake proper laboratory experiments. Even the experiments of behavioral economists are open to doubt, in that the way people act in a lab while under observation is unlikely to be identical to how they act day to day. Economics is therefore ill equipped to make claims with the same confidence as bona fide sciences. What economists offer instead are interpretations of the world around them. Once this is understood, it becomes very difficult to argue against a plurality of opinions in the discipline. This was what the students sensed, and this is why their clarion call became one for pluralism.

New curriculum

These students are well organized, and their numbers are growing; their commitment is unlikely to go away anytime soon. They are focused in a manner that is impressive for a protest movement, willing to transcend their political differences in order to fight for a common goal. Every week a new group springs up. At the conference I attended, organizers went around with pads and pens collecting the contact details of sympathetic faculty members and other students in countries where the movement was only partially developed.

Even institutions are hopping on board. Many employers complain that the mainstream departments are churning out employees with mathematical skills completely out of proportion to the jobs they do but who seem unable to undertake basic economic analysis. Often these employees have to be retrained on the job in order to function at their institutions. The chief economist of the Bank of England, Andy Haldane, wrote in the foreword to the students’ international manifesto that “employers of economists, like the Bank of England, stand to benefit from such an evolution in the economics curriculum.” Given that mainstream economists often claim that the consumer is king and competition is sacrosanct, it is increasingly difficult to see how they make a case for their current monopoly over the educational process.

In September another conference will take place in New York, and rumor has it that an enormous international meeting will soon be organized too. If and when the movement reaches that level of international organization, it could start putting real pressure on companies, governments and economics departments to rethink their models and their ways. If the profession wishes to uphold what is left of its credibility, it would do well to pay attention.

Philip Pilkington is a London-based economist and member of the Political Economy Research Group at Kingston University. He runs the blog Fixing the Economists.

Risky business (Folha de São Paulo)

JC e-mail 4987, July 11, 2014

In an editorial, Folha de São Paulo takes stock of climate change on the planet

The year 2015 may see a change of sign on the question of planetary climate change. There are indications that the issue is leaving the sterile terrain of Fla-Flu-style polemics (a nod to Rio’s football derby) to become a steadily growing concern among business leaders and governments of every stripe.

There has been no shortage of statements to this effect. They are well summarized in the interview given by Achim Steiner, executive director of the United Nations Environment Programme (UNEP), to the newspaper “Valor Econômico”.

Steiner is betting on a good international agreement in Paris at the end of 2015, the decisive climate summit. A comprehensive and ambitious treaty would reverse the fiasco of Copenhagen (2009), which was supposed to have produced a document to replace the Kyoto Protocol (1997), which expired in 2012.

For the UNEP director, carbon pollution – which worsens the greenhouse effect and drives global warming – is not the inevitable price of development. His arguments are essentially economic, not ideological.

He points to the distortion created by the subsidies granted worldwide to fossil fuels (coal, oil and natural gas), the main source of the carbon released into the atmosphere by human activity. The bill comes to between US$600 billion and US$700 billion a year, roughly ten times the incentives for renewable sources such as wind power.

His example is Kenya, which plans within five years to connect the 75% of its population currently without access to electricity – and to do so with 95% clean sources. He could have cited Brazil, whose energy mix is 80% renewable and which has discovered the attractions of wind power in recent years.

And Achim Steiner is not alone. Even in U.S. public opinion, perhaps the most resistant to the subject of global warming, political and business leaders – Democrats and Republicans alike – have come out publicly to argue that inaction on climate today will cost dearly in an ever less distant future.

That case was made in the report “Risky Business”, endorsed by heavyweights such as the former U.S. Treasury secretaries Henry Paulson, Robert Rubin and George Shultz, along with Michael Bloomberg, the former mayor of New York.

Their reasoning is crystal clear: however remote the danger, one still takes out fire insurance; what sense would it make, then, to ignore the risks of global warming? The answer will be given, or not, in Paris.

(Folha de São Paulo)
http://www1.folha.uol.com.br/fsp/opiniao/175354-negocio-arriscado.shtml

The skeptics are losing ground (Valor Econômico)

JC e-mail 4985, July 9, 2014

Article by Martin Wolf published in Valor Econômico

We do not have a Chinese atmosphere or an American atmosphere. We have a planetary atmosphere. We cannot run independent experiments on it. But we have been running a joint experiment. It was not a conscious decision: it came about as a consequence of the Industrial Revolution. But we are consciously deciding not to halt it.

Running irreversible experiments on the only planet we have is irresponsible. Refusing to do anything to mitigate the risks would be rational only if we were certain that the science of man-made climate change is a hoax.

Any reasonably open-minded reader of the Intergovernmental Panel on Climate Change’s “Summary for Policymakers” would conclude that any such certainty about the science is absurd. It is rational to ask whether the benefits of mitigation outweigh the costs. It is irrational to deny that man-made climate change is plausible.

In these discussions – and indeed in climate policy – the United States plays a central role, for four reasons. First, the U.S. is still the world’s second-largest emitter of carbon dioxide, although its 14% share of the 2012 world total puts it well behind China’s 27%. Second, American per capita emissions are roughly double those of the major economies of western Europe or of Japan. It would be impossible to persuade the emerging economies to cut emissions if the U.S. did not join in. Third, the U.S. commands unmatched scientific and technological resources, which will be needed if the world is to meet the challenge of combining low emissions with prosperity for all. Finally, the U.S. is home to the largest number of passionate, engaged activist opponents.

Against this backdrop, two recent developments are encouraging for those who (like me) believe that the most elementary common sense obliges us to act. One was last month’s publication of the “President’s Climate Action Plan”. The plan covers mitigation, adaptation and global cooperation. Its goal is to cut greenhouse-gas emissions by 2020 to levels 17% below those of 2005.

The other development, also last month, was the publication of a report – “Risky Business” – by a powerful bipartisan group that included the former mayor of New York, Michael Bloomberg, the former U.S. Treasury secretaries Hank Paulson and Robert Rubin, and the former secretary of state George Shultz.

But we must temper our cheer. Even if the administration implements its plan successfully, by exploiting its regulatory authority, it will be only a modest start. Concentrations of carbon dioxide, methane and nitrous oxide have risen to levels unprecedented in at least the past 800,000 years, long before the emergence of Homo sapiens. At our current pace the increase will be far greater by the end of the century, and the impacts on the climate are likely to be large, irreversible and perhaps catastrophic. Increases in average temperature of 5°C above pre-industrial levels are conceivable on our current trajectory. The planet would be a different place from the one we know.

“Risky Business” spells out what this could mean for the U.S. The document focuses on the damage that rising sea levels would do to coastal property and infrastructure. It examines the risks of stronger and more frequent storms. It considers possible changes in agriculture and in energy demand, as well as the impact of higher temperatures on productivity and public health. Some areas of the country could become nearly uninhabitable.

What makes the report important is that it frames the issue, correctly, as a problem of risk management. The objective has to be to eliminate the risks at the tail of the distribution of possible outcomes. The way to do that is to change behavior. No one can sell us insurance against planetary change. We have already seen what remote, tail-end risk means in finance. In climate, the tails are fatter and likely to be far more damaging.
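
“Fatter tails” has a precise meaning: extreme outcomes are far more likely than a thin-tailed (normal) distribution would suggest. A small numerical sketch of the difference, using a Student-t as a stand-in heavy-tailed distribution (the choice of distribution and threshold are illustrative assumptions, not climate estimates):

```python
# How much likelier are extreme events under a heavy-tailed distribution?
# Compare the chance of exceeding 4 under a normal vs. a Student-t(3).
# (The t(3) also has a larger variance; the comparison is qualitative.)
import math, random

random.seed(42)

def student_t(df):
    """Draw from a Student-t by mixing a normal with a chi-square."""
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

N, THRESHOLD = 200_000, 4.0
t_exceed = sum(abs(student_t(3)) > THRESHOLD for _ in range(N)) / N

# Exact normal tail: P(|Z| > 4) = erfc(4 / sqrt(2))
normal_exceed = math.erfc(THRESHOLD / math.sqrt(2))

print(f"P(|X| > 4), normal:       {normal_exceed:.2e}")
print(f"P(|X| > 4), Student-t(3): {t_exceed:.2e}")
print(f"heavy tail is ~{t_exceed / normal_exceed:,.0f}x more likely")
```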

The question is whether something real and important can grow out of these modest new beginnings. It can, although halting the rise in greenhouse-gas concentrations will take enormous effort.

I always used to think that the way forward would be a global agreement to limit emissions, based on some combination of taxes and quotas. I now consider that approach futile, as demonstrated by the failure of the 1997 Kyoto Protocol to bring about any real change in our emissions trajectory. The political case for substantial public policies will succeed if, and only if, two things happen: first, people must come to believe that the impact of climate change can be both large and costly; second, they must come to believe that the costs of mitigation will be tolerable. The latter, in turn, requires the development of reliable, workable technologies for a low-carbon future. Once the viability of such a future has been demonstrated, adopting the necessary policies will become easier.

In this context, the two new documents reinforce each other. “Risky Business” documents the potential costs to Americans of unmitigated climate change. The administration’s focus on regulatory standards is therefore a large part of the answer, not least because the standards will surely force an acceleration of innovation in the production and use of energy. By strengthening support for fundamental research, the American government could unleash waves of beneficial innovation in our wasteful energy and transport systems. Pursued with sufficient urgency, this could also transform the context of the global negotiations. Moreover, given the lack of mitigation so far, a large part of the response will have to consist of adaptation. Here again, U.S. engagement should supply more examples of measures that work.

I secretly used to hope that time would prove the naysayers right. Only then would the absence of a response to this challenge turn out to be costless. But we are unlikely to be so lucky.

Staying on our current path is likely to produce irreversible and costly damage. There is a more hopeful possibility. It may prove feasible to cut the cost of mitigation so far that it becomes politically palatable. We may also become far more conscious of the risks. Neither seems likely. But if these two reports do help shift the U.S. stance, the odds of escaping the danger will have improved, though perhaps too late. That does not deserve two cheers, let alone three. But we might try one. (Translation by Rachel Warszawski)

Martin Wolf is an editor and the chief economics commentator of the FT.

(Valor Econômico)
http://www.valor.com.br/opiniao/3607960/os-ceticos-estao-perdendo-espaco

Search for new methodologies steers biological conservation (Fapesp)

Researchers assess the field’s current limiting factors in a book on applied ecology and human dimensions in biodiversity governance (photo: Wikimedia)
July 3, 2014

By Elton Alisson

Agência FAPESP – The identification and solution of problems related to biological conservation worldwide has been held back by the need for a broader conceptual base, for methodological and technological innovation, and for better management.

That assessment is made in the book Applied Ecology and Human Dimensions in Biological Conservation, recently released by Springer. The publication grew out of two international workshops held in 2009 and 2010 by the FAPESP Research Program on the Characterization, Conservation, Restoration and Sustainable Use of Biodiversity (BIOTA-FAPESP), and out of the advances those events made possible.

The 2009 workshop addressed applied ecology and human dimensions in biological conservation. The 2010 workshop covered long-term biodiversity study programs, chiefly those monitoring patterns of biological diversity.

“One of the novelties of these two workshops was to approach biological conservation from the standpoint of the factors that limit it, such as the need to broaden its conceptual base,” Luciano Martins Verdade, a professor at the Center for Nuclear Energy in Agriculture (Cena) of the University of São Paulo (USP) and one of the book’s editors, told Agência FAPESP.

“We often do not know how to identify and solve the problems because concepts about the subject are missing,” said Verdade, who coordinated the two workshops and sits on the steering committee of the BIOTA-FAPESP Program.

According to Verdade, one concept that needs refining is that of biological diversity itself. By treating species as the units of biological diversity – assuming that the more species a group contains, the greater its biological diversity – one risks underestimating the value of lineages that are older in evolutionary terms but have been more conservative, undergoing less speciation than more recent groups.

“Even though they have given rise to fewer species, the gene pool of these older lineages may be worth more from an evolutionary standpoint than that of more recent groups,” he said.
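
One established way to formalize Verdade’s point is Faith’s phylogenetic diversity (PD), which scores a set of species by the total branch length of the evolutionary tree connecting them rather than by a species count. A minimal sketch on a made-up tree, with invented branch lengths:

```python
# Faith's phylogenetic diversity: total branch length spanned by a set of
# species. The tree and branch lengths below are invented for illustration.
# Each species maps to the (branch_id, length) edges on its path to the root.
PATHS = {
    # one species on an ancient, isolated lineage (long branch)
    "old_lineage_sp": [("root-A", 90.0)],
    # three species from a recent radiation (short terminal branches)
    "recent_sp1": [("root-B", 10.0), ("B-1", 5.0)],
    "recent_sp2": [("root-B", 10.0), ("B-2", 5.0)],
    "recent_sp3": [("root-B", 10.0), ("B-3", 5.0)],
}

def faith_pd(species):
    """Sum each distinct branch once over the selected species' paths."""
    branches = {edge for sp in species for edge in PATHS[sp]}
    return sum(length for _, length in branches)

radiation = ["recent_sp1", "recent_sp2", "recent_sp3"]
print("PD of 3 recent species:", faith_pd(radiation))           # 25.0
print("PD of 1 ancient species:", faith_pd(["old_lineage_sp"]))  # 90.0
```

By species count the radiation looks three times as diverse; by PD, the single old lineage carries more evolutionary heritage, which is exactly the asymmetry the passage describes.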

Another concept being revisited, according to the researcher, is the historical role of human action in assembling the patterns of biological diversity observed today.

There is a tendency to assume that biomes such as the Amazon Forest, and even the Atlantic Forest, contain untouched (pristine) areas whose patterns of biological diversity were never influenced by humans.

On studying the history of these biomes, however, one finds significant records of human presence over the past several millennia.

Even before the arrival of Europeans in the 16th century, there was already substantial land use, which may be reflected in the current patterns of biological diversity of biomes such as the Amazon Forest, Verdade noted.

“There are records of Kayapó Indians planting orchards in the Amazon Forest at more or less regular intervals, which helped increase the floristic – and hence faunal – diversity of the biome, since animals drawn by the fruit trees would approach the orchards and become targets for hunting,” he said.

The role that human beings have played, past and present, in assembling biodiversity patterns is therefore a crucial factor that cannot be ignored, Verdade stressed.

“Human pressure associated with the expansion of agriculture today is so strong that it has probably caused genetic changes in species which, from the standpoint of their gene pool, make them different from those that existed in the past,” he said.

Biodiversity monitoring

According to Verdade, another factor limiting decision-making in biological conservation is the lack of a monitoring policy capable of detecting changes in the biodiversity of biomes in time for the problems to be solved.

There is still no set of indicators for measuring biodiversity that can show whether a given species is in decline, has become a pest, or is being used sustainably, Verdade said.

For that reason, the book’s authors argue for a worldwide network of long-term biodiversity monitoring stations, to contribute effectively to decision-making on the conservation, use and control of the planet’s biodiversity.

“Implementing a biodiversity monitoring policy requires well-structured institutions that know how, when and what to monitor,” Verdade said. “It also requires long-term research programs, such as BIOTA-FAPESP, to broaden the concepts and make it possible to detect problems related to biodiversity conservation.”

The researchers also point out in the book the need to develop and refine population survey methods and technologies that help detect and identify species in the field and assess ecological and evolutionary processes, especially in environments already altered by human action.

“The use of molecular markers in the feces of animals that are normally hard to observe in the field, for example, can help estimate a species’ population less invasively – and even more accurately and precisely – than direct observation,” Verdade said.
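
Population estimates from genetic “tagging” of scat commonly rest on capture-recapture logic: genotypes identified in a first sampling session count as marked, and the overlap with a second session yields an abundance estimate. A minimal sketch of the Chapman-corrected Lincoln-Petersen estimator, with invented genotype IDs (the article names no specific estimator):

```python
# Genetic capture-recapture sketch: individuals identified from scat DNA
# in two sessions; the overlap yields a Lincoln-Petersen abundance estimate.
def chapman_estimate(session1, session2):
    """Chapman's bias-corrected Lincoln-Petersen estimator."""
    n1, n2 = len(session1), len(session2)
    m = len(session1 & session2)   # genotypes seen in both sessions
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical genotype IDs recovered from field-collected scat.
first_pass = {"g01", "g02", "g03", "g04", "g05", "g06", "g07", "g08"}
second_pass = {"g03", "g05", "g07", "g09", "g10", "g11", "g12"}

print(f"Estimated population size: {chapman_estimate(first_pass, second_pass):.0f}")
```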

Division into sections

The book Applied Ecology and Human Dimensions in Biological Conservation has 14 chapters, written by 38 specialists from Brazil and abroad, and is divided into three sections.

The first section emphasizes the importance of a broad network for monitoring biodiversity patterns, and the role of the ecological, evolutionary and historical processes that shape today’s patterns of biodiversity.

The second section presents the methodological and technological innovations that make the development of biological conservation possible, and the third offers examples of biodiversity governance.

“The chapter authors bring cutting-edge information on concepts, innovation and management in biological conservation, from the standpoint of applying the science of ecology – what we call applied ecology – and of the human dimensions associated with it,” Verdade said.

Applied Ecology and Human Dimensions in Biological Conservation
Published: 2014
Price: US$189 (print) and US$149 (e-book)
Pages: 228
More information: www.springer.com/life+sciences/ecology/book/978-3-642-54750-8