Tag archive: Uncertainty

Walter Neves: the father of Luzia (Fapesp)

A USP archaeologist and anthropologist recounts how he formulated a theory about the arrival of humans in the Americas

MARCOS PIVETTA and RICARDO ZORZETTO | Issue 195 – May 2012

© LEO RAMOS

He is the father of Luzia, an 11,000-year-old human skull, the oldest found so far in the Americas, which belonged to an extinct people of hunter-gatherers from the region of Lagoa Santa, on the outskirts of Belo Horizonte. The archaeologist and anthropologist Walter Neves, coordinator of the Laboratory for Human Evolutionary Studies at the Institute of Biosciences of the Universidade de São Paulo (USP), was not the one who recovered this ancient skeleton from a prehistoric site, but it was thanks to his studies that Luzia, so named by him, became the symbol of his controversial theory of the peopling of the Americas: the two biological components model. Formulated more than two decades ago, the theory holds that our continent was colonized by two waves of Homo sapiens coming from Asia. The first migratory wave would have occurred some 14,000 years ago and been made up of individuals who looked like Luzia, with a non-Mongoloid morphology similar to that of present-day Australians and Africans, but who left no descendants. The second wave would have arrived here some 12,000 years ago, and its members had the physical type characteristic of Asians, from whom modern Indians derive.

In this interview, Neves, a scientist as combative as he is popular, and one who enjoys a good academic fight, talks about Luzia and his career.

How did your interest in science begin?
I come from a poor family in Três Pontas, Minas Gerais. For some reason, by the age of 8 I already knew I wanted to be a scientist. By 12, that I wanted to work on human evolution. I have no explanation for it.

When did you come to São Paulo?
It was in 1970, after the World Cup. We moved to São Bernardo, where I lived most of my life.

What was your life like?
Everyone at home had to work. The family was small: my father, my mother, me and my brother, three years older than me. When we arrived in São Paulo, my father worked as a bricklayer and my mother sold Yakult on the street. I was 12, going on 13. A year after arriving here, I started working. I sold pasta once a week at a market stall in my neighborhood. My first steady job was as a general assistant at Malas Primicia, fitting locks on suitcases. And I hated it. It was tedious and required no skill. It didn't last long. A month later I was hired at the Rolls-Royce aircraft turbine plant in São Bernardo. I benefited greatly from that environment, which was refined, full of rules and respect for hierarchy. I think I developed my excellent administrative skills in the years I spent at Rolls-Royce. I got a first-rate bureaucratic training. Every day, when we arrived at the factory, there was a portrait of the Queen of England and we had to bow to it. I thought it was wonderful. For someone who had lived in the backwoods, it was an upgrade in glamour. I was 13, going on 14.

What did you do there?
I started as an office boy, and when I left I was assistant to the technical board. Rolls-Royce in Brazil received turbines for repairs and general overhauls. My boss was the director of that division and I helped him with everything. I worked eight hours a day and studied at night. I attended public school and entered USP, in biology, in 1976. Public secondary education was of excellent quality back then.

Why did you choose biology?
I had always assumed that the path to studying human evolution was to study history. On a visit to USP while still in secondary school, I discovered the Instituto de Pré-história, which no longer exists. The institute had been founded by Paulo Duarte and operated in the Zoology building, where Ecology is today. On that visit, I went to the History building looking for information about the program and was told that, if I studied history, I would learn nothing about human evolution. Walking down Rua do Matão, I saw a little sign reading Instituto de Pré-história, where I met the archaeologist Dorath Uchôa. There I saw the replicas of fossil hominids and the prehistoric skeletons excavated from the sambaquis (shell mounds) of the Brazilian coast. So I said to Dorath: "I want to do archaeology and study skeletons." And she said: "Don't study history. Do either biology or medicine." Medicine was out because it was a full-time program. I opted for biology. It was a good deal. In 1978, still an undergraduate, I was hired by the Instituto de Pré-história as a technician.

What year of the program were you in?
Going from the second to the third, I think. When I completed my teaching degree in 1980, I was hired as a researcher and lecturer. There was no public examination. It was by appointment.

Was it an independent institute?
Yes. It was later annexed to the Museu de Arqueologia e Etnologia, the MAE. At the time, archaeology was done in three places at USP: at the Instituto de Pré-história, the oldest, at the MAE, and in the archaeology section of the Museu Paulista, in Ipiranga. In the late 1980s, the three were merged into one. I worked at the Instituto de Pré-história as a researcher from 1980 to 1985. In 1982 I went to Stanford University for a sandwich doctorate. I was self-taught, because there was no specialist in this field in Brazil. At the Instituto de Pré-história, the material was there, the library was there, but there was no one to supervise me.

Didn't they work on human evolution?
The institute was very small; it had two researchers, who thought they owned the place. When I was hired, another archaeologist, Solange Caldarelli, was hired too. We formed a very productive pair. We worked in the interior of São Paulo state with hunter-gatherer groups in the 3,000-to-5,000-year time range. It was with her that I became an archaeologist. My transformation from biologist into physical anthropologist was self-taught. The growth of our research group began to expose the mediocrity of the work being done at the Instituto de Pré-história and in Brazil. That led to a war between us and the establishment. In 1985 we were expelled from the university.

What do you mean?
Expelled. Summarily dismissed.

What grounds did they give?
None. We had no tenure. Most of the faculty were hired on precarious contracts, and we were kicked out of the Instituto de Pré-história by the older staff.

What is the difference between a physical anthropologist and an archaeologist? What do you consider yourself today?
I consider myself an anthropologist and an archaeologist. Actually, I consider myself part of a category that exists in the United States called evolutionary anthropologist. Even among evolutionary anthropologists, few have a track record in physical anthropology, archaeology and sociocultural anthropology. In that sense I have a unique career, one my colleagues abroad didn't understand. I did physical anthropology and biological anthropology and had archaeology projects. When I went to the Amazon, I worked with ecological anthropology. I am one of the only people in the world who has passed through every possible kind of anthropology. If on the one hand I am not good at any of them, on the other I have a far more multifaceted understanding of the human than my colleagues.

Does the archaeologist do the fieldwork while the physical anthropologist waits for the material?
The physical anthropologist can go into the field, but doesn't. He waits for the archaeologists to hand over the material for him to study. I rebelled against that in Brazil. I said: I want to be an archaeologist too. In the United States, in the late 1980s, a field called bioarchaeology took shape, made up of physical anthropologists who could no longer stand depending on archaeologists. Here, independently, I rebelled against the same situation. And the dismissal from the institute in 1985 was traumatic, because we had seven years of field research and lost everything. Overnight my career was reset to zero. Fortunately, by then I had defended my doctorate.

Here?
Here in Biology, but on paleogenetics. I went to Stanford on a six-month sandwich fellowship from CNPq. To support myself at Stanford and Berkeley, I relied on my salary from here, which at the time came to US$250, and [Luigi Luca] Cavalli-Sforza, with whom I worked, paid me another US$250 in the laboratory.

He is a great researcher, but in population genetics.
People ask me why I didn't go work with a physical anthropologist, since I was self-taught on the osteological side. I didn't because what Cavalli-Sforza does is fascinating. He brings together several areas of knowledge. At the time I was enrolled in the master's program in Biology; my supervisor here was [Oswaldo] Frota-Pessoa.

Who is also a geneticist.
A geneticist, but with a very broad view of the human being. If Frota hadn't existed, I wouldn't have managed to do the master's. He saw my situation and was very generous. When I was finishing the work at Stanford, Cavalli-Sforza found out I was doing a master's, not a doctorate. He looked at me and said: "How can you be doing a master's when you already have several publications, coordinate two archaeology projects and have seven students? It makes no sense. I'm going to write to Frota-Pessoa suggesting you go straight to the doctorate." Today that is common. It was what saved me. I defended my doctorate in December 1984 and, months later, was dismissed. Solange Caldarelli left so disgusted with academia that she never wanted anything to do with a university career again. I wanted to return to academia. Then three possibilities appeared. One was a postdoc at Harvard; another was a postdoc at Pennsylvania State University; and there was a third, unexpected thing. When I was dismissed, I told Frota I was going abroad. I knew my position would always be one of conflict with Brazilian archaeology. At that time there was the CNPq integrated genetics program, important for the development of genetics in Brazil, and Frota coordinated some itinerant courses. Frota said: "Just when we were about to have a specialist in human evolution, you're leaving. I understand, but before you go abroad I'm going to invite you to give an itinerant course around Brazil on human evolution." I gave the course at the Universidade Federal da Bahia, the Federal do Rio Grande do Norte, the Museu Goeldi and the Universidade de Brasília. I was very impressed with the Goeldi. On the last day of the course at the Goeldi, the director wanted to meet me. I told him about my trajectory and that I was heading to the United States. He asked me: "Is there nothing that could talk you out of that idea?" I said: "Look, Guilherme" (his name is Guilherme de La Penha), "the only thing that would make me stay in Brazil would be the opportunity to create my own research center, one that could be interdisciplinary and tied neither to anthropology nor to archaeology." And he invited me to create there what at the time was called the nucleus of human biology and ecology. Something at the personal level also happened that led me to choose Belém.

That was in 1985?
Still in 1985. A little before giving that course around Brazil, I fell deeply in love for the first time. I fell in love with Wagner, the best thing that ever happened in my life. If I went to the United States, I would hardly be able to take him along. In Belém, it would be easier to find him a job and continue the relationship. That is why I accepted the move to the Goeldi. Except that I had to step away from skeletons. In the Amazon, the last thing in the world you can do is work with skeletons, because they don't preserve.

What did you work on?
I began devoting myself to ecological anthropology.

And what is ecological anthropology?
It studies the adaptations of traditional societies to their environment. Until then, it was a line the Americans pursued intensively in the Amazon. Since our anthropology here is eminently structuralist, and breaks out in hives at anything biological, that line never advanced in Brazil. So I thought: "Great, I'll pick another fight. I'll train a first generation in ecological anthropology." Most ecological anthropology research in the Amazon was done with indigenous peoples. So I decided to study traditional caboclo populations.

You published a book, didn't you?
We published the first major synthesis on caboclo adaptation in the Amazon, which came out here and abroad. I sent students who had worked with me in the Amazon to do their doctorates abroad.

Which conclusions of that synthesis would you highlight?
Studying those traditional Amazonian populations, it became clear that everyone who goes there, NGOs above all, assumes they have nutrition problems. They do indeed show a growth deficit relative to international standards. But our work showed that they actually have no deficiency in carbohydrate or protein intake. The problem is parasitic disease.

How did you return to USP?
In 1988, shortly after moving to the Amazon, Wagner was diagnosed with Aids, and we made a pact. When he reached the terminal stage, we would return to São Paulo. I came to do a postdoc in anthropology. When Wagner died, in 1992, I no longer wanted to go back to the Amazon, and I sat two public examinations.

You were doing a postdoc in anthropology at USP?
Yes, at the Faculdade de Filosofia, Letras e Ciências Humanas. Then I sat two examinations. One was at the Federal de Santa Catarina, in ecological anthropology, but I wanted to stay in São Paulo. Since in 1989 I had already made the first discovery of what became my model of the occupation of the Americas, I thought: "I have to go somewhere I can dedicate myself to this and get back to concentrating on human skeletons." Then an opening appeared here in the department, in the area of evolution. I passed in both places, but chose here. I knew I could create a center for human evolutionary studies encompassing archaeology, physical anthropology and ecological anthropology.

How did you get the idea of creating an alternative model of the colonization of the Americas?
One day, Guilherme de La Penha, director of the Goeldi, called me in and said: "Look, Walter, a week from now I have to attend a congress in Stockholm on salvage archaeology. I need you to stand in for me." I said: "What, on such short notice?" Then I remembered that Copenhagen lies on the route to Stockholm. I negotiated permission to spend five days or so in Copenhagen and see the Lund collection. I made the trip and not only saw but measured the Lagoa Santa skulls in the Lund collection. When I returned, I spoke with a researcher from Argentina who was spending time at the Goeldi, Hector Pucciarelli, my greatest research partner and the most important bioanthropologist in South America. I proposed that we do a small study with that material. At the time, Niède Guidon's papers were appearing, with conclusions that seemed crazy to me, such as the claim that humans had been in the Americas for 30,000 years. My idea in the study of Lund's skulls was to show that the first Americans were no different from present-day Indians. Well, imagine our faces when we saw that the Lagoa Santa skulls looked more like Australians and Africans than like Asians. We panicked. We saw that we needed a model to explain it.

What did you do then?
Some classic authors of the 1940s and 1950s, such as the French anthropologist Paul Rivet, had already recognized a similarity between the Lagoa Santa material and that of Australia. But Rivet proposed a direct migration from Australia to South America to explain the resemblance. Later, with advances in the genetic study of indigenous peoples, chiefly through the work of (Francisco) Salzano, it became clear that all the genetic markers here pointed to Asia. There was no similarity to Australians. So we thought of building a model that explored this morphological duality. We didn't want to fall from grace like Rivet, so we began studying the occupation of Asia. We found that there too, at the end of the Pleistocene, there was a morphological duality. There were pre-Mongoloids and Mongoloids. Our Lagoa Santa populations resembled the pre-Mongoloids. Present-day Indians resemble the Mongoloids. That is where the idea came from that America was occupied by two distinct waves: one with a generalized morphology, resembling Africans and Australians; and another resembling Asians. Our first paper was published in the journal Ciência e Cultura, in 1989. From 1991 on we began publishing abroad.

So you formulated this model before examining Luzia's skull.
Ten years before. In Brazil, several museums held collections from the Lagoa Santa region. But since I was the enfant gâté of Brazilian archaeology, I was denied access to the collections. That is why I went to study the Lund collection. I only gained access to the collections in Brazil from 1995 on, when some of the people who had put up barriers died. One of the skulls I was most curious to study was Luzia's.

Did it already have that name?
No. I gave it. It was known as the skeleton of Lapa Vermelha IV, the name of the site where it was found. The site was excavated by the Franco-Brazilian mission, coordinated by Madame Annette Emperaire. Luzia's skeleton was found during the 1974 and 1975 field seasons. But Madame Emperaire died unexpectedly. Apart from one article she published, nothing else had been written about Lapa Vermelha.

In the article, did she say the skull was ancient?
Madame Emperaire believed there were two skeletons at Lapa Vermelha: a more recent one and an older one, dated to more than 12,000 years ago, before the Clovis culture, to which Luzia's skull would have belonged. But André Prous (a French archaeologist who took part in the mission and is now a professor at UFMG) reviewed her notes and realized that the skull belonged to the more recent skeleton, which lay about a meter higher. Luzia was not buried; she was laid on the floor of the shelter, in a crevice. Prous demonstrated that the skull had rolled and fallen into the hole left by a rotted gameleira fig-tree root. The skull therefore belonged to the remains that were around 11,000 years old. Madame Emperaire died believing she had found pre-Clovis evidence in South America, the skull I nicknamed Luzia.

Where was Luzia's skull when you examined it?
It was always at the Museu Nacional in Rio de Janeiro, but the information about it wasn't. The museum was the French mission's partner institution.

Was Luzia's people restricted to Lagoa Santa?
Lagoa Santa is an exceptional situation. In the synthesis article of my work, published in 2005 in the journal PNAS, we used 81 skulls from the region. To give you an idea of how rare skeletons older than 7,000 years are on our continent, the United States and Canada together have five. We have what we call fossil power when it comes to the question of the origin of the first Americans. I also studied material from other parts of Brazil, from Chile, Mexico and Florida, and demonstrated that the pre-Mongoloid morphology was not a peculiarity of Lagoa Santa. I believe the non-Mongoloids must have entered up north around 14,000 years ago and the Mongoloids around 10,000 or 12,000 years ago. In fact, the Mongoloid morphology in Asia is very recent. I imagine that no more than 2,000 or 3,000 years separate one wave from the other. But that is pure guesswork.

Are 2,000 or 3,000 years enough to change a phenotype?
They were enough to change it in Asia. It is now more or less clear that the Mongoloid morphology resulted from populations that left Africa, with a typically African morphology, being exposed to the extreme cold of Siberia. My model is not fully accepted by some colleagues, including Argentines. They think the process of Mongolization occurred in Asia and in America in parallel and independently. We will not settle the matter, for lack of samples. But in evolution we always opt for the law of parsimony. You choose the model that involves the smallest number of evolutionary steps to explain what you found. By the rule of parsimony, my model is better than the others, which depend on two parallel, independent evolutionary events having taken place. But there is opposition to my model.

From whom?
From the geneticists. But I don't think you can bury my model with that kind of data. There is no reason for mitochondrial DNA, for example, to behave evolutionarily in the same way as cranial morphology. Where geneticists see a certain homogeneity from the standpoint of DNA, I can find different phenotypes.

There is also the argument that there was a single migratory wave to the Americas, already composed of a population with both Mongoloid types and non-Mongoloid types like Luzia.
That third possibility exists. But there would have to have been an astounding rate of genetic drift to explain colonization that way. Why would one phenotype have disappeared while only the other remained? Of the alternatives to my model, I find that one the weakest.

But how do you explain the disappearance of Luzia's morphology?
Actually, we have discovered in recent years that it did not disappear. When we proposed the model, we thought one population had replaced the other. But in 2003 or 2004 an Argentine colleague showed that a Mexican tribe that lived isolated from the rest of the Indians, in territory that today belongs to California, retained the non-Mongoloid morphology until the sixteenth century, when Europeans arrived by sea. We are also finding that the Botocudo Indians, of Central Brazil, retained this morphology into the nineteenth century. When you study the ethnography of the Botocudo, you see that they remained hunter-gatherers until the end of the nineteenth century. They were surrounded by other indigenous groups, with whom they had a belligerent relationship. That was the scenario. A little of the non-Mongoloid morphology survived until recently.

What do you make of the work of the archaeologist Niède Guidon at the Serra da Capivara National Park? In her view, humans reached Piauí 50,000, perhaps 100,000 years ago.
But where are the publications? She published a note in Nature in the 1990s, and we are still waiting for the publications. Niède and I were mortal enemies for 20 years. A few years ago we smoked the peace pipe. I have been to Piauí a few times, and we have even published papers on skeletons from there. Both skull morphologies were present in the park. It is very interesting. I had good training in the analysis of flaked-stone industries. Niède opened the entire lithic collection to me and Astolfo Araujo (now at the MAE). I left 99.9% convinced that there was a human occupation there more than 30,000 years old. But I retain that 0.1% of doubt, which is very significant.

What would it take to remove the doubt?
Niède should invite the best international specialists in lithic technology to examine the material and publish the results of the analyses. If she is right, we will have to throw out everything we know. My work will have been for nothing. But, thank God, not only mine, everyone's.

From drizzle to storm (Fapesp)

Storms are becoming more frequent, and rainfall in São Paulo has increased 30% in 80 years

MARCOS PIVETTA | Issue 195 – May 2012

© LEO RAMOS

Between 1933 and 2010, total annual rainfall in the metropolitan region increased by 425 mm, according to data from USP

The land of drizzle has become the megalopolis of storms. In roughly 80 years, the amount of rain that falls each year on the São Paulo Metropolitan Region, where one in every 10 Brazilians lives in an area equivalent to almost 1% of the national territory, increased by 425 millimeters (mm), half of what falls in much of the Brazilian semi-arid region. It jumped from an annual average of almost 1,200 mm in the 1930s to around 1,600 mm in the 2000s. Spread out linearly, it is as if every year it had rained 5.5 mm more than in the previous 12 months. Rainfall has not merely intensified; its pattern of occurrence has also changed. It is not simply raining a little more each day, an effect that would be barely perceptible in practice and incapable of causing the region's constant flooding. The number of days with heavy or moderate rain has grown, producing storms even in winter, normally the dry season. In contrast, the number of days with light rain, below 5 mm, has decreased.
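
As a quick check of the "5.5 mm more each year" figure, a minimal worked sum, assuming the 425 mm increase is spread evenly over the 77 years between 1933 and 2010:

$$\frac{425\ \text{mm}}{2010 - 1933} = \frac{425\ \text{mm}}{77\ \text{years}} \approx 5.5\ \text{mm per year}$$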

A pendulum-like regime of extremes has come to dominate the water cycle in the metropolitan region: when it rains, it generally rains a lot; but between very wet days there can be long dry spells. Greater São Paulo seems headed for the worst of both worlds, alternating intense periods of excess and of shortage of rain over the year. "Urbanization and the so-called heat island effect, along with air pollution, seem to play an important role in altering the rainfall pattern in São Paulo, especially in the normally wetter seasons, spring and summer," says Maria Assunção da Silva Dias, of the Instituto de Astronomia, Geofísica e Ciências Atmosféricas of the Universidade de São Paulo (IAG-USP), author of an as yet unpublished study on the subject. "In the drier months, the influence of global climate change accounts for 85% of the dynamics behind the increase in extreme rainfall." Although less sharply, the same upward trend in the number of days with intense rain has been detected in the Rio de Janeiro Metropolitan Region.

The new rainfall pattern in São Paulo is not like a passing cold front. It is here to stay, according to modeling by the Centro de Ciência do Sistema Terrestre of the Instituto Nacional de Pesquisas Espaciais (CCST-Inpe). The projections suggest that the current situation is a kind of prologue to the future plot. They indicate that, by the end of this century, there should be an increase in the number of days with rainfall above 10, 20, 30 and 50 mm, that is, in practically every significant rainfall range. There will only be a decrease in the number of days with very light rain, and possibly an increase in the number of dry days. "The seasonality of the rains should also change," says José Marengo, head of the CCST and coordinator of a still unpublished study on rainfall projections for the metropolitan region. "The number of storms outside the normally wetter season should grow, the kind of situation that catches the population by surprise." The simulations take into account only the possible effects on the region's rainfall regime of so-called global climate change, above all the rising concentrations of greenhouse gases, which warm the air. The weight that urbanization and air pollution may have on Greater São Paulo's rains is not considered in the projections.

Scarce green in the metropolis of concrete and asphalt: if 25% of Greater São Paulo's territory were covered by trees, the average temperature would fall by up to 2.5°C

One of the great difficulties in conducting studies of this kind, capable of revealing past climate fluctuations and serving as a benchmark for future projections, is the lack of long, reliable historical series with daily information on rainfall. Without them, it is impossible to perform a robust statistical analysis and get a clear picture of how much it used to rain and how rainfall was distributed across the years and the seasons (spring, summer, autumn and winter). Specialists are unanimous in pointing out this deficiency in Brazil. The highest-quality rainfall series for any point in the national territory is the one provided by the IAG meteorological station, located in the Parque do Estado, in the Água Funda neighborhood in the South zone of the city of São Paulo. Records began in 1933, when the unit was inaugurated, and continue to this day.

Another factor gives the data provided by the IAG meteorological station a unique character. The records were obtained inside a large green area of the city of São Paulo whose profile has not changed radically over almost eight decades, a rarity in a megalopolis with few parks and gardens. In other words, although the city underwent intense urbanization and soil sealing over the last century, the natural conditions around the Parque do Estado station did not change radically. It therefore makes sense to compare present data with past data, since the local environment is more or less the same. "In the North zone of São Paulo, at the Mirante de Santana, there is a meteorological station with measurements going back to the 1950s," says Pedro Leite da Silva Dias, a researcher at IAG-USP and director of the Laboratório Nacional de Computação Científica (LNCC), in Rio de Janeiro, also an author of the study on the evolution of rainfall in the metropolitan region. "But there were woods there only a few decades ago, and today there are buildings right next to the station."

Thanks to the wealth of data provided by the IAG station in the Parque do Estado, Assunção and her collaborators were able to discern finer details and subtler trends in the rainfall regime over the last eight decades. Between 1935 and 1944, rain exceeding 40 mm fell on about 30 days, with rainfall heavily concentrated in the summer months and, to a lesser extent, in spring and autumn. During that period there were no records of rainfall of that intensity in the winter months. The situation began to change from the mid-1940s onward. Since then, every decade has seen, on average, at least one rain event of that size in winter. Between 2000 and 2009, the total number of days with storms above 40 mm was around 70. A similar trend emerges when the occurrence of daily rainfall above 60 and 80 mm is analyzed decade by decade.
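
A minimal sketch, in Python with pandas, of the kind of decade-by-decade threshold count described above. This is an illustration, not the study's actual code; the file name and column names ("date", "rain_mm") are hypothetical placeholders.

```python
# Count, for each decade, how many days exceeded 40, 60 and 80 mm of rain
# in a daily rainfall series such as the IAG station record.
import pandas as pd

df = pd.read_csv("iag_daily_rainfall.csv", parse_dates=["date"])
df["decade"] = (df["date"].dt.year // 10) * 10  # e.g. 1937 -> 1930

for threshold_mm in (40, 60, 80):
    # Boolean series marking heavy-rain days, summed within each decade
    counts = (df["rain_mm"] > threshold_mm).groupby(df["decade"]).sum()
    print(f"Days with rainfall above {threshold_mm} mm, by decade:")
    print(counts.to_string())
```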

Broadly speaking, two main factors may be linked to the change in the metropolitan region's rainfall regime: global climate change, a large-scale phenomenon, and the heat island effect, localized and typical of megacities. The two act together. One amplifies the effects of the other, and in general it is hard to draw a dividing line between them. According to Marengo, most climate models indicate that rainfall will increase from the Plata basin to Southeast Brazil in the coming decades. Within that broader frame arises the specific question of climate in large cities, especially the heat island effect, which, by making heavily urbanized areas hotter, also acts as a magnet for rain.

A moister sea breeze
The surface temperature of the Atlantic Ocean along the São Paulo coast rose about one degree between 1950 and 2010, from 21.5°C to 22.5°C. It may seem little, but one consequence of this warming is a higher rate of evaporation from the ocean, fuel that loads the sea breeze with even more moisture. This process has repercussions on the climate above the Serra do Mar, on the plateau where the metropolitan region sits.

Why does much of the rain in Greater São Paulo fall between mid and late afternoon, after 3 or 4 pm? That is when the warm, moist sea breeze coming up from the Baixada Santista finishes climbing the mountains and reaches the megalopolis. "The Southeast zone is usually the first part of the capital to feel the effects of the breeze," comments Maria Assunção. The internal structure of cities, with many tall buildings, alters wind direction and can even force the sea breeze upward at certain points in the metropolitan region, locally favoring the formation of rain clouds. Urban pollution, especially aerosols, can either favor or inhibit storms over cities, depending on its quantity.

Studies conducted in the United States in the 1990s suggest that part of the increase in rainfall in some metropolitan regions, such as that of Saint Louis, is due to their growing urbanization. In that area of the state of Missouri, home to about 2.9 million people, rainfall increased between 5% and 25% in recent decades. A study published last year, conducted in large Indian cities, concluded that changes in the rainfall regime of those urban concentrations derive more from natural climate fluctuations than from local phenomena.

Mitigation strategies
In the case of the São Paulo Metropolitan Region, the USP study found a strong correlation between its urbanization and the changes in the rainfall regime. Episodes of extreme rain, above 40 mm, intensify as the population of São Paulo and its neighboring cities grows and the territories of these municipalities merge into practically a single continuous patch of occupation, with little green, a great deal of asphalt, and abundant sources of pollution and heat. From 1940 to 2010, the population of the metropolitan region grew tenfold, from 2 to 20 million inhabitants. The urban footprint grew twelvefold between 1930 and 2002, from 200 to 2,400 square kilometers. São Paulo's average annual temperature rose 3°C between 1933 and 2009, according to the records of the IAG station in the Parque do Estado, and total rainfall increased by a third. "We used to study this process theoretically," says Pedro Leite da Silva Dias. "Now we have more data, including from digital sources."

Mitigating the heat island effect may be one way to reduce episodes of extreme rain in urban centers. The physicist Edmilson Dias de Freitas, of IAG-USP, has been testing measures in computer simulations to gauge their impact on the climate of the São Paulo Metropolitan Region. Painting the surfaces of houses and buildings white would not be effective. "Pollution and weather quickly darken white paint in São Paulo," says Freitas. "There is no way to maintain it." The most effective measure would be to increase the city's vegetation cover. According to the simulations, if trees covered 25% of the metropolitan region's area, the average temperature could drop by between 1.5°C and 2.5°C. A milder climate would reduce the heat island effect and perhaps attract less rain to the region. Today, green areas account for less than 10% of Greater São Paulo.

By extension, if the largest Brazilian metropolis had more parks and fewer sealed surfaces, the most damaging effect of the storms would also be minimized: intense rains would produce fewer floods. Exposed soil absorbs more of the water that falls on it. "São Paulo violates a basic principle of drainage: rainwater has to soak into the ground where it falls," says civil engineer Denise Duarte, a professor at USP's Faculdade de Arquitetura e Urbanismo who collaborates with colleagues at the IAG. "Here, with much of the city sealed, the water simply runs off." One place's rain is transferred to another, generally to the low-lying points of the urban sprawl.

In the wettest zones, generally marked by hills and mountains, annual rainfall can reach 2,400 mm, an amount comparable to that of the Amazon forest. That is the case of the portion of Greater São Paulo crossed by the Serra do Mar, which takes in the southern stretch of the capital and parts of cities such as São Bernardo do Campo and Rio Grande da Serra, as well as stretches of Santana do Parnaíba and Cajamar, in the west of the metropolitan region. In the less humid areas, such as a large part of Mogi das Cruzes, rainfall can stay around 1,300 mm per year. Between these two extremes lie several intermediate levels of rainfall. The current figure of roughly 1,600 mm of rain per year recorded at the IAG station serves as a generic reference for the rainfall regime prevailing in the metropolitan region. Across an area that today spans 8,000 square kilometers and encompasses the territories of 39 municipalities, the amount of rain actually measured year by year at each weather station can vary considerably. A CCST study maps a kind of geographic distribution of rainfall in Greater São Paulo from historical series of daily rainfall totals supplied by 94 weather stations of the Departamento de Águas e Energia Elétrica (DAEE) of the State of São Paulo and the Agência Nacional de Águas (ANA). Data from a 25-year period, 1973 to 1997, were used in the study.

"This difference in rainfall levels holds throughout the year and in all seasons," says Guillermo Obregón, of the CCST, lead author of the study on the geographic distribution of rain in the metropolitan region. "In the wettest places, orographic, or relief, rains predominate." This mechanism makes warm, moist air masses rise when they collide with topographic elevations, condense and generate frequent precipitation. Whether because of its buildings and asphalt or because of its mountainous areas, Greater São Paulo seems to stand in the path of the rain.

Scientific articles
1. SILVA DIAS, M.A.F. et al. Changes in extreme daily rainfall for São Paulo, Brazil. Climatic Change. In press, 2012.
2. MARENGO, J.A. et al. The climate in future: projections of changes in rainfall extremes for the Metropolitan Area of São Paulo (Masp). Climate Research. In press, 2012.

© LEO RAMOS

More water in Guanabara

Rainfall in the Rio de Janeiro Metropolitan Region, the country's second largest with 12.5 million inhabitants, appears to show trends similar to São Paulo's. Although the fluminense capital has no rainfall series as long and reliable as the IAG-USP's, two stations of the Instituto Nacional de Meteorologia (Inmet) installed in Rio de Janeiro provide data of reasonable quality covering at least four decades of rain.

According to records obtained between 1967 and 2007 by the station in Alto da Boa Vista, the amount of water dumped on that neighborhood of the capital's North zone on days of heavy storms rose by an average of 11.7 mm per year. The station sits in the Parque Nacional da Tijuca, one of the largest urban forests on the planet. "There was a trend of increasing total rainfall in the metropolitan region, and the forest areas, such as Alto da Boa Vista, became wetter," says meteorologist Claudine Dereczynski, of the Universidade Federal do Rio de Janeiro (UFRJ), lead author of the as yet unpublished study.

The other Inmet station is in Santa Cruz, a neighborhood with fewer green areas in the West zone. There, the signs of intensifying rain were slight, according to data collected between 1964 and 2009, and were not considered statistically significant. "In Rio, the climate data of recent decades point more clearly to a rise in local temperature and more weakly to an increase in rainfall," says Claudine. Simulations by researchers at Inpe and UFRJ project, for the coming decades, an increase in the intensity and frequency both of intense-rain days and of dry days. Rainfall tends to become more unevenly distributed over the year and to concentrate heavily on a few days.

The Projects
1. Narrowing the Uncertainties on Aerosol and Climate Changes in São Paulo State – Nuance-SPS – no. 08/58104-8
2. Assessment of impacts and vulnerability to climate change in Brazil and strategies for adaptation option – no. 08/58161-1
Funding mechanism
1 and 2: FAPESP Research Program on Global Climate Change – Thematic Project
Coordinators
1. Maria de Fátima Andrade – IAG-USP
2. José Marengo – Inpe
Investment
1. R$570,084.46 and US$2,654,199.16
2. R$1,264,027.66

Game Over for the Climate (N.Y.Times)

May 9, 2012 – By JAMES HANSEN

GLOBAL warming isn’t a prediction. It is happening. That is why I was so troubled to read a recent interview with President Obama in Rolling Stone in which he said that Canada would exploit the oil in its vast tar sands reserves “regardless of what we do.”

If Canada proceeds, and we do nothing, it will be game over for the climate.

Canada’s tar sands, deposits of sand saturated with bitumen, contain twice the amount of carbon dioxide emitted by global oil use in our entire history. If we were to fully exploit this new oil source, and continue to burn our conventional oil, gas and coal supplies, concentrations of carbon dioxide in the atmosphere eventually would reach levels higher than in the Pliocene era, more than 2.5 million years ago, when sea level was at least 50 feet higher than it is now. That level of heat-trapping gases would assure that the disintegration of the ice sheets would accelerate out of control. Sea levels would rise and destroy coastal cities. Global temperatures would become intolerable. Twenty to 50 percent of the planet’s species would be driven to extinction. Civilization would be at risk.

That is the long-term outlook. But near-term, things will be bad enough. Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.

If this sounds apocalyptic, it is. This is why we need to reduce emissions dramatically. President Obama has the power not only to deny tar sands oil additional access to Gulf Coast refining, which Canada desires in part for export markets, but also to encourage economic incentives to leave tar sands and other dirty fuels in the ground.

The global warming signal is now louder than the noise of random weather, as I predicted would happen by now in the journal Science in 1981. Extremely hot summers have increased noticeably. We can say with high confidence that the recent heat waves in Texas and Russia, and the one in Europe in 2003, which killed tens of thousands, were not natural events — they were caused by human-induced climate change.

We have known since the 1800s that carbon dioxide traps heat in the atmosphere. The right amount keeps the climate conducive to human life. But add too much, as we are doing now, and temperatures will inevitably rise too high. This is not the result of natural variability, as some argue. The earth is currently in the part of its long-term orbit cycle where temperatures would normally be cooling. But they are rising — and it’s because we are forcing them higher with fossil fuel emissions.

The concentration of carbon dioxide in the atmosphere has risen from 280 parts per million to 393 p.p.m. over the last 150 years. The tar sands contain enough carbon — 240 gigatons — to add 120 p.p.m. Tar shale, a close cousin of tar sands found mainly in the United States, contains at least an additional 300 gigatons of carbon. If we turn to these dirtiest of fuels, instead of finding ways to phase out our addiction to fossil fuels, there is no hope of keeping carbon concentrations below 500 p.p.m. — a level that would, as earth’s history shows, leave our children a climate system that is out of their control.
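
As a back-of-envelope check of the 120 p.p.m. figure (my own arithmetic, not Hansen's), using the commonly cited conversion of roughly 2.1 gigatons of carbon per p.p.m. of atmospheric CO2 and assuming the emitted carbon stays airborne:

$$\frac{240\ \text{GtC}}{2.1\ \text{GtC per p.p.m.}} \approx 114\ \text{p.p.m.} \approx 120\ \text{p.p.m.}$$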

We need to start reducing emissions significantly, not create new ways to increase them. We should impose a gradually rising carbon fee, collected from fossil fuel companies, then distribute 100 percent of the collections to all Americans on a per-capita basis every month. The government would not get a penny. This market-based approach would stimulate innovation, jobs and economic growth, avoid enlarging government or having it pick winners or losers. Most Americans, except the heaviest energy users, would get more back than they paid in increased prices. Not only that, the reduction in oil use resulting from the carbon price would be nearly six times as great as the oil supply from the proposed pipeline from Canada, rendering the pipeline superfluous, according to economic models driven by a slowly rising carbon price.
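
A toy sketch of the fee-and-dividend arithmetic described above. All figures (fee level, emissions, population, household footprint) are hypothetical or rough approximations, chosen only to illustrate why below-average energy users would come out ahead:

```python
# Fee-and-dividend, as Hansen describes it: a fee per ton of CO2 is
# collected from fossil fuel companies and returned in full, per capita,
# every month. The government keeps nothing.
FEE_USD_PER_TON = 15.0     # hypothetical starting fee, US$ per ton of CO2
US_EMISSIONS_TONS = 5.4e9  # rough annual US fossil CO2 emissions, tons
POPULATION = 3.1e8         # rough 2012 US population

annual_revenue = FEE_USD_PER_TON * US_EMISSIONS_TONS
monthly_dividend = annual_revenue / POPULATION / 12
print(f"Monthly per-capita dividend: ${monthly_dividend:.2f}")

# A household's net position is its dividends minus the fee embedded in
# the fossil energy it buys, so below-average users come out ahead.
household_size = 3
household_tons = 15.0      # hypothetical household CO2 footprint, tons/year
net_per_year = (household_size * monthly_dividend * 12
                - FEE_USD_PER_TON * household_tons)
print(f"Hypothetical household net gain per year: ${net_per_year:.2f}")
```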

But instead of placing a rising fee on carbon emissions to make fossil fuels pay their true costs, leveling the energy playing field, the world’s governments are forcing the public to subsidize fossil fuels with hundreds of billions of dollars per year. This encourages a frantic stampede to extract every fossil fuel through mountaintop removal, longwall mining, hydraulic fracturing, tar sands and tar shale extraction, and deep ocean and Arctic drilling.

President Obama speaks of a “planet in peril,” but he does not provide the leadership needed to change the world’s course. Our leaders must speak candidly to the public — which yearns for open, honest discussion — explaining that our continued technological leadership and economic well-being demand a reasoned change of our energy course. History has shown that the American public can rise to the challenge, but leadership is essential.

The science of the situation is clear — it’s time for the politics to follow. This is a plan that can unify conservatives and liberals, environmentalists and business. Every major national science academy in the world has reported that global warming is real, caused mostly by humans, and requires urgent action. The cost of acting goes far higher the longer we wait — we can’t wait any longer to avoid the worst and be judged immoral by coming generations.

James Hansen directs the NASA Goddard Institute for Space Studies and is the author of “Storms of My Grandchildren.”

Study links biodiversity and language loss (BBC)

13 May 2012

By Mark Kinver, Environment reporter, BBC News

Brazilian tribesman (Image: AP). The study identified that high-biodiversity areas also had high linguistic diversity

The decline of linguistic and cultural diversity is linked to the loss of biodiversity, a study has suggested.

The authors said that 70% of the world’s languages were found within the planet’s biodiversity hotspots.

Data showed that as these important environmental areas were degraded over time, cultures and languages in the area were also being lost.

The results of the study have been published in the Proceedings of the National Academy of Sciences (PNAS).

“Biologists estimate annual loss of species at 1,000 times or more greater than historic rates, and linguists predict that 50-90% of the world’s languages will disappear by the end of the century,” the researchers wrote.

Lead author Larry Gorenflo from Penn State University, in the US, said previous studies had identified a geographical connection between the two, but did not offer the level of detail required.

Dr Gorenflo told BBC News that the limitation of the earlier data was that the languages were either listed by country or marked with a single dot on the map to indicate their location.

“But what you did not know was if the area extended two kilometres or 200 kilometres, so you really did not get a sense of the extent of the language,” he explained.

“We used improved language data to really get a more solid sense of how languages and biodiversity co-occurred and an understanding of how geographically extensive the language was.”

He said the study achieved this by also looking at smaller areas with high biodiversity, such as national parks or other protected habitats.

“When we did that, not only did we get a sense of co-occurrence at a regional scale, but we also got a sense that co-occurrence was found at a much finer scale,” he said.

“We are not quite sure yet why this happens, but in a lot of cases it may well be that biodiversity evolved as part-and-parcel of cultural diversity, and vice versa.”

In their paper, the researchers pointed out that, out of the 6,900 or more languages spoken on Earth, more than 4,800 occurred in regions containing high biodiversity.

Dr Gorenflo described these locations as “very important landscapes” which were “getting fewer and fewer” but added that the study’s data could help provide long-term security.

“It provides a wonderful opportunity to integrate conservation efforts – you can have people who can get funding for biological conservation, and they can collaborate with people who can get funding for linguistic or cultural conservation,” he suggested.

“In the past, it was hard to get biologists to look at people.

“That has really changed dramatically in the past few years. One thing that a lot of biologists and ecologists are now seeing is that people are part of these ecosystems.”

‘Climategate’ Undermined Belief in Global Warming Among Many TV Meteorologists, Study Shows (Science Daily)

ScienceDaily (Feb. 22, 2011) — A new paper by George Mason University researchers shows that ‘Climategate’ — the unauthorized release in late 2009 of stolen e-mails between climate scientists in the U.S. and United Kingdom — undermined belief in global warming and possibly also trust in climate scientists among TV meteorologists in the United States, at least temporarily.

In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication and Center for Social Science Research asked these meteorologists early in 2010, when news stories about the climate e-mails were breaking, several questions about their awareness of the issue, attention to the story and impact of the story on their beliefs about climate change. A large majority (82 percent) of the respondents indicated they had heard of Climategate, and nearly all followed the story at least “a little.”

Among the respondents who indicated that they had followed the story, 42 percent indicated the story made them somewhat or much more skeptical that global warming is occurring. These results stand in stark contrast to the findings of several independent investigations of the e-mails, conducted later, which concluded that no scientific misconduct had occurred and that nothing in the e-mails casts doubt on the evidence that global warming is occurring.

The results, which were published in the journal Bulletin of the American Meteorological Society, also showed that the doubts were most pronounced among politically conservative weathercasters and those who either do not believe in global warming or do not yet know. The study showed that neither age nor professional credentials were a factor, but men — independent of political ideology and belief in global warming — were more likely than their female counterparts to say that Climategate made them doubt that global warming was happening.

“Our study shows that TV weathercasters — like most people — are motivated consumers of information in that their beliefs influence what information they choose to see, how they evaluate information, and the conclusions they draw from it,” says Ed Maibach, one of the researchers. “Although subsequent investigations showed that the climate scientists had done nothing wrong, the allegation of wrongdoing undermined many weathercasters’ confidence in the conclusions of climate science, at least temporarily.”

The poll of weathercasters was conducted as part of a larger study funded by the National Science Foundation on American television meteorologists. Maibach and others are now working with a team of TV meteorologists to test what audience members learn when weathercasters make efforts to educate their viewers about the relationship between the changing global climate and local weather conditions.

Ultimately, the team hopes to answer key research questions about how to help television meteorologists nationwide become an effective source of informal science education about climate change.

“Most members of the public consider television weather reporters to be a trusted source of information about global warming — only scientists are viewed as more trustworthy,” says Maibach. “Our research here is based on the premise that weathercasters, if given the opportunity and resources, can become an important source of climate change education for a broad cross section of Americans.”

Weathercasters Take On Role of Science Educators; Feel Some Uncertainty On Issue of Climate Change (Science Daily)

ScienceDaily (Mar. 29, 2010) — In a time when only a handful of TV news stations employ a dedicated science reporter, TV weathercasters may seem like the logical people to fill that role, and in many cases they do.

In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication shows that two-thirds of weathercasters are interested in reporting on climate change, and many say they are already filling a role as an informal science educator.

“Our surveys of the public have shown that many Americans are looking to their local TV weathercaster for information about global warming,” says Edward Maibach, director of the Center for Climate Change Communication. “The findings of this latest survey show that TV weathercasters play — or can play — an important role as informal climate change educators.”

According to the survey, climate change is already one of the most common science topics TV weathercasters discuss — most commonly at speaking events, but also at the beginning or end of their on-air segments, on blogs and web sites, on the radio and in newspaper columns.

Weathercasters also indicated that they are interested in personalizing the story for their local viewers — reporting on local stories such as potential flooding/drought, extreme heat events, air quality and crops. About one-quarter of respondents said they have already seen evidence of climate change in their local weather patterns.

“Only about 10 percent of TV stations have a dedicated specialist to cover these topics,” says University of Texas journalism professor Kristopher Wilson, a collaborator on the survey. “By default, and in many cases by choice, science stories become the domain of the only scientifically trained person in the newsroom — weathercasters.”

Many of the weathercasters said that having access to resources such as climate scientists to interview and high-quality graphics and animations to use on-air would increase their ability to educate the public about climate change.

However, despite their interest in reporting more on this issue, the majority of weathercasters (61 percent) feel there is a lot of disagreement among scientists about the issue of global warming. Though 54 percent indicated that global warming is happening, 25 percent indicated it isn't, and 21 percent said they don't know yet.

“A recent survey showed that more than 96 percent of leading climate scientists are convinced that global warming is real and that human activity is a significant cause of the warming,” says Maibach. “Climate scientists may need to make their case directly to America’s weathercasters, because these two groups appear to have a very different understanding about the scientific consensus on climate change.”

This survey is one part of a National Science Foundation-funded research project on meteorologists. Using this data, Maibach and his research team will next conduct a field test of 30-second, broadcast-quality educational segments that TV weathercasters can use in their daily broadcasts to educate viewers about the link between predicted (or current) extreme weather events in that media market and the changing global climate.

Ultimately, the team hopes to answer key research questions supporting efforts to activate TV meteorologists nationwide as an important source of informal science education about climate change.

Jô Soares interviews Ricardo Augusto Felício on climate change + commentary by Alexandre Costa

Jô Soares show, May 2, 2012

* * *

Commentary on the interview by Alexandre A. Costa, one of Brazil's most respected meteorologists:

Climate Change Denial and the Organized Right (May 10, 2012 – posted on Facebook)

You have probably watched, or heard about, the interview recently aired on Jô's show with Mr. Ricardo Felício who, despite being a professor of Geography at USP, attacked the community of climate scientists, sketched out a series of conspiracy theories and committed absurdities that make no scientific sense whatsoever, such as the claims that "there is no sea level rise," "the greenhouse effect does not exist," "the ozone layer does not exist" and "the Amazon forest would regenerate within 20 years of being cleared," reaching his peak with a senseless explanation for the high temperature of Venus, based on a totally absurd reading of the gas laws.

So what would lead a person who is, in principle, tied to the academic community to such an absurd posture? At first, I took it for media climbing. Since the man's CV shows no minimally relevant output, I assumed that bashing the mainstream was just a way to draw attention, attract publicity, gain fame, and so on. How naive of me.

Após uma breve pesquisa, encontrei este trecho de entrevista de Ricardo Felicio disponível em http://www.fakeclimate.com/arquivos/EntrevistasImprensaFake/EntrevistaAqGloFINAL.pdf:

Interviewer: “Do you know of any institution that supports your thinking? How does it work? And what does it do?” Ricardo Felício: “I recommend that you look up, here in Brazil, the MSIa – Movimento de Solidariedade Ibero-Americana (Ibero-American Solidarity Movement).”

But who is this MSIa? A far-right group specializing in conspiracy theories and in attacks on Greenpeace (“a political instrument of the international oligarchies”), on the Landless Workers’ Movement, the MST (“an instrument of war against the Brazilian State”), on the São Paulo Forum (“it brings together revolutionary groups whose aim is to destabilize the Armed Forces”), on the Pastoral Land Commission, and so on. I went to the organization’s website myself, and the latest from these people is a campaign against the Truth Commission and in favor of the military (“Who benefits from a military crisis?”)! Anyone who wants to know where they stand need only check http://www.msia.org.br/

A little more searching, and I found Ricardo Felício being quoted (“The UN has found a way to implement its global government, and the world will be run by pseudoscientific panels”) where? On http://www.midiasemmascara.org/, the website of the ultra-right-winger Olavo de Carvalho…

It seems symptomatic that, on the eve of the deadline for vetoing the ruralist Forest Code, someone with this kind of tie (the MSIa is associated with the UDR) comes forward to say that the Amazon can be cleared because it will regenerate in twenty years… It is telling that the accusations of an “environmentalist,” “communist” or “international governance” agenda, or whatever other delusion the climate change deniers put forward when they try to politicize and ideologize the question, only show where that politicization and ideologization really comes from, and of what stripe.

As I like to say, CO2 molecules have no ideology: they absorb infrared radiation regardless not only of political positions but even of the humans who hold them. An increase in their concentration in Earth’s atmosphere could have no effect other than warming the global climate system. Denying an obvious scientific truth thus only makes sense for those whose interests are at stake. And it becomes clear: this gentleman, academically a fraud, is in fact a right-wing militant. To paraphrase those who so admire him, he needs to appear in the media without the mask of “USP professor,” “climatologist” and the like, showing his true face instead.

Alexandre A. Costa, Ph.D.
Full Professor
Master’s Program in Applied Physical Sciences
Universidade Estadual do Ceará

Climate Change Denial and the Organized Right – Part II: More Revelations (May 13, 2012 – posted on Facebook)

It is not hard to keep connecting the dots after Mr. Ricardo Felício’s appearance on Jô Soares’s show. Why would anyone willingly expose himself to ridicule in that way? How could someone, in the position of a doctor in Geography, a USP professor and a “climatologist,” assassinate not only recent scientific knowledge but basic laws of Physics and fundamental notions of Chemistry, Ecology and so on? What would lead someone to insult so crudely the Brazilian and international academic community, and above all us, Climate Scientists?

What I intend to show is that reaching that point requires motivations. And these, my friends, are not mere vanity or a craving for stardom. There is an agenda.

For those who want to join me in tracing the motivation behind that interview, I ask you to visit (even if it gives them some extra traffic) the video repository of Brazil’s home-grown pop star of climate change denial at http://www.youtube.com/user/TvFakeClimate. There, the links lead to the familiar site http://www.msia.org.br/ of the “Movimento de Solidariedade Íbero-Americana,” whose pompous name conceals LaRouchist neo-fascism, specialized in conspiracy theories and manipulation and, as its own site makes plain, a visceral enemy of the MST, the feminist movement, the human rights movement, the Truth Commission, etc.; to the no less right-leaning http://www.midiaamais.com.br/, whose articles I could not manage to read to the end, but which amount to right-wing attacks on Obama, ridicule of the movement of the Pinheirinho residents in São José dos Campos, opposition to the Supreme Court’s ruling that quotas are constitutional and, of course, climate change denial and attacks on the IPCC; to an anti-environmentalist site called http://ecotretas.blogspot.com/, which in turn links to neo-fascist sites such as “vermelhos não” (http://vermelhosnao.blogspot.com.br/search/label/verdismo), which incidentally is running the “Não Veta, Dilma” campaign, to sites specializing in conspiracy theories such as http://paraummundolivre.blogspot.com.br/ and even to exotic right-wingers, advocates of restoring the monarchy in Portugal (http://quartarepublica.wordpress.com/) or neo-Salazarists (http://nacionalismo-de-futuro.blogspot.com.br/).

As I have said on several occasions, it is not one’s political-ideological choice that makes one right or wrong on the climate question. I have colleagues in my research community of every political-ideological stripe (which by itself would make it rather hard for us to band together in a “conspiracy”… what was it again… ah! to “achieve UN world governance via climate panels,” the sort of hysteria typical of the most unhinged American right). The climate question is objective. The mechanisms that control the climate are known, including the role of greenhouse gases. The measurements, the model results (attacked dishonestly by the interviewee), the paleoclimate records: all converge. And of all the possible hypotheses for the warming of the climate system, the anthropogenic contribution via greenhouse gas emissions was the only one left standing after every test. Recognizing this does not depend on ideology; one need only open one’s eyes. The kind of public policy to adopt in dealing with the impacts, with adaptation to the changes and with their mitigation, there, yes, political choices gain degrees of freedom.

The problem is that for a certain political-ideological fringe, namely the far right, there really is an incompatibility with any environmental agenda that might mean public control over private capital. There is also a need to win support by stroking the public’s hidden wishes (such as the wish that nothing need be done about climate change) and by appealing to nationalism (typical of the Mussolinis, the Hitlers, the Francos, the Salazars and of so many right-wing dictatorships in Latin America), even if that occasionally means adopting a falsely anti-imperialist discourse. With such “higher” aims, which include sabotaging the campaign for a presidential veto of the monstrosity that is the Forest Code passed by the lower house, why bother with commitment to scientific truth? Why bother with ethics and respectful treatment of colleagues in the academic world?

It is striking how those who accuse us of “fraud,” “conspiracy” and the like are precisely the ones who practice them. As I have written in other texts on the subject, the pseudo-arguments presented by the deniers must be scientifically debunked (and I have been doing that elsewhere), but, as my colleague Michael Mann aptly notes, they are like the hydra. They always have more lies up their sleeve to throw around, and they have no concern whatsoever with presenting a coherent whole in opposition to the scientific community’s views. What interests them is to sow confusion, gain political ground, delay actions to protect climate stability and buy time for those who bankroll them at the base (there may be deniers with no direct ties to the oil industry and others, but that industry’s connection to the coordinated worldwide campaign against climate science is by now evident). Pseudo-science and intellectual imposture are the hydra’s heads. The monster’s heart is the political-ideological agenda. But the sword of truth is long enough to strike it a mortal blow!

Alexandre A. Costa, Ph.D.
Full Professor
Master’s Program in Applied Physical Sciences
Universidade Estadual do Ceará

In Defense of Climate Science (May 10, 2012 – posted on Facebook)

I have been deeply worried by the recent attacks on Climate Science, among other reasons because they have come to form a strange amalgam uniting the Tea Party, the petrochemical industry and people who seem to believe in a great imperialist conspiracy to keep the periphery of capitalism from “developing” by preventing it from burning its fossil fuel reserves, which, if you will pardon the expression, is in itself an utterly narrow view of “development.”

But this is not an ideological question, if only because, were it one, I would stand far from Al Gore. It is a scientific question, for CO2 molecules have no ideology. What they do possess, like certain other molecules (CH4 and water vapor itself), is a property the majority gases of our atmosphere lack: a mode of oscillation whose frequency coincides with a region of the electromagnetic spectrum known as the infrared. Heat retention is a consequence of the presence of these gases (minor constituents though they are) in Earth’s atmosphere. Were it not for them, Earth would have a mean temperature of -18 degrees, in contrast with the moderate 15 we enjoy, to say nothing of their role in keeping it within mild bounds. Earth is not Mercury, which, having no atmosphere, freely radiates back the energy absorbed from the Sun on its day side, producing temperature contrasts of 430 degrees by day and -160 degrees at night. Fortunately, neither is it Venus, whose cloud cover lets less solar energy reach its surface than reaches Earth’s, but whose greenhouse effect, caused by an atmosphere composed almost exclusively of CO2, raises its temperature to a practically constant 480 degrees.
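As a rough check on the -18 degree figure, here is a minimal sketch (my illustration, not part of Costa’s text) of the textbook zero-dimensional energy balance, assuming a solar constant of about 1361 W/m² and a planetary albedo of about 0.3:

```python
# Effective (no-greenhouse) temperature from simple energy balance:
# absorbed solar flux = emitted thermal flux  =>  S*(1 - a)/4 = sigma * T^4
S = 1361.0        # solar constant, W/m^2 (approximate)
a = 0.3           # planetary albedo (approximate)
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

T = (S * (1 - a) / (4 * sigma)) ** 0.25
print(f"Effective temperature: {T:.0f} K, i.e. about {T - 273.15:.0f} C")
```

This yields roughly 255 K, about -18 C; the gap between that and the observed mean of about 15 C is the greenhouse warming Costa describes.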

To ignore these simple scientific ideas, namely that CO2 is a greenhouse gas (known and measured since the 19th century by Tyndall, Arrhenius and others), with a mechanism well explained by the physics of its molecular structure, and to ignore the well-known global effect CO2 has on a neighboring planet, well established in astronomy since the late Sagan, makes no sense, especially in academia, where some of the most vocal deniers are found. To them I would recall something basic about the scientific method. On the one hand, science has no dogmas, no definitive truths. Its truths are always, by construction, partial and provisional (a good thing, or it would become something dull and tedious like, say, a religion). On the other hand, scientific knowledge is cumulative and, in that sense, there is no walking backwards! Only when a theory fails is a new one justified, and the new one cannot be merely the negation of the old: it must be able to reproduce all of the old theory’s successes (as with Classical Mechanics and Relativity, the latter reducing to the former at low speeds).

It is not a matter of belief. “Monotony” aside, this is well-established, well-known science. As much so as Universal Gravitation (which is also “just” a theory) or the Evolution of Species.

INJUSTICE, DISRESPECT AND UNDERESTIMATION

Climate Scientists have been under attack on the basis of factoids that at no point resemble the reality of our field. No science today is as public and as open. Anyone who wishes can easily obtain, in most cases directly over the internet, observed climate data that clearly demonstrate global warming (www.cru.uea.ac.uk/cru/data/, among others), the modeling output being generated right now that will feed into the IPCC’s 5th report (http://cmip-pcmdi.llnl.gov/cmip5/data_portal.html), or paleoclimate proxy data used to analyze past climates (www.ncdc.noaa.gov). Anyone can download the IPCC reports at www.ipcc.ch and follow the references, which are peer-reviewed and published, in their overwhelming majority and especially in the case of the Working Group that deals with the Physical Science Basis, in high-impact journals, both general (Science, Nature) and field-specific. I doubt that in our universities, full of laboratories with private-sector agreements, whether in materials engineering or in biochemistry, there is any segment so open, so willing to sit at the table, share data, survey the state of the art of its science and collectively produce a synthesis report. I doubt it! I challenge anyone to show otherwise!

We scientists who take part in these panels are not “government representatives.” Nothing is created or invented in these panels beyond a synthesis of the science that is produced independently and published in the peer-reviewed literature. Members of the academic community can, in fact, easily inform themselves about how these panels work by asking colleagues in the Brazilian scientific community who have taken part, and still take part, in the initiatives of the IPCC and the PBMC, before voicing an opinion, so that they do not end up, in practice, defaming what they do not know. Some people, without the slightest critical scrutiny of the IPCC’s detractors, simply repeat their verbiage, when they could instead be skeptical of the “skeptics.”

But they are not. At no point do they question the real motivations of the two or three (happily, they are that rare) who adopt the lamentable posture of anti-science denial, whether because they are openly corrupt servants of the petrochemical industry or simply because they have a vanity too large for the secondary role they would play if, like us, they were spending enormous energy, generally in near anonymity, laying brick after brick in the edifice of Climate Science. One must know how to distinguish honest, genuine skepticism, which is healthy in science and consonant with sincere doubt and a critical attitude, from religious denial, based on faith and on the blind need to defend a given point of view regardless of whether it has any basis in reality, and, above all, from plain and simple scoundrelism, which is what some of the deniers promote. The possible “success” of these ideas with the public is, to my mind, territory for social psychology, but the best analogy I have is the popularity of religious ideas, which are in general comforting lies preferred over unpleasant truths.

True skepticism led to what the Berkeley physicists did (http://www.berkeleyearth.org/index.php). Initially questioning the results obtained by our community, they armed themselves with an enormous worldwide temperature database, broader than those available to the UK’s Hadley Centre or to NASA. They tested other methodologies and went so far as to exclude the weather stations used by our research centers. Richard Muller, the initiative’s founder, was initially so doubtful of our results that he managed to raise funds from the notorious Koch Foundation, openly hostile to Climate Science. And what did Muller and his partners find? The same result we already knew. The Earth is warming, and this warming accelerated considerably in the final decades of the 20th century. The warming approaches one degree and therefore stands far above all the natural fluctuations recorded since instrumental records began. Indeed, the study also confirmed something else we knew: that the data from the University of East Anglia (the very data behind the farce staged under the grandiose name “climategate”; the very scientists who were persecuted and whose reputations were ignominiously attacked, with repercussions for their professional careers and personal lives) contain an error… on the low side! The warming suggested by the CRU/UEA data is a tenth of a degree below that of the other data sources and, of course, none among us accuses them of dishonesty for that.

Another imposture (and unfortunately, harsh as the term is, I think this is a case where it applies) is the underestimation of our community’s intelligence, combined with ignorance of the material it produces. The IPCC’s 4th report already contains a chapter devoted exclusively to Paleoclimatology, that is, to past climate. I have personally devoted great effort to analyzing proxies of past climate and to modeling past climate conditions. Since the IPCC’s first report there has been a permanent concern with discerning the natural signal and separating the anthropogenic signal from it. To that end, the role of variations in solar activity, volcanic emissions and so on is assessed. We have already evaluated the possible natural influences and ruled them out as the cause of the observed warming.

In this sense there is no room for sophistry or evasion. Regarding the paleoclimate records, which can retrace the history of temperature and greenhouse gas concentrations from 800,000 years ago to the present, we all know that in the past a small warming of the planet preceded the rise in greenhouse gas concentrations. That happened before the end of every glacial era. But it is obtuse reasoning to conclude from this that CO2 plays no role or, in the deniers’ words, that it “is consequence, not cause.” There are many feedback processes in the climate system, and this is one of the best examples. The subtle variations in insolation, and in its distribution over Earth’s surface, associated with the orbital cycles are, as everyone in the field knows, far too small to explain the large temperature differences between the glacial periods (“ice ages”) and the interglacials (the briefer warm periods between them). But a subtle warming, after a few centuries, proved sufficient to raise natural emissions of CO2 and methane, which cause a greenhouse effect and amplify the process. This feedback was only reined in, under conditions free of human action, when the orbital conditions changed again, leading to a subtle cooling, which induced CO2 uptake by the Earth system, which in turn amplified the cooling, and so on.
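To make the amplification argument concrete, here is a minimal sketch (my illustration, not Costa’s) of how a feedback that turns each degree of warming into a further fraction f of a degree multiplies an initial orbital nudge; both numbers below are purely illustrative:

```python
# Toy feedback amplifier: an initial warming dT0 releases CO2/CH4 that adds
# f * dT0 of warming, which releases more gas, and so on. The series
# dT0 * (1 + f + f^2 + ...) converges to dT0 / (1 - f) for f < 1.
dT0 = 0.2   # initial orbitally driven warming, degrees (illustrative)
f = 0.6     # feedback fraction (illustrative)

total, increment = 0.0, dT0
while increment > 1e-6:
    total += increment
    increment *= f

print(f"Amplified warming: {total:.2f} C (closed form: {dT0 / (1 - f):.2f} C)")
```

The same structure run in reverse, with a small initial cooling, captures the CO2-uptake feedback described at the end of the paragraph above.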

But it is not because people die of cancer and heart attacks that a murderer cannot be held responsible! Because people die naturally of strokes, would anyone claim that “it is impossible for a gunshot to kill,” or that no one should ever again be tried for murder? In the past, a small warming was needed to trigger natural emissions and a rise in CO2 concentration, and only then did the warming accelerate. Today there is an independent source of CO2, foreign to the natural cycles: the burning of fossil fuels! I should stress, moreover, that even the isotopic analysis (fossil fuels differ in composition from other sources) is clear: the excess CO2 in Earth’s atmosphere does indeed come, for the most part, from oil, coal and natural gas! A minimum of genuine scientific scrutiny makes plain that today’s rise in atmospheric CO2 concentrations is eminently anthropogenic and that this is what has been driving the observed climate changes. It is no longer possible to hide the sun, or rather to hide the greenhouse gases, with a sieve! The paleoclimate records show that the current warming is unprecedented in the last 2,500 years. They show that the current CO2 concentration is 110 ppm above pre-industrial levels and nearly 100 ppm above anything seen in the last 800,000 years. They show that this figure exceeds the difference in CO2 concentration between the interglacials and the “ice ages,” and that such a difference does indeed matter greatly for the climate.

WHAT THE REAL ERRORS ARE

Some people claim to be skeptical, critical and distrustful of the majority of our community of climate scientists, yet fail to see the fundamental error they commit: an utter lack of skepticism, criticism and distrust toward those who attack us. The posture of those who fight Climate Science on petrochemical-industry funding, or in association with the most reactionary partisan and media sectors, is self-explanatory: covering up reality is what matters to them. But not only that. They range from people who receive oil-industry money directly to blowhards who have long ceased doing real scientific work in the field and who, incapable of staying in the public eye by working seriously to advance our science, by grappling with the genuine uncertainties, helping to collect data, improving methods and models and so on, attack the rest of the community merely to keep the spotlight on themselves. Strange and showy, like a peacock’s feathers. And prosaic, like the evolutionary mechanisms that brought such feathers into being. It is also necessary to combat the view of those who give this attack a false “left-wing” veneer, for they resort to conspiracy theories, a pathological distortion of critical reasoning. To fight the wrong target with the wrong weapon is worse than to disarm before the fight.

Is the IPCC perfect? Of course not. It has made errors. But do you want to know what they actually are? One thing must be clear to everyone: the IPCC’s assessments tend to be conservative. Its temperature projections for the years after 2000 are essentially on target, but do you know what has happened with its projections of sea level rise and Arctic melting? They are underestimates. That’s right: the real picture is more serious than the IPCC’s 4th report indicates. And again, this is not a political matter; it stems from the limitations, at the time, of cryosphere models, which were unable to account for important processes that drive melting. Drawing on the papers published in the meantime, the 5th report will likely correct these limitations and show a picture closer to the real gravity of the problem when it is published in 2013-2014.

WHAT IS THE REAL IDEOLOGICAL QUESTION?

It makes no sense to “believe” or not in gravity, in evolution or in the greenhouse effect. This is not an “ideological option” (even though, in the US, there is a strong correlation between ideology and views of science among the most reactionary Republican voters, who lend their ears to the detractors of climate science and who also want Darwin out of the schools).

The real ideological question is that climate change is a process of extreme inequality, from its roots to its impacts. Those who benefited most from greenhouse gas emissions were, and continue to be, the dominant classes of the central capitalist countries. Together with the mega-conglomerates of finance capital, the petrochemical industry, the mining sector (coal mining included), the energy sector and others have concentrated wealth while using the atmosphere as their great garbage can. Even more disparate than the current carbon “footprint” (itself extremely unequal if we compare Americans, Europeans and Australians on one side with Africans on the other) is the “historical footprint,” that is, what has already been emitted, accumulated from each country’s emissions, which makes Europe and, after it, the US the great historical emitters.

Cruelly, in return, the impacts of climate change will fall on the poorest countries, on small nations and, above all, on the poor of the poor countries, on the most vulnerable. Loss of territory in island nations; water and food security problems in semi-arid regions (so vast in the cradle of our species, the African continent); the effects of severe weather events (which, on very clear physical grounds, are expected to become more frequent on a warmer planet); damage to coastal marine ecosystems and to forests, hitting fishing and gathering activities; the collapse of traditional agriculture… and where does all of this land? On those on the lower floors! Those on the upper floors speak of “adaptation” and have far more instruments with which to adapt. Our interest, in this case, lies in being conservative about the climate and putting a brake on this botched, disorderly “experiment” of altering the chemical composition of Earth’s atmosphere and the planetary energy balance! For the majority of the 7 billion inhabitants of this sphere, climate stability matters!

Some of the richest, in truth, see global warming as an “opportunity”… An “opportunity,” of course, to expand agribusiness into the future arable lands of northern Canada and Siberia and to drill for oil in the ocean that will open up as the Arctic ice increasingly melts.

It is therefore necessary to see that a genuine imposture is on the loose and that Science must be defended. A rock is a rock; a tree is a tree; a CO2 molecule is a CO2 molecule, regardless of ideology. But those below will only be able to arm themselves (ourselves) to transform society if they (we) are well informed, and that is why the absurdities uttered by the detractors of Climate Science must be fought.

Alexandre Costa holds a bachelor’s degree and a master’s degree in Physics from the Universidade Federal do Ceará and a Ph.D. in Atmospheric Sciences from Colorado State University, with postdoctoral work at Yale University, and has published in several scientific journals, including Science, the Journal of the Atmospheric Sciences and Atmospheric Research. He is a CNPq research productivity fellow and a member of the Painel Brasileiro de Mudanças Climáticas (PBMC, the Brazilian Panel on Climate Change).

Television Has Less Effect On Education About Climate Change Than Other Forms Of Media (Science Daily)

ScienceDaily (Oct. 16, 2009) — Worried about climate change and want to learn more? You probably aren’t watching television then. A new study by George Mason University Communication Professor Xiaoquan Zhao suggests that watching television has no significant impact on viewers’ knowledge about the issue of climate change. Reading newspapers and using the web, however, seem to contribute to people’s knowledge about this issue.

The study, “Media Use and Global Warming Perceptions: A Snapshot of the Reinforcing Spirals,” looked at the relationship between media use and people’s perceptions of global warming. The study asked participants how often they watch TV, surf the Web, and read newspapers. They were also asked about their concern and knowledge of global warming and specifically its impact on the polar regions.

“Unlike many other social issues with which the public may have first-hand experience, global warming is an issue that many come to learn about through the media,” says Zhao. “The primary source of mediated information about global warming is the news.”

The results showed that people who read newspapers and use the Internet more often are more likely to be concerned about global warming and believe they are better educated about the subject. Watching more television, however, did not seem to help.

He also found that individuals concerned about global warming are more likely to seek out information on this issue from a variety of media and nonmedia sources. Other forms of media, such as the Oscar-winning documentary “An Inconvenient Truth” and the blockbuster thriller “The Day After Tomorrow,” have played important roles in advancing the public’s interest in this domain.

Politics also seemed to have an influence on people’s perceptions about the science of global warming. Republicans are more likely to believe that scientists are still debating the existence and human causes of global warming, whereas Democrats are more likely to believe that a scientific consensus has already been achieved on these matters.

“Some media forms have clear influence on people’s perceived knowledge of global warming, and most of it seems positive,” says Zhao. “Future research should focus on how to harness this powerful educational function.”

Support for Climate Policy Linked to People’s Perceptions About Scientific Agreement Regarding Global Warming (Science Daily)

ScienceDaily (Nov. 21, 2011) — People who believe there is a lot of disagreement among scientists about global warming tend to be less certain that global warming is happening and less supportive of climate policy, researchers at George Mason, San Diego State, and Yale Universities report in a new study published in the journal Nature Climate Change.

A recent survey of climate scientists conducted by researchers at the University of Illinois found near unanimous agreement among climate scientists that human-caused global warming is happening.

This new George Mason University study, however, using results from a national survey of the American public, finds that many Americans believe that most climate scientists actually disagree about the subject.

In the national survey conducted in June 2010, two-thirds of respondents said they either believed there is a lot of disagreement among scientists about whether or not global warming is happening (45 percent), that most scientists think it is not happening (5 percent), or that they did not know enough to say (16 percent). These respondents were less likely to support climate change policies and more likely to view climate change as a low priority.

By contrast, survey respondents who correctly understood that there is widespread agreement about global warming among scientists were themselves more certain that it is happening, and were more supportive of climate policies.

“Misunderstanding the extent of scientific agreement about climate change is important because it undermines people’s certainty that climate change is happening, which in turn reduces their conviction that America should find ways to deal with the problem,” says Edward Maibach, director of the Center for Climate Change Communication at George Mason University.

Maibach argues that a campaign should be mounted to correct this misperception. “It is no accident that so many Americans misunderstand the widespread scientific agreement about human-caused climate change. A well-financed disinformation campaign deliberately created a myth about there being lack of agreement. The climate science community should take all reasonable measures to put this myth to rest.”

Large Gaps Found in Public Understanding of Climate Change (Science Daily)

ScienceDaily (Oct. 14, 2010) — Sixty-three percent of Americans believe that global warming is happening, but many do not understand why, according to a national study conducted by researchers at Yale University.

The report titled “Americans’ Knowledge of Climate Change” found that only 57 percent know what the greenhouse effect is, only 45 percent of Americans understand that carbon dioxide traps heat from the Earth’s surface, and just 50 percent understand that global warming is caused mostly by human activities. Large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans cause global warming. Meanwhile, 75 percent of Americans have never heard of the related problems of ocean acidification or coral bleaching.

However, many Americans do understand that emissions from cars and trucks and the burning of fossil fuels contribute to global warming and that a transition to renewable energy sources is an important solution.

Americans also recognize their own limited understanding. Only 1 in 10 say that they are “very well-informed” about climate change, and 75 percent say they would like to know more about the issue. Likewise, 75 percent say that schools should teach children about climate change and 68 percent would welcome a national program to teach Americans more about the issue.

“This study demonstrates that Americans need to learn more about the causes, impacts and potential solutions to global warming,” said study director Anthony Leiserowitz of Yale University. “But it also shows that Americans want to learn more about climate change in order to make up their minds and take action.”

The executive summary and full report are available online: http://environment.yale.edu/climate/publications/knowledge-of-climate-change

The online survey was conducted by Knowledge Networks from June 24 to July 22, 2010, with 2,030 American adults 18 and older. The margin of sampling error is plus or minus 2 percent, at 95 percent confidence.
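For readers curious how that margin follows from the sample size, here is a minimal sketch (not part of the article) of the standard worst-case calculation for a simple random sample:

```python
# 95% margin of sampling error for a proportion: z * sqrt(p*(1-p)/n),
# using the conservative worst case p = 0.5.
from math import sqrt

n = 2030   # respondents, per the article
z = 1.96   # z-score for ~95% confidence
p = 0.5    # worst-case proportion

moe = z * sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {100 * moe:.1f} percentage points")
```

With n = 2030 this gives about plus or minus 2.2 percentage points, consistent with the roughly 2 percent the article reports.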

Increased Knowledge About Global Warming Leads To Apathy, Study Shows (Science Daily)

ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.

“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”

The study showed high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.

The diminished concern and sense of responsibility flies in the face of awareness campaigns about climate change, such as in the movies An Inconvenient Truth and Ice Age: The Meltdown and in the mainstream media’s escalating emphasis on the trend.

The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.

Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.

“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.

But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.

“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”

Measuring knowledge about global warming is a tricky business, Kellstedt adds.

“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.

“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”

Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.

Despite Awareness Of Global Warming Americans Concerned More About Local Environment (Science Daily)

ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.

[Photo caption] Potomac River near Washington DC. The top three issues that the US public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. (Credit: Michele Hogan)

“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”

Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.

A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog.  In the survey, global warming ranks eighth in importance.

“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.

Additionally, Konisky and his colleagues found that the best predictor of individuals’ environmental preferences is their political attributes. They examined the relationship between party identification and political ideology and support for action to address environmental problems.

“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”

Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.

Support for Climate Change Action Drops, Poll Finds (Science Daily)

ScienceDaily (May 8, 2012) — Americans’ support for government action on global warming remains high but has dropped during the past two years, according to a new survey by Stanford researchers in collaboration with Ipsos Public Affairs. Political rhetoric and cooler-than-average weather appear to have influenced the shift, but economics doesn’t appear to have played a role.

The survey directed by Jon Krosnick, a senior fellow at the Stanford Woods Institute for the Environment, shows that support for a range of policies intended to reduce future climate change dropped by an average of 5 percentage points per year between 2010 and 2012.

In a 2010 Stanford survey, more than three-quarters of respondents expressed support for mandating more efficient and less polluting cars, appliances, homes, offices and power plants. Nearly 90 percent of respondents favored federal tax breaks to spur companies to produce more electricity from water, wind and solar energy. On average, 72 percent of respondents supported government action on climate change in 2010. By 2012, that support had dropped to 62 percent.

The drop was concentrated among Americans who distrust climate scientists, even more so among such people who identify themselves as Republicans. Americans who do not trust climate science were especially aware of and influenced by recent shifts in world temperature, and 2011 was tied for the coolest of the last 11 years.

Krosnick pointed out that during the recent campaign, all but one Republican presidential candidate expressed doubt about global warming, and some urged no government action to address the issue. Rick Santorum described belief in climate change as a “pseudo-religion,” while Ron Paul called it a “hoax.” Mitt Romney, the apparent Republican nominee, has said, “I can tell you the right course for America with regard to energy policy is to focus on job creation and not global warming.”

The Stanford-Ipsos study found no evidence that the decline in public support for government action was concentrated among respondents who lived in states struggling the most economically.

The study found that, overall, the majority of Americans continue to support many specific government actions to mitigate global warming’s effect. However, most Americans remain opposed to consumer taxes intended to decrease public use of electricity and gasoline.

Bruno Latour: Love Your Monsters (Breakthrough)

Breakthrough Journal, No. 2, Fall 2011


In the summer of 1816, a young British woman by the name of Mary Godwin and her boyfriend Percy Shelley went to visit Lord Byron in Lake Geneva, Switzerland. They had planned to spend much of the summer outdoors, but the eruption of Mount Tambora in Indonesia the previous year had changed the climate of Europe. The weather was so bad that they spent most of their time indoors, discussing the latest popular writings on science and the supernatural.

After reading a book of German ghost stories, somebody suggested they each write their own. Byron’s physician, John Polidori, came up with the idea for The Vampyre, published in 1819,1 which was the first of the “vampire-as-seducer” novels. Godwin’s story came to her in a dream, during which she saw “the pale student of unhallowed arts kneeling beside the thing he had put together.”2 Soon after that fateful summer, Godwin and Shelley married, and in 1818, Mary Shelley’s horror story was published under the title, Frankenstein, Or, the Modern Prometheus.3

Frankenstein lives on in the popular imagination as a cautionary tale against technology. We use the monster as an all-purpose modifier to denote technological crimes against nature. When we fear genetically modified foods we call them “frankenfoods” and “frankenfish.” It is telling that even as we warn against such hybrids, we confuse the monster with its creator. We now mostly refer to Dr. Frankenstein’s monster as Frankenstein. And just as we have forgotten that Frankenstein was the man, not the monster, we have also forgotten Frankenstein’s real sin.

Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”

Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.4

Let Dr. Frankenstein’s sin serve as a parable for political ecology. At a time when science, technology, and demography make clear that we can never separate ourselves from the nonhuman world — that we, our technologies, and nature can no more be disentangled than we can remember the distinction between Dr. Frankenstein and his monster — this is the moment chosen by millions of well-meaning souls to flagellate themselves for their earlier aspiration to dominion, to repent for their past hubris, to look for ways of diminishing the numbers of their fellow humans, and to swear to make their footprints invisible?

The goal of political ecology must not be to stop innovating, inventing, creating, and intervening. The real goal must be to have the same type of patience and commitment to our creations as God the Creator, Himself. And the comparison is not blasphemous: we have taken the whole of Creation on our shoulders and have become coextensive with the Earth.

What, then, should be the work of political ecology? It is, I believe, to modernize modernization, to borrow an expression proposed by Ulrich Beck.5

This challenge demands more of us than simply embracing technology and innovation. It requires exchanging the modernist notion of modernity for what I have called a “compositionist” one that sees the process of human development as neither liberation from Nature nor as a fall from it, but rather as a process of becoming ever-more attached to, and intimate with, a panoply of nonhuman natures.

1.
At the time of the plough we could only scratch the surface of the soil. Three centuries back, we could only dream, like Cyrano de Bergerac, of traveling to the moon. In the past, my Gallic ancestors were afraid of nothing except that the “sky will fall on their heads.”

Today we can fold ourselves into the molecular machinery of soil bacteria through our sciences and technologies. We run robots on Mars. We photograph and dream of further galaxies. And yet we fear that the climate could destroy us.

Every day in our newspapers we read about more entanglements of all those things that were once imagined to be separable — science, morality, religion, law, technology, finance, and politics. But these things are tangled up together everywhere: in the Intergovernmental Panel on Climate Change, in the space shuttle, and in the Fukushima nuclear power plant.

If you envision a future in which there will be less and less of these entanglements thanks to Science, capital S, you are a modernist. But if you brace yourself for a future in which there will always be more of these imbroglios, mixing many more heterogeneous actors, at a greater and greater scale and at an ever-tinier level of intimacy requiring even more detailed care, then you are… what? A compositionist!

The dominant, peculiar story of modernity is of humankind’s emancipation from Nature. Modernity is the thrusting-forward arrow of time — Progress — characterized by its juvenile enthusiasm, risk taking, frontier spirit, optimism, and indifference to the past. The spirit can be summarized in a single sentence: “Tomorrow, we will be able to separate more accurately what the world is really like from the subjective illusions we used to entertain about it.”

The very forward movement of the arrow of time and the frontier spirit associated with it (the modernizing front) is due to a certain conception of knowledge: “Tomorrow, we will be able to differentiate clearly what in the past was still mixed up, namely facts and values, thanks to Science.”

Science is the shibboleth that defines the right direction of the arrow of time because it, and only it, is able to cut into two well-separated parts what had, in the past, remained hopelessly confused: a morass of ideology, emotions, and values on the one hand, and, on the other, stark and naked matters of fact.

The notion of the past as an archaic and dangerous confusion arises directly from giving Science this role. A modernist, in this great narrative, is the one who expects from Science the revelation that Nature will finally be visible through the veils of subjectivity — and subjection — that hid it from our ancestors.

And here has been the great failure of political ecology. Just when all of the human and nonhuman associations are finally coming to the center of our consciousness, when science and nature and technology and politics become so confused and mixed up as to be impossible to untangle, just as these associations are beginning to be shaped in our political arenas and are triggering our most personal and deepest emotions, this is when a new apartheid is declared: leave Nature alone and let the humans retreat — as the English did on the beaches of Dunkirk in the 1940s.

Just at the moment when this fabulous dissonance inherent in the modernist project between what modernists say (emancipation from all attachments!) and what they do (create ever-more attachments!) is becoming apparent to all, along come those alleging to speak for Nature to say the problem lies in the violations and imbroglios — the attachments!

Instead of deciding that the great narrative of modernism (Emancipation) has always resulted in another history altogether (Attachments), the spirit of the age has interpreted the dissonance in quasi-apocalyptic terms: “We were wrong all along, let’s turn our back to progress, limit ourselves, and return to our narrow human confines, leaving the nonhumans alone in as pristine a Nature as possible, mea culpa, mea maxima culpa…”

Nature, this great shortcut of due political process, is now used to forbid humans to encroach. Instead of realizing at last that the emancipation narrative is bunk, and that modernism was always about attachments, modernist greens have suddenly shifted gears and have begun to oppose the promises of modernization.

Why do we feel so frightened at the moment that our dreams of modernization finally come true? Why do we suddenly turn pale and wish to fall back on the other side of Hercules’s columns, thinking we are being punished for having transgressed the sign: “Thou shalt not transgress”? Was not our slogan until now, as Nordhaus and Shellenberger note in Break Through, “We shall overcome!”?6

In the name of indisputable facts portraying a bleak future for the human race, green politics has succeeded in leaving citizens nothing but a gloomy asceticism, a terror of trespassing Nature, and a diffidence toward industry, innovation, technology, and science. No wonder that, while political ecology claims to embody the political power of the future, it is reduced everywhere to a tiny portion of electoral strap-hangers. Even in countries where political ecology is a little more powerful, it contributes only a supporting force.

Political ecology has remained marginal because it has not grasped either its own politics or its own ecology. It thinks it is speaking of Nature, System, a hierarchical totality, a world without man, an assured Science, but it is precisely these overly ordered pronouncements that marginalize it.

Set in contrast to the modernist narrative, this idea of political ecology could not possibly succeed. There is beauty and strength in the modernist story of emancipation. Its picture of the future is so attractive, especially when put against such a repellent past, that it makes one wish to run forward to break all the shackles of ancient existence.

To succeed, an ecological politics must manage to be at least as powerful as the modernizing story of emancipation without imagining that we are emancipating ourselves from Nature. What the emancipation narrative points to as proof of increasing human mastery over and freedom from Nature — agriculture, fossil energy, technology — can be redescribed as the increasing attachments between things and people at an ever-expanding scale. If the older narratives imagined humans either fell from Nature or freed themselves from it, the compositionist narrative describes our ever-increasing degree of intimacy with the new natures we are constantly creating. Only “out of Nature” may ecological politics start again and anew.

2.
The paradox of “the environment” is that it emerged in public parlance just when it was starting to disappear. During the heyday of modernism, no one seemed to care about “the environment” because there existed a huge unknown reserve on which to discharge all bad consequences of collective modernizing actions. The environment is what appeared when unwanted consequences came back to haunt the originators of such actions.

But if the originators are true modernists, they will see the return of “the environment” as incomprehensible since they believed they were finally free of it. The return of consequences, like global warming, is taken as a contradiction, or even as a monstrosity, which it is, of course, but only according to the modernist’s narrative of emancipation. In the compositionist’s narrative of attachments, unintended consequences are quite normal — indeed, the most expected things on earth!

Environmentalists, in the American sense of the word, never managed to extract themselves from the contradiction that the environment is precisely not “what lies beyond and should be left alone” — that, on the contrary, was the view of their worst enemies! The environment is exactly what should be even more managed, taken up, cared for, stewarded, in brief, integrated and internalized in the very fabric of the polity.

France, for its part, has never believed in the notion of a pristine Nature that has so confused the “defense of the environment” in other countries. What we call a “national park” is a rural ecosystem complete with post offices, well-tended roads, highly subsidized cows, and handsome villages.

Those who wish to protect natural ecosystems learn, to their stupefaction, that they have to work harder and harder — that is, to intervene even more, at always greater levels of detail, with ever more subtle care — to keep them “natural enough” for Nature-intoxicated tourists to remain happy.

Like France’s parks, all of Nature needs our constant care, our undivided attention, our costly instruments, our hundreds of thousands of scientists, our huge institutions, our careful funding. But though we have Nature, and we have nurture, we don’t know what it would mean for Nature itself to be nurtured.7

The word “environmentalism” thus designates this turning point in history when the unwanted consequences are suddenly considered to be such a monstrosity that the only logical step appears to be to abstain and repent: “We should not have committed so many crimes; now we should be good and limit ourselves.” Or at least this is what people felt and thought before the breakthrough, at the time when there was still an “environment.”

But what is the breakthrough itself then? If I am right, the breakthrough involves no longer seeing a contradiction between the spirit of emancipation and its catastrophic outcomes, but accepting it as the normal duty of continuing to care for unwanted consequences, even if this means going further and further down into the imbroglios. Environmentalists say: “From now on we should limit ourselves.” Postenvironmentalists exclaim: “From now on, we should stop flagellating ourselves and take up explicitly and seriously what we have been doing all along at an ever-increasing scale, namely, intervening, acting, wanting, caring.” For environmentalists, the return of unexpected consequences appears as a scandal (which it is for the modernist myth of mastery). For postenvironmentalists, unintended consequences are part and parcel of any action.

3.
One way to seize upon the breakthrough from environmentalism to postenvironmentalism is to reshape the very definition of the “precautionary principle.” This strange moral, legal, epistemological monster has appeared in European and especially French politics after many scandals due to the misplaced belief by state authority in the certainties provided by Science.8

When action is supposed to be nothing but the logical consequence of reason and facts (which the French, of all people, still believe), it is quite normal to wait for the certainty of science before administrators and politicians spring to action. The problem begins when experts fail to agree on the reasons and facts that have been taken as the necessary premises of any action. Then the machinery of decision is stuck until experts come to an agreement. It was in such a situation that the great tainted blood catastrophe of the 1980s ensued: before agreement was produced, hundreds of patients were transfused with blood contaminated by the AIDS virus.9

The precautionary principle was introduced to break this odd connection between scientific certainty and political action, stating that even in the absence of certainty, decisions could be made. But of course, as soon as it was introduced, fierce debates began on its meaning. Is it an environmentalist notion that precludes action or a postenvironmentalist notion that finally follows action through to its consequences?

Not surprisingly, the enemies of the precautionary principle — which President Chirac enshrined in the French Constitution as if the French, having indulged so much in rationalism, had to be protected against it by the highest legal pronouncements — took it as proof that no action was possible any more. As good modernists, they claimed that if you had to take so many precautions in advance, to anticipate so many risks, to include the unexpected consequences even before they arrived, and worse, to be responsible for them, then it was a plea for impotence, despondency, and despair. The only way to innovate, they claimed, is to bounce forward, blissfully ignorant of the consequences or at least unconcerned by what lies outside your range of action. Their opponents largely agreed. Modernist environmentalists argued that the principle of precaution dictated no action, no new technology, no intervention unless it could be proven with certainty that no harm would result. Modernists we were, modernists we shall be!

But for its postenvironmental supporters (of which I am one) the principle of precaution, properly understood, is exactly the change of zeitgeist needed: not a principle of abstention — as many have come to see it — but a change in the way any action is considered, a deep tidal change in the linkage modernism established between science and politics. From now on, thanks to this principle, unexpected consequences are attached to their initiators and have to be followed through all the way.

4.
The link between technology and theology hinges on the notion of mastery. Descartes exclaimed that we should be “maîtres et possesseurs de la nature.”10
But what does it mean to be a master? In the modernist narrative, mastery was supposed to require such total dominance by the master that he was emancipated entirely from any care and worry. This is the myth about mastery that was used to describe the technical, scientific, and economic dominion of Man over Nature.

But if you think about it according to the compositionist narrative, this myth is quite odd: where have we ever seen a master freed from any dependence on his dependents? The Christian God, at least, is not a master who is freed from dependents, but who, on the contrary, gets folded into, involved with, implicated with, and incarnated into His Creation. God is so attached and dependent upon His Creation that he is continually forced (convinced? willing?) to save it. Once again, the sin is not to wish to have dominion over Nature, but to believe that this dominion means emancipation and not attachment.

If God has not abandoned His Creation and has sent His Son to redeem it, why do you, a human, a creature, believe that you can invent, innovate, and proliferate — and then flee away in horror from what you have committed? Oh, you hypocrite, who confess one sin to hide a much graver, mortal one! Has God fled in horror after what humans made of His Creation? Then have at least the same forbearance that He has.

The dream of emancipation has not turned into a nightmare. It was simply too limited: it excluded nonhumans. It did not care about unexpected consequences; it was unable to follow through with its responsibilities; it entertained a wholly unrealistic notion of what science and technology had to offer; it relied on a rather impious definition of God, and a totally absurd notion of what creation, innovation, and mastery could provide.

Which God and which Creation should we be for, knowing that, contrary to Dr. Frankenstein, we cannot suddenly stop being involved and “go home?” Incarnated we are, incarnated we will be. In spite of a centuries-old misdirected metaphor, we should, without any blasphemy, reverse the Scripture and exclaim: “What good is it for a man to gain his soul yet forfeit the whole world?”

1. Polidori, John, et al. 1819. The Vampyre: A Tale. Printed for Sherwood, Neely, and Jones.

2. Shelley, Mary W. 1823. Frankenstein: Or, The Modern Prometheus. Printed for G. and W.B. Whittaker.

3. Ibid.

4. This is also the theme of: Latour, Bruno. 1996. Aramis or the Love of Technology. Translated by Catherine Porter. Cambridge, Mass: Harvard University Press.

5. Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.

6. Nordhaus, Ted, and Michael Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston: Houghton Mifflin Harcourt.

7. Descola, Philippe. 2005. Par-delà nature et culture. Paris: Gallimard.

8. Sadeleer, Nicolas de. 2006. Implementing the Precautionary Principle: Approaches from Nordic Countries and the EU. Earthscan Publ. Ltd.

9. Hermitte, Marie-Angèle. 1996. Le Sang et le droit: Essai sur la transfusion sanguine. Paris: Le Seuil.

10. Descartes, René. 1637. Discourse on Method, in Discourse on Method and Related Writings. Translated by Desmond M. Clarke, 1999. Part 6, 44. New York: Penguin.

The U.S. Has Fallen Behind in Numerical Weather Prediction: Part I

March 28, 2012 – 05:00 AM
By Dr. Cliff Mass (Twitter @CliffMass)

It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.

What am I talking about? The third-rate status of numerical weather prediction in the U.S. It is a huge story, an important story, but one the media has not touched, probably from lack of familiarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.

Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.

Supercomputers are used for numerical weather prediction.

U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Center for Medium Range Weather Forecasting (ECMWF). And we have also fallen behind in ensembles (using many models to give probabilistic prediction) and high-resolution operational forecasting. We used to be the world leader decades ago in numerical weather prediction: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong and I will talk about some of the issues here. And our nation needs to fix it.

But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.

In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).

The Europeans banded together decades ago to form the European Center for Medium-Range Weather Forecasting (ECMWF), which runs a very good global model. Several European countries run regional models as well.

The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).

There are other major global NWP centers such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, one in Beijing, among others. All of these centers collect worldwide data and do global NWP.

The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or number 4 in quality, resulting in our forecasts being noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. This figure shows the quality of the 500 hPa forecast (about halfway up in the troposphere, approximately 18,000 ft) at day 5. The top graph is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (U.S. GFS: black; ECMWF: red; Canadian CMC: blue; UKMET: green; Navy FNG: orange). The bottom graph shows the difference between the skill of the U.S. model and that of the other nations’ models.

You first notice that forecasts are all getting better. That’s good. But you will notice that the most skillful forecast (closest to one) is clearly the red one…the European Center. The second best is the UKMET office. The U.S. (GFS model) is third…roughly tied with the Canadians.

Here is a global model comparison done by the Canadian Meteorological Center for various global models from 2009 to 2012 for the 120 h forecast. This is a plot of error (RMSE, root mean square error), again for 500 hPa, and only for North America. Guess who is best again (lowest error)? The European Center (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
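For readers unfamiliar with the metric, RMSE is simple to compute. Here is a minimal sketch in Python (the forecast and analysis numbers are invented for illustration):

# Minimal sketch of RMSE (root mean square error), the error measure in
# the plot above. The numbers below are invented 500 hPa heights (meters);
# the "analysis" is the best estimate of what actually happened.
import math

def rmse(forecast, analysis):
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, analysis))
                     / len(forecast))

print(rmse([5700, 5650, 5800], [5710, 5640, 5820]))  # about 14.1 m

The lower the RMSE, the closer the forecast was to what actually occurred.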

Let’s look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie, and myself comparing various models at 24, 48, and 72 hr for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it: ECMWF.

I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at the short range but also has about one day of additional predictability: their 8-day forecast is about as skillful as our 7-day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.

Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussion, which is available online, you will frequently see that forecasters depend not on the U.S. model but on the ECMWF. And during the January western WA snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate-related firm that was moving up to Seattle. I asked him what model his company was using: the U.S. GFS? He laughed. Of course not; they were using the ECMWF.

A lot of U.S. firms are using the ECMWF, and this is very costly, because the Europeans charge a lot for access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and the NWS uses its global predictions to drive the higher-resolution regional models, which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the costs for the nation have to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.

The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:

1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because they have more resources available for NWP. This is simply ridiculous: the U.S. can afford the processors and disk space it would take. We are talking about millions or tens of millions of dollars at most to have the hardware we need. Part of the problem has been NWS procurement, which is not forward-leaning, relying on heavy-metal IBM machines at very high cost.

2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate observations to create a good description of the atmosphere. The European Center, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power. We have used an older, inferior approach (3DVAR). The Europeans have been using 4DVAR for 20 years! Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet.

3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP’s Environmental Modeling Center (EMC) is well known for its isolation and “not invented here” attitude. While the European Center hosts lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited, and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest.) The National Weather Service has invested very little in extramural research, and when its budget is under pressure, university research is the first thing it reduces. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for its needs as well. (Good news: a new building should be available soon.)

4. The NWS approach to weather-related research has been ineffective and divided. Government weather research is NOT in the NWS, but rather elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the people doing research in support of their mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has marginal benefit for the NWS.

5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, providing lots of excuses, but not making the fundamental changes in organization and priority that could deal with the problem. Lack of resources for NWP is another issue…but that is a decision made by NOAA/NWS/Dept of Commerce leadership.

This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.

I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.

Let me note that the above is about the modeling aspects of the NWS, NOT the many people in the local forecast offices. That part of the NWS is first-rate. They suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP who have lacked the resources and the suitable organization necessary to push forward effectively.

This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, having gained world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the impetus to be the best. We were satisfied to coast. And this attitude must end…in weather prediction, education, and everything else…or we will see our nation sink into mediocrity.

The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.

*  *  *

Saturday, April 7, 2012

Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)

In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers, such as the European Centre for Medium Range Weather Forecasting (ECMWF), the UKMET office, and the Canadian Meteorological Center (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and to the use of ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research, and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we are not. The cost of inferior weather prediction is huge: in lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.

In this blog, I will describe in some detail one major roadblock in giving the U.S. state-of-the-art weather prediction:  inadequate computer resources.   This situation should clearly have been addressed years ago by leadership in the National Weather Service, NOAA, and the Dept of Commerce, but has not, and I am convinced will not without outside pressure.  It is time for the user community and our congressional representatives to intervene.  To quote Samuel L. Jackson, enough is enough. (…)

In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction’s (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high resolution and ensemble forecasts, and seasonal climate forecasts.  UKMET office also does regional NWP (England is not a big country!) and regional air quality.  NCEP does all of this plus much, much more (high resolution rapid update modeling, hurricane modeling, etc.).   And NCEP has to deal with prediction over a continental-size country.

You might expect the U.S. to have a lot more computer power to balance all these responsibilities and tasks; you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4,992 IBM Power6 processors. One computer does the operational work; the other is for backup (research and testing runs are done on the backup). That is about 70 teraflops (trillion floating-point operations per second) for each machine.

NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8,192 much faster processors that reaches 182 teraflops (yes, over twice as fast, and with far fewer tasks to do).

The UKMET office, serving a far, far smaller country, has two newer IBM machines, each with 7680 processors for 175 teraflops per machine.

Here is a figure, produced at NCEP, that compares the relative computer power of NCEP’s machine with the European Centre’s. The shading indicates computational activity, and the x-axis for each represents a 24-h period. The relative heights allow you to compare computer resources. Not only does the ECMWF have much more computer power, but they are also more efficient in using it, packing useful computations into every available minute.

Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system. You may not believe this, but the specifications were ONLY for a system at least equal to the one they have. A report in a computer magazine suggests that perhaps this new system (IBM got the contract) might be slightly less powerful (around 150 teraflops) than one of the UKMET office systems, but that is not known at this point.

The Canadians?  They have TWO machines like the European Centre’s!

So what kind of system does NCEP require to serve the nation in a reasonable way?

To start, we need to double the resolution of our global model to bring it into line with ECMWF (they are now at 15 km globally). Such resolution allows the global model to resolve regional features (such as our mountains). Doubling horizontal resolution requires 8 times more computer power. We need to use better physics (descriptions of things like cloud processes and radiation). Double again. And we need better data assimilation (better use of observations to provide an improved starting point for the model). Double once more. So we need 32 times more computer power for the high-resolution global runs to allow us to catch up with ECMWF. Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in; that is what ECMWF does). There are some potential ways NCEP can work more efficiently as well. Right now NCEP runs our global model out to 384 hours four times a day (every six hours). To many of us this seems excessive; perhaps the longest periods (180 hr plus) could be done twice a day. So let’s begin with a computer 32 times faster than the current one.
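If you want to check that multiplication yourself, here is a minimal sketch of the same back-of-the-envelope arithmetic (the function names and the clean factor-of-two assumptions are mine, taken from the reasoning above; this is an illustration, not a procurement calculation):

# Back-of-the-envelope cost scaling for the upgrades described above.
def resolution_cost(doublings=1):
    # Halving the grid spacing costs 2x east-west, 2x north-south,
    # and roughly 2x from the shorter time step: 8x per doubling.
    return 8 ** doublings

def total_cost(doublings=1, physics_factor=2, assimilation_factor=2):
    # Better physics and better data assimilation are each assumed,
    # as in the text above, to cost roughly another factor of two.
    return resolution_cost(doublings) * physics_factor * assimilation_factor

print(total_cost())  # 8 * 2 * 2 = 32: the "32 times faster" figure above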

Many workshops and meteorological meetings (such as one on improvements in model physics that was held at NCEP last summer, which I chaired) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution. The current national ensemble system has a horizontal resolution of about 32 km, and the NWS plans to get to about 20 km in a few years; both are inadequate. Here is an example of the ensemble output (mean of the ensemble members) for the NWS and UW (4-km) ensemble systems: the difference is huge; the NWS system does not even get close to modeling the impacts of the mountains. It is similarly unable to simulate large convective systems.

Current NWS (NCEP) “high resolution” ensembles (32 km)
4 km ensemble mean from UW system
Let me make one thing clear. Probabilistic prediction based on ensemble forecasts and reforecasting (running models back for years to get statistics of performance) is the future of weather prediction. The days of giving a single number for, say, temperature at day 5 are over. We need to let people know about uncertainty and probabilities. The NWS needs a massive increase in computer power to do this. It lacks this computer power now and does not seem destined to get it soon.
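To make “probabilities instead of a single number” concrete, here is a toy sketch (the member values are invented day-5 temperatures for one location; a real ensemble has its own members and calibration):

# Toy illustration of probabilistic forecasting from an ensemble of
# model runs; each member is an invented day-5 temperature (deg F).
members = [41, 43, 38, 45, 44, 39, 42, 46, 40, 44]

mean = sum(members) / len(members)
p_freeze = sum(m <= 32 for m in members) / len(members)
p_above_45 = sum(m > 45 for m in members) / len(members)

print(f"ensemble mean: {mean:.1f} F")       # the old single number
print(f"P(T <= 32 F) = {p_freeze:.0%}")     # probability of freezing
print(f"P(T > 45 F)  = {p_above_45:.0%}")   # probability of a mild day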

A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing.   He and colleagues have put together a compelling case for more NWS computer resources for NWP.  Read it here.

Back-of-the-envelope calculations indicate that a good first step, 4-km national ensembles, would require about 20,000 processors to run in a timely manner, but it would revolutionize weather prediction in the U.S., including forecasting convection and in mountainous areas. This high-resolution ensemble effort would meld with data assimilation over the long term.

And then there is running super-high-resolution numerical weather prediction to get fine-scale details right. Here in the NW my group runs a 1.3-km horizontal resolution forecast twice a day out to 48 h. Such capability is needed for the entire country. It does not exist now due to inadequate computer resources.

The bottom line is that the NWS numerical modeling effort needs a huge increase in computer power to serve the needs of the country, and the potential impacts would be transformative. We could go from having a third-place effort, which is slipping back into the pack, to being a world leader. Furthermore, the added computer power will finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems. The fact that this is barely done today is really amazing and a potential waste of hundreds of millions of dollars on unnecessary satellite systems.

But to do so will require a major jump in computational power, a jump our nation can easily afford. I would suggest that the NWS’s EMC should begin by securing at least a 100,000-processor machine, and down the road something considerably larger. Keep in mind that my department has about 1,000 processors in our computational clusters, so this is not as large a jump as you might think.

For a country with several billion-dollar weather disasters a year, investment in reasonable computer resources for NWP is obvious.
The cost? Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote: using fast AMD chips, you could have such a 100K-core machine for 11 million dollars. (This is without any discount!) OK, this is the U.S. government and they like expensive, heavy-metal machines, so let’s go for 25 million dollars. The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors, and the cost will be around 25-35 million dollars. NCEP will want two machines, so let’s budget 60 million dollars. We spend this much money on a single jet fighter, but we can’t invest this amount to greatly improve forecasts and public safety in the U.S.? We have machines far larger than this for breaking codes, doing simulations of thermonuclear explosions, and simulating climate change.

Yes, this is a lot of money, but I suspect the cost of the machine would be paid back within a few months through improved forecasts. Last year we had quite a few (over ten) billion-dollar storms; imagine the benefits of forecasting even a few of them better. Or the benefits to the wind energy and utility industries, or to U.S. aviation, of even modestly improved forecasts. And there is no doubt such computer resources would improve weather prediction. The list of benefits is nearly endless. Recent estimates suggest that normal weather events cost the U.S. economy nearly half a trillion dollars a year. Add to that hurricanes, tornadoes, floods, and other extreme weather. The business case is there.

As someone with an insider’s view of the process, it is clear to me that the current players are not going to move effectively without some external pressure. In fact, the budgetary pressure on the NWS is very intense right now, and they are cutting away muscle and bone at this point (like reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research). I believe it is time for weather-sensitive industries and local government, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon. We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction, which can clearly save lives and help the economy. Enough is enough.

Posted by Cliff Mass Weather Blog at 8:38 PM

Best Practices Are the Worst (Education Next)

SUMMER 2012 / VOL. 12, NO. 3 – http://educationnext.org/

As reviewed by Jay P. Greene

“Best practices” is the worst practice. The idea that we should examine successful organizations and then imitate what they do if we also want to be successful is something that first took hold in the business world but has now unfortunately spread to the field of education. If imitation were the path to excellence, art museums would be filled with paint-by-number works.

The fundamental flaw of a “best practices” approach, as any student in a half-decent research-design course would know, is that it suffers from what is called “selection on the dependent variable.” If you only look at successful organizations, then you have no variation in the dependent variable: they all have good outcomes. When you look at the things that successful organizations are doing, you have no idea whether each one of those things caused the good outcomes, had no effect on success, or was actually an impediment that held organizations back from being even more successful. An appropriate research design would have variation in the dependent variable; some have good outcomes and some have bad ones. To identify factors that contribute to good outcomes, you would, at a minimum, want to see those factors more likely to be present where there was success and less so where there was not.
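The point is easy to demonstrate with a toy simulation. In the sketch below (entirely hypothetical data, with variable names of my own choosing), success is pure luck and a “best practice” has no causal effect at all, yet it still shows up in most of the top firms, simply because most firms use it:

# "Selection on the dependent variable" in miniature: a practice with
# NO effect on success still looks like a winners' secret if you study
# only the winners.
import random

random.seed(1)
N = 10_000
BASE_RATE = 0.6  # 60% of all firms happen to use the "best practice"

firms = [{"practice": random.random() < BASE_RATE,
          "success": random.gauss(0, 1)}   # success is pure luck here
         for _ in range(N)]

winners = sorted(firms, key=lambda f: f["success"], reverse=True)[:100]

share_winners = sum(f["practice"] for f in winners) / len(winners)
share_all = sum(f["practice"] for f in firms) / N

print(f"practice among the top 100 firms: {share_winners:.0%}")  # ~60%
print(f"practice among all firms:         {share_all:.0%}")      # ~60%

A guru interviewing only the top 100 firms would find that a solid majority use the practice and conclude that it “works”; only the comparison with the full population, that is, variation in the dependent variable, reveals that it is irrelevant.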

“Best practices” lacks scientific credibility, but it has been a proven path to fame and fortune for pop-management gurus like Tom Peters, with In Search of Excellence, and Jim Collins, with Good to Great. The fact that many of the “best” companies they featured subsequently went belly-up—like Atari and Wang Computers, lauded by Peters, and Circuit City and Fannie Mae, by Collins—has done nothing to impede their high-fee lecture tours. Sometimes people just want to hear a confident person with shiny teeth tell them appealing stories about the secrets to success.

With Surpassing Shanghai, Marc Tucker hopes to join the ranks of the “best practices” gurus. He, along with a few of his colleagues at the National Center on Education and the Economy, has examined the education systems in some other countries with successful outcomes so that the U.S. can become similarly successful. Tucker coauthors the chapter on Japan, as well as an introductory and two concluding chapters. Tucker’s collaborators write chapters featuring Shanghai, Finland, Singapore, and Canada. Their approach to greatness in American education, as Linda Darling-Hammond phrases it in the foreword, is to ensure that “our strategies must emulate the best of what has been accomplished in public education both from here and abroad.”

But how do we know what those best practices are? The chapters on high-achieving countries describe some of what those countries are doing, but the characteristics they feature may have nothing to do with success or may even be a hindrance to greater success. Since the authors must pick and choose what characteristics they highlight, it is also quite possible that countries have successful education systems because of factors not mentioned at all. Since there is no scientific method to identifying the critical features of success in the best-practices approach, we simply have to trust the authority of the authors that they have correctly identified the relevant factors and have properly perceived the causal relationships.

But Surpassing Shanghai is even worse than the typical best-practices work, because Tucker’s concluding chapters, in which he summarizes the common best practices and draws policy recommendations, have almost no connection to the preceding chapters on each country. That is, the case studies of Shanghai, Finland, Japan, Singapore, and Canada attempt to identify the secrets to success in each country, a dubious-enough enterprise, and then Tucker promptly ignores all of the other chapters when making his general recommendations.

Tucker does claim to be drawing on the insights of his coauthors, but he never actually references the other chapters in detail. He never names his coauthors or specifically draws on them for his conclusions. In fact, much of what Tucker claims as common lessons of what his coauthors have observed from successful countries is contradicted in chapters that appear earlier in the book. And some of the common lessons they do identify, Tucker chooses to ignore.

For example, every country case study in Surpassing Shanghai, with the exception of the one on Japan coauthored by Marc Tucker, emphasizes the importance of decentralization in producing success. In Shanghai the local school system “received permission to create its own higher education entrance examination. This heralded a trend of exam decentralization, which was key to localized curricula.” The chapter on Finland describes the importance of the decision “to devolve increasing levels of authority and responsibility for education from the Ministry of Education to municipalities and schools…. [T]here were no central initiatives that the government was trying to push through the system.” Singapore is similarly described: “Moving away from the centralized top-down system of control, schools were organized into geographic clusters and given more autonomy…. It was felt that no single accountability model could fit all schools. Each school therefore set its own goals and annually assesses its progress toward meeting them…” And the chapter on Canada teaches us that “the most striking feature of the Canadian system is its decentralization.”

Tucker makes no mention of this common decentralization theme in his conclusions and recommendations. Instead, he claims the opposite as the common lesson of successful countries: “students must all meet a common basic education standard aligned to a national or provincial curriculum… Further, in these countries, the materials prepared by textbook publishers and the publishers of supplementary materials are aligned with the national curriculum framework.” And “every high-performing country…has a unit of government that is clearly in charge of elementary and secondary education…In such countries, the ministry has an obligation to concern itself with the design of the system as a whole…”

Conversely, Tucker emphasizes that “the dominant elements of the American education reform agenda” are noticeably absent from high-performing countries, including “the use of market mechanisms, such as charter schools and vouchers….” But if Tucker had read the chapter on Shanghai, he would have found a description of a system by which “students choose schools in other neighborhoods by paying a sponsorship fee. It is the Chinese version of school choice, a hot issue in the United States.” And although the chapter on Canada fails to make any mention of it, Canada has an extensive system of school choice, offering options that vary by language and religious denomination. According to recently published research by David Card, Martin Dooley, and Abigail Payne, competition among these options is a significant contributor to academic achievement in Canada.

There is a reason that promoters of best-practices approaches are called “gurus.” Their expertise must be derived from a mystical sphere, because it cannot be based on a scientific appraisal of the evidence. Marc Tucker makes no apology for his nonscientific approach. In fact, he denounces “the clinical research model used in medical research” when assessing education policies. The problem, he explains, is that no country would consent to “randomly assigning entire national populations to the education systems of another country or to certain features of the education system of another country.” On the contrary, countries, states, and localities can and do randomly assign “certain features of the education system,” and we have learned quite a lot from that scientific process. In the international arena, Tucker may want to familiarize himself with the excellent work being done by Michael Kremer and Karthik Muralidharan utilizing random assignment around the globe.

In addition, social scientists have developed practices to observe and control for differences in the absence of random assignment that have allowed extensive and productive analyses of the effectiveness of educational practices in different countries. In particular, the recent work of Ludger Woessmann, Martin West, and Eric Hanushek has utilized the PISA and TIMSS international test results that Tucker finds so valuable, but they have done so with the scientific methods that Tucker rejects. Even well-constructed case study research, like that done by Charles Glenn, can draw useful lessons across countries. The problem with the best-practices approach is not entirely that it depends on case studies, but that by avoiding variation in the dependent variable it prevents any scientific identification of causation.

Tucker’s hostility to scientific approaches is more understandable, given that his graduate training was in theater rather than a social science. Perhaps that is also why Tucker’s book reminds me so much of The Music Man. Tucker is like “Professor” Harold Hill come to town to sell us a bill of goods. His expertise is self-appointed, and his method, the equivalent of “the think system,” is obvious quackery. And the Gates Foundation, which has for some reason backed Tucker and his organization with millions of dollars, must be playing the residents of River City, because they have bought this pitch and are pouring their savings into a band that can never play music except in a fantasy finale.

Best practices really are the worst.

Jay P. Greene is professor of education reform at the University of Arkansas and a fellow at the George W. Bush Institute.

Surpassing Shanghai: An Agenda for American Education Built on the World’s Leading Systems
Edited by Marc Tucker
Harvard Education Press, 2011, $49.99; 288 pages.

UK aid helps to fund forced sterilisation of India’s poor (The Guardian)

Money from the Department for International Development has helped pay for a controversial programme that has led to miscarriages and even deaths after botched operations

Gethin Chamberlain
The Observer, Sunday 15 April 2012

Sterilisation remains the most common method of family planning in India’s bid to curb its burgeoning population of 1.2 billion. Photograph: Mustafa Quraishi/AP

Tens of millions of pounds of UK aid money have been spent on a programme that has forcibly sterilised Indian women and men, the Observer has learned. Many have died as a result of botched operations, while others have been left bleeding and in agony. A number of pregnant women selected for sterilisation suffered miscarriages and lost their babies.

The UK agreed to give India £166m to fund the programme, despite allegations that the money would be used to sterilise the poor in an attempt to curb the country’s burgeoning population of 1.2 billion people.

Sterilisation has been mired in controversy for years. With officials and doctors paid a bonus for every operation, poor and little-educated men and women in rural areas are routinely rounded up and sterilised without having a chance to object. Activists say some are told they are going to health camps for operations that will improve their general wellbeing and only discover the truth after going under the knife.

Court documents filed in India earlier this month claim that many victims have been left in pain, with little or no aftercare. Across the country, there have been numerous reports of deaths and of pregnant women suffering miscarriages after being selected for sterilisation without being warned that they would lose their unborn babies.

Yet a working paper published by the UK’s Department for International Development in 2010 cited the need to fight climate change as one of the key reasons for pressing ahead with such programmes. The document argued that reducing population numbers would cut greenhouse gases, although it warned that there were “complex human rights and ethical issues” involved in forced population control.

The latest allegations centre on the states of Madhya Pradesh and Bihar, both targeted by the UK government for aid after a review of funding last year. In February, the chief minister of Madhya Pradesh had to publicly warn off his officials after widespread reports of forced sterilisation. A few days later, 35-year-old Rekha Wasnik bled to death in the state after doctors sterilised her. The wife of a poor labourer, she was pregnant with twins at the time. She began bleeding on the operating table and a postmortem cited the operation as the cause of death.

Earlier this month, India’s supreme court heard how a surgeon operating in a school building in the Araria district of Bihar in January carried out 53 operations in two hours, assisted by unqualified staff, with no access to running water or equipment to clean the operating equipment. A video shot by activists shows filthy conditions and women lying on the straw-covered ground.

Human rights campaigner Devika Biswas told the court that “inhuman sterilisations, particularly in rural areas, continue with reckless disregard for the lives of poor women”. Biswas said 53 poor and low-caste women were rounded up and sterilised in operations carried out by torchlight that left three bleeding profusely and led to one woman who was three months pregnant miscarrying. “After the surgeries, all 53 women were crying out in pain. Though they were in desperate need of medical care, no one came to assist them,” she said.

The court gave the national and state governments two months to respond to the allegations.

Activists say that it is India’s poor – and particularly tribal people – who are most frequently targeted and who are most vulnerable to pressure to be sterilised. They claim that people have been threatened with losing their ration cards if they do not undergo operations, or bribed with as little as 600 rupees (£7.34) and a sari. Some states run lotteries in which people can win cars and fridges if they agree to be sterilised.

Despite the controversy, an Indian government report shows that sterilisation remains the most common method of family planning used in its Reproductive and Child Health Programme Phase II, launched in 2005 with £166m of UK funding. According to the DfID, the UK is committed to the project until next year and has spent £34m in 2011-12. Most of the money – £162m – has been paid out, but no special conditions have been placed on the funding.

Funding varies from state to state, but in Bihar private clinics receive 1,500 rupees for every sterilisation, with a bonus of 500 rupees a patient if they carry out more than 30 operations on a particular day. NGO workers who convince people to have the operations receive 150 rupees a person, while doctors get 75 rupees for each patient.

A 2009 Indian government report said that nearly half a million sterilisations had been carried out the previous year but warned of problems with quality control and financial management.

In 2006, India’s ministry of health and family welfare published a report into sterilisation, which warned of growing concerns, and the following year an Indian government audit of the programme warned of continuing problems with sterilisation camps. “Quality of sterilisation services in the camps is a matter of concern,” it said. It also said the quality of services was affected because much of the work was crammed into the final part of the financial year.

When it announced changes to aid for India last year, the DfID promised to improve the lives of more than 10 million poor women and girls. It said: “We condemn forced sterilisation and have taken steps to ensure that not a penny of UK aid could support it. The UK does not fund sterilisation centres anywhere.

“The coalition government has completely changed the way that aid is spent in India to focus on three of the poorest states, and our support for this programme is about to end as part of that change. Giving women access to family planning, no matter where they live or how poor they are, is a fundamental tenet of the coalition’s international development policy.”

A Sharp Rise in Retractions Prompts Calls for Reform (N.Y. Times)

PLEA Dr. Ferric Fang argues that science has changed in worrying ways. Matthew Ryan Williams for The New York Times
By CARL ZIMMER – Published: April 16, 2012

In the fall of 2010, Dr. Ferric C. Fang made an unsettling discovery. Dr. Fang, who is editor in chief of the journal Infection and Immunity, found that one of his authors had doctored several papers.

It was a new experience for him. “Prior to that time,” he said in an interview, “Infection and Immunity had only retracted nine articles over a 40-year period.”

The journal wound up retracting six of the papers from the author, Naoki Mori of the University of the Ryukyus in Japan. And it soon became clear that Infection and Immunity was hardly the only victim of Dr. Mori’s misconduct. Since then, other scientific journals have retracted two dozen of his papers, according to the watchdog blog Retraction Watch.

“Nobody had noticed the whole thing was rotten,” said Dr. Fang, who is a professor at the University of Washington School of Medicine.

Dr. Fang became curious how far the rot extended. To find out, he teamed up with a fellow editor at the journal, Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York. And before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.

Dr. Casadevall, now editor in chief of the journal mBio, said he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct.

“This is a tremendous threat,” he said.

WATCHDOG  Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York teamed up with Dr. Ferric C. Fang to study a raft of retractions. Ángel Franco/The New York Times

Last month, in a pair of editorials in Infection and Immunity, the two editors issued a plea for fundamental reforms. They also presented their concerns at the March 27 meeting of the National Academies of Sciences committee on science, technology and the law.

Members of the committee agreed with their assessment. “I think this is really coming to a head,” said Dr. Roberta B. Ness, dean of the University of Texas School of Public Health. And Dr. David Korn of Harvard Medical School agreed that “there are problems all through the system.”

No one claims that science was ever free of misconduct or bad research. Indeed, the scientific method itself is intended to overcome mistakes and misdeeds. When scientists make a new discovery, others review the research skeptically before it is published. And once it is, the scientific community can try to replicate the results to see if they hold up.


But critics like Dr. Fang and Dr. Casadevall argue that science has changed in some worrying ways in recent decades — especially biomedical research, which consumes a larger and larger share of government science spending.

In October 2011, for example, the journal Nature reported that published retractions had increased tenfold over the past decade, while the number of published papers had increased by just 44 percent. In 2010 The Journal of Medical Ethics published a study finding that the recent raft of retractions was a mix of misconduct and honest scientific mistakes.

Several factors are at play here, scientists say. One may be that because journals are now online, bad papers are simply reaching a wider audience, making it more likely that errors will be spotted. “You can sit at your laptop and pull a lot of different papers together,” Dr. Fang said.

But other forces are more pernicious. To survive professionally, scientists feel the need to publish as many papers as possible, and to get them into high-profile journals. And sometimes they cut corners or even commit misconduct to get there.

To test this claim, Dr. Fang and Dr. Casadevall looked at the rate of retractions in 17 journals from 2001 to 2010 and compared it with the journals’ “impact factor,” a score based on how often their papers are cited by scientists. The higher a journal’s impact factor, the two editors found, the higher its retraction rate.
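The shape of that analysis is simple to sketch. The journal names and numbers below are invented for illustration (they are not the Fang and Casadevall data), but the computation, pairing each journal’s impact factor with its retraction rate and measuring the association, is the same idea:

# Hypothetical sketch of the comparison described above. Each journal
# gets an (impact factor, retractions per 10,000 papers) pair; the
# values are invented, NOT the actual study data.
journals = {
    "Journal A": (2.0, 0.3),
    "Journal B": (5.5, 0.9),
    "Journal C": (10.1, 1.8),
    "Journal D": (30.4, 4.1),
}

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

impact, rate = zip(*journals.values())
print(f"correlation(impact factor, retraction rate) = {pearson(impact, rate):.2f}")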

The highest “retraction index” in the study went to one of the world’s leading medical journals, The New England Journal of Medicine. In a statement for this article, it questioned the study’s methodology, noting that it considered only papers with abstracts, which are included in a small fraction of studies published in each issue. “Because our denominator was low, the index was high,” the statement said.

Monica M. Bradford, executive editor of the journal Science, suggested that the extra attention high-impact journals get might be part of the reason for their higher rate of retraction. “Papers making the most dramatic advances will be subject to the most scrutiny,” she said.

Dr. Fang says that may well be true, but adds that it cuts both ways — that the scramble to publish in high-impact journals may be leading to more and more errors. Each year, every laboratory produces a new crop of Ph.D.’s, who must compete for a small number of jobs, and the competition is getting fiercer. In 1973, more than half of biologists had a tenure-track job within six years of getting a Ph.D. By 2006 the figure was down to 15 percent.

Yet labs continue to have an incentive to take on lots of graduate students to produce more research. “I refer to it as a pyramid scheme,” said Paula Stephan, a Georgia State University economist and author of “How Economics Shapes Science,” published in January by Harvard University Press.

In such an environment, a high-profile paper can mean the difference between a career in science and leaving the field. “It’s becoming the price of admission,” Dr. Fang said.

The scramble isn’t over once young scientists get a job. “Everyone feels nervous even when they’re successful,” he continued. “They ask, ‘Will this be the beginning of the decline?’ ”

University laboratories count on a steady stream of grants from the government and other sources. The National Institutes of Health accepts a much lower percentage of grant applications today than in earlier decades. At the same time, many universities expect scientists to draw an increasing part of their salaries from grants, and these pressures have influenced how scientists are promoted.

“What people do is they count papers, and they look at the prestige of the journal in which the research is published, and they see how many grant dollars scientists have, and if they don’t have funding, they don’t get promoted,” Dr. Fang said. “It’s not about the quality of the research.”

Dr. Ness likens scientists today to small-business owners, rather than people trying to satisfy their curiosity about how the world works. “You’re marketing and selling to other scientists,” she said. “To the degree you can market and sell your products better, you’re creating the revenue stream to fund your enterprise.”

Universities want to attract successful scientists, and so they have erected a glut of science buildings, Dr. Stephan said. Some universities have gone into debt, betting that the flow of grant money will eventually pay off the loans. “It’s really going to bite them,” she said.

With all this pressure on scientists, they may lack the extra time to check their own research — to figure out why some of their data doesn’t fit their hypothesis, for example. Instead, they have to be concerned about publishing papers before someone else publishes the same results.

“You can’t afford to fail, to have your hypothesis disproven,” Dr. Fang said. “It’s a small minority of scientists who engage in frank misconduct. It’s a much more insidious thing that you feel compelled to put the best face on everything.”

Adding to the pressure, thousands of new Ph.D. scientists are coming out of countries like China and India. Writing in the April 5 issue of Nature, Dr. Stephan points out that a number of countries — including China, South Korea and Turkey — now offer cash rewards to scientists who get papers into high-profile journals. She has found these incentives set off a flood of extra papers submitted to those journals, with few actually being published in them. “It clearly burdens the system,” she said.

To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”

They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.

Such a shift would require scientists to surrender some of their most cherished practices — the priority rule, for example, which gives all the credit for a scientific discovery to whoever publishes results first. (Three centuries ago, Isaac Newton and Gottfried Leibniz were bickering about who invented calculus.) Dr. Casadevall thinks it leads to rival research teams’ obsessing over secrecy, and rushing out their papers to beat their competitors. “And that can’t be good,” he said.

To ease such cutthroat competition, the two editors would also change the rules for scientific prizes and would have universities take collaboration into account when they decide on promotions.

Ms. Bradford, of Science magazine, agreed. “I would agree that a scientist’s career advancement should not depend solely on the publications listed on his or her C.V.,” she said, “and that there is much room for improvement in how scientific talent in all its diversity can be nurtured.”

Even scientists who are sympathetic to the idea of fundamental change are skeptical that it will happen any time soon. “I don’t think they have much chance of changing what they’re talking about,” said Dr. Korn, of Harvard.

But Dr. Fang worries that the situation could become much more dire if nothing happens soon. “When our generation goes away, where is the new generation going to be?” he asked. “All the scientists I know are so anxious about their funding that they don’t make inspiring role models. I heard it from my own kids, who went into art and music respectively. They said, ‘You know, we see you, and you don’t look very happy.’ ”

Doubtful significance (World Economics Association)

by G M Peter Swann [gmpswann@yahoo.co.uk]
World Economics Association Newsletter 2(2), April 2012, page 6.

In the February issue of this newsletter, Steve Keen (2012) makes some very good points about the use of mathematics in economics. Perhaps we should say that the problem is not so much the use of mathematics as the abuse of mathematics.

A particular issue that worries me is when econometricians make liberal use of assumptions, without realising how strong these are.

Consider the following example. First, you are shown a regression summary of the relationship between Y and X, estimated from 402 observations. The conventional t-statistic for the coefficient on X is 3.0. How would you react to that?

Most economists would remark that t = 3.0 implies significance at the 1% level, which is a strong confirmation of the relationship. Indeed, many researchers mark significance at the 1% level with three stars!

Second, consider the scatter diagram below. This also shows two variables Y and X, and is also based on 402 observations. What does this say about the relationship between Y and X?

Figure 1

I have shown this diagram to several colleagues and students, and typical reactions are either that there is no relationship, or that the relationship could be almost anything.

But the surprising fact is that the data in Figure 1 are exactly the same data as used to estimate the regression summary described earlier. How can such an amorphous scatter of points represent a statistically significant relationship? It is the result of a standard assumption of OLS regression: that the explanatory variable(s) X is/are independent of the noise term u.

So long as this independence assumption is true, we can estimate the relationship with surprising precision. To see this, rewrite the conventional t-statistic as

t = ψ √(N − k),

where ψ is a signal-to-noise ratio (describing the clarity of the scatter-plot) and N − k is the number of degrees of freedom (Swann, 2012). This formula can be used for bivariate and multivariate models.

In Figure 1, ψ is 0.15, which is quite low, but N − k = 400, which is large enough to make t = 3.0. More generally, even if the signal-to-noise ratio is very low, so that the relationship between Y and X is imperceptible from a scatter-plot, we can always obtain a ‘significant’ t-statistic – so long as we have a large enough number of observations, and so long as the independence assumption is true. But there is something doubtful about this ‘significance’.
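A minimal simulation makes the point tangible. The sketch below is my own, not Swann’s code: it draws N = 402 observations with a true but tiny slope, so the signal-to-noise ratio is about 0.15, and computes the OLS t-statistic from first principles:

# Low signal-to-noise, high "significance": N = 402, k = 2 parameters.
import math
import random

random.seed(42)
N, k = 402, 2
beta = 0.15  # true slope; with unit-variance x and noise, psi ~ 0.15

x = [random.gauss(0, 1) for _ in range(N)]
y = [beta * xi + random.gauss(0, 1) for xi in x]

# OLS slope, intercept, and the slope's t-statistic
mx, my = sum(x) / N, sum(y) / N
sxx = sum((xi - mx) ** 2 for xi in x)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
a = my - b * mx
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
s2 = sum(r * r for r in resid) / (N - k)
t = b / math.sqrt(s2 / sxx)

print(f"slope = {b:.3f}, t = {t:.2f}")  # t is typically near 3.0

A scatter-plot of these (x, y) pairs looks like a formless cloud, just as in Figure 1, yet the t-statistic hovers around 3, ‘significant at the 1% level’, exactly as the formula t = ψ √(N − k) predicts.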

Is the independence assumption justified? In a context where data are noisy, where rough proxy variables are used, where endogeneity is pervasive, and so on, it does seem an exceptionally strong assumption.

What happens if we relax the independence assumption? When the signal to noise ratio is very low, the estimated relationship depends entirely on the assumption that replaces it. Swann (2012) shows that the relationship in Figure 1 could indeed be almost anything – depending on what we assume about the noise variable(s).
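
A small illustration of that dependence (again a sketch of mine, not taken from the paper): if the noise is allowed a correlation ρ with X rather than assumed independent of it, the OLS slope converges to the true slope plus cov(X, u)/var(X), so the same low-ψ data are consistent with very different estimated relationships depending on the assumed ρ.

    import numpy as np

    rng = np.random.default_rng(1)
    N, beta = 402, 0.15
    x = rng.standard_normal(N)
    for rho in (-0.5, 0.0, 0.5):    # assumed correlation between x and the noise
        u = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(N)  # sd(u) = 1
        y = beta * x + u
        b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
        print(f"rho = {rho:+.1f} -> OLS slope approx {b:+.2f}")  # near beta + rho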

Some have suggested that this is not a problem in practice, because signal to noise ratios are usually large enough to avoid this difficulty. But, on the contrary, some evidence suggests the problem is generally worse than indicated by Figure 1.

Swann (2012) examined 100 econometric studies taken from 20 leading economics journals, yielding a sample of 2220 parameter estimates and the corresponding signal to noise ratios. Focussing on the parameter estimates that are significant (at the 5% level or better), we find that almost 80% of those have a signal to noise ratio even lower than that in Figure 1.

In summary, it appears that the problem of ‘doubtful significance’ is pervasive. The great majority of ‘significant relationships’ in this sample would be imperceptible from the corresponding scatter-plot. The ‘significance’ indicated by a high t-statistic derives from the large number of observations and the (very strong) independence assumption.

References

Keen, S. (2012) “Maths for Pluralist Economics”, World Economics Association Newsletter 2(1), 10-11.

Swann, G.M.P. (2012) “Doubtful Significance”, working paper, available at: https://sites.google.com/site/gmpswann/doubtful-significance

[Editor’s note: If you are interested in this topic, you may also wish to read D.A. Hollanders, “Five methodological fallacies in applied econometrics”, real-world economics review, issue no. 57, 6 September 2011, pp. 115-126, http://www.paecon.net/PAEReview/issue57/Hollanders57.pdf]

Are You Prepared for Zombies? (American Anthropological Association)

by Joslyn O.

 Today’s guest blog post is by cultural anthropologist and AAA member, Chad Huddleston. He is an Assistant Professor at St. Louis University in the Sociology, Anthropology and Criminal Justice department.

Recently, a host of new shows, such as Doomsday Preppers on NatGeo and Doomsday Bunkers on Discovery Channel, has focused on people with a wide array of concerns about possible events that may threaten their lives. Both of these shows focus on what are called ‘preppers.’ While people who behaved this way in the past might have been called ‘survivalists,’ many ‘preppers’ have distanced themselves from that term because of its cultural baggage: the stereotype of anti-government, gun-loving, racist extremists most often associated with the fundamentalist (politically and religiously) right side of the spectrum.

I’ve been doing fieldwork with preppers for the past two years, focusing on a group called Zombie Squad. It is ‘the nation’s premier non-stationary cadaver suppression task force,’ as well as a grassroots 501(c)(3) charity organization. Zombie Squad’s story is that while the zombie removal business is generally slow, there is no reason to be unprepared. So, while it waits for the “zombpocalypse,” it focuses its time on disaster preparedness education for its membership and community.

The group’s position is that being prepared for zombies means that you are prepared for anything, especially those events that are much more likely than a zombie uprising – tornadoes, an interruption in services, ice storms, flooding, fires, and earthquakes.

For many in this group, Hurricane Katrina was the event that solidified their resolve to prep. They saw what we all saw – a natural disaster in which services were not available for most, leading to violence, death and chaos. Their argument is that the more prepared the public is before a disaster occurs, the fewer resources they will require from first responders and the agencies that come after them.

In fact, instead of being a victim of natural disaster, you can be an active responder yourself, if you are prepared. Prepare they do. Members are active in gaining knowledge of all sorts – first aid, communications, tactical training, self-defense, first responder disaster training, as well as many outdoor survival skills, like making fire, building shelters, hunting and filtering water.

This education happens individually, feeds directly into the online forum they maintain (which has just under 30,000 active members from all over the world), and continues at monthly local meetings across the country, as well as at annual national gatherings in southern Missouri, where members socialize, learn survival skills and practice sharpshooting.

Sound like those survivalists of the past? Emphatically no. Zombie Squad’s message is one of public education and awareness, very successful charity drives for a wide array of organizations, and inclusion of all ethnicities, genders, religions and politics. Yet the group is adamant about leaving politics and religion out of discussions of the group and of prepping. You will not find exclusionary language on their forum or in their media. That is not to say that individuals in the group do not have opinions on one side or the other of these issues, but those issues are not to be discussed within the Zombie Squad community.

Considering that the ‘future doom’ scenarios pushed on the shows mentioned above usually involve protecting yourself first from the disaster and then from the other people who have survived it, Zombie Squad is a refreshing twist on the ‘prepper’ discourse. After all, if a natural disaster were to befall your region, whom would you rather have knocking at your door: ‘raiders’ or your neighborhood Zombie Squad member?

And the answer is no: they don’t really believe in zombies.