The study identified that high biodiversity areas also had high linguistic diversity
The decline of linguistic and cultural diversity is linked to the loss of biodiversity, a study has suggested.
The authors said that 70% of the world’s languages were found within the planet’s biodiversity hotspots.
Data showed that as these important environmental areas were degraded over time, cultures and languages in the area were also being lost.
The results of the study have been published in the Proceedings of the National Academy of Sciences (PNAS).
“Biologists estimate annual loss of species at 1,000 times or more greater than historic rates, and linguists predict that 50-90% of the world’s languages will disappear by the end of the century,” the researchers wrote.
Lead author Larry Gorenflo from Penn State University, in the US, said previous studies had identified a geographical connection between the two, but did not offer the level of detail required.
Dr Gorenflo told BBC News that the limitation to the data was that either the languages were listed by country or there was a dot on the map to indicate the location.
“But what you did not know was if the area extended two kilometres or 200 kilometres, so you really did not get a sense of the extent of the language,” he explained.
“We used improved language data to really get a more solid sense of how languages and biodiversity co-occurred and an understanding of how geographically extensive the language was.”
He said the study achieved this by also looking at smaller areas with high biodiversity, such as national parks or other protected habitats.
“When we did that, not only did we get a sense of co-occurrence at a regional scale, but we also got a sense that co-occurrence was found at a much finer scale,” he said.
“We are not quite sure yet why this happens, but in a lot of cases it may well be that biodiversity evolved as part-and-parcel of cultural diversity, and vice versa.”
In their paper, the researchers pointed out that, out of the 6,900 or more languages spoken on Earth, more than 4,800 occurred in regions containing high biodiversity.
Dr Gorenflo described these locations as “very important landscapes” which were “getting fewer and fewer” but added that the study’s data could help provide long-term security.
“It provides a wonderful opportunity to integrate conservation efforts – you can have people who can get funding for biological conservation, and they can collaborate with people who can get funding for linguistic or cultural conservation,” he suggested.
“In the past, it was hard to get biologists to look at people.
“That has really changed dramatically in the past few years. One thing that a lot of biologists and ecologists are now seeing is that people are part of these ecosystems.”
ScienceDaily (Feb. 22, 2011) — A new paper by George Mason University researchers shows that ‘Climategate’ — the unauthorized release in late 2009 of stolen e-mails between climate scientists in the U.S. and United Kingdom — undermined belief in global warming and possibly also trust in climate scientists among TV meteorologists in the United States, at least temporarily.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication and Center for Social Science Research asked these meteorologists early in 2010, when news stories about the climate e-mails were breaking, several questions about their awareness of the issue, attention to the story and impact of the story on their beliefs about climate change. A large majority (82 percent) of the respondents indicated they had heard of Climategate, and nearly all followed the story at least “a little.”
Among the respondents who indicated that they had followed the story, 42 percent indicated that it made them somewhat or much more skeptical that global warming is occurring. These results stand in stark contrast to the findings of several independent investigations of the e-mails, conducted later, which concluded that no scientific misconduct had occurred and that nothing in the e-mails should cast doubt on the evidence that global warming is occurring.
The results, which were published in the journal Bulletin of the American Meteorological Society, also showed that the doubts were most pronounced among politically conservative weathercasters and those who either do not believe in global warming or do not yet know. The study showed that neither age nor professional credentials were a factor, but men, independent of political ideology and belief in global warming, were more likely than their female counterparts to say that Climategate made them doubt that global warming was happening.
“Our study shows that TV weathercasters — like most people — are motivated consumers of information in that their beliefs influence what information they choose to see, how they evaluate information, and the conclusions they draw from it,” says Ed Maibach, one of the researchers. “Although subsequent investigations showed that the climate scientists had done nothing wrong, the allegation of wrongdoing undermined many weathercasters’ confidence in the conclusions of climate science, at least temporarily.”
The poll of weathercasters was conducted as part of a larger study funded by the National Science Foundation on American television meteorologists. Maibach and others are now working with a team of TV meteorologists to test what audience members learn when weathercasters make efforts to educate their viewers about the relationship between the changing global climate and local weather conditions.
Ultimately, the team hopes to answer key research questions about how to help television meteorologists nationwide become an effective source of informal science education about climate change.
“Most members of the public consider television weather reporters to be a trusted source of information about global warming — only scientists are viewed as more trustworthy,” says Maibach. “Our research here is based on the premise that weathercasters, if given the opportunity and resources, can become an important source of climate change education for a broad cross section of Americans.”
ScienceDaily (Mar. 29, 2010) — In a time when only a handful of TV news stations employ a dedicated science reporter, TV weathercasters may seem like the logical people to fill that role, and in many cases they do.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication shows that two-thirds of weathercasters are interested in reporting on climate change, and many say they are already filling a role as an informal science educator.
“Our surveys of the public have shown that many Americans are looking to their local TV weathercaster for information about global warming,” says Edward Maibach, director of the Center for Climate Change Communication. “The findings of this latest survey show that TV weathercasters play — or can play — an important role as informal climate change educators.”
According to the survey, climate change is already one of the most common science topics TV weathercasters discuss — most commonly at speaking events, but also at the beginning or end of their on-air segments, on blogs and web sites, on the radio and in newspaper columns.
Weathercasters also indicated that they are interested in personalizing the story for their local viewers — reporting on local stories such as potential flooding/drought, extreme heat events, air quality and crops. About one-quarter of respondents said they have already seen evidence of climate change in their local weather patterns.
“Only about 10 percent of TV stations have a dedicated specialist to cover these topics,” says University of Texas journalism professor Kristopher Wilson, a collaborator on the survey. “By default, and in many cases by choice, science stories become the domain of the only scientifically trained person in the newsroom — weathercasters.”
Many of the weathercasters said that having access to resources such as climate scientists to interview and high-quality graphics and animations to use on-air would increase their ability to educate the public about climate change.
However, despite their interest in reporting more on this issue, the majority of weathercasters (61 percent) feel there is a lot of disagreement among scientists about the issue of global warming. While 54 percent indicated that global warming is happening, 25 percent indicated it isn’t, and 21 percent indicated they don’t know yet.
“A recent survey showed that more than 96 percent of leading climate scientists are convinced that global warming is real and that human activity is a significant cause of the warming,” says Maibach. “Climate scientists may need to make their case directly to America’s weathercasters, because these two groups appear to have a very different understanding about the scientific consensus on climate change.”
This survey is one part of a National Science Foundation-funded research project on meteorologists. Using this data, Maibach and his research team will next conduct a field test of 30-second, broadcast-quality educational segments that TV weathercasters can use in their daily broadcasts to educate viewers about the link between predicted (or current) extreme weather events in that media market and the changing global climate.
Ultimately, the team hopes to answer key research questions supporting efforts to activate TV meteorologists nationwide as an important source of informal science education about climate change.
Commentary by Alexandre A. Costa, one of Brazil’s most respected meteorologists, on the interview:
Climate Change Denial and the Organized Right (May 10, 2012 – posted on Facebook)
You have probably watched, or heard about, the interview recently aired on Jô Soares’s talk show with Mr. Ricardo Felício who, despite being a professor of Geography at USP, attacked the community of climate scientists, sketched out a series of conspiracy theories and made absurd claims with no scientific meaning whatsoever, such as the assertions that “there is no sea level rise,” “the greenhouse effect does not exist,” “the ozone layer does not exist” and “the Amazon Forest would regenerate within 20 years of being cleared,” reaching a peak when he offered a nonsensical explanation for the high temperature of Venus based on a completely absurd reading of the gas laws.
So what would lead a person who is, in principle, part of the academic community to such an absurd stance? At first I took it for media climbing. Since the man’s CV shows no minimally relevant scientific output, I simply assumed that bashing the “mainstream” was a way of drawing attention, attracting publicity, gaining fame, and so on. How naive of me.
Interviewer: “Do you know of any institution that supports your thinking? How does it work? And what does it do?” Ricardo Felício: “I recommend looking up, here in Brazil, the MSIa – Movimento de Solidariedade Ibero-Americana (Ibero-American Solidarity Movement).”
But who is this MSIa? A far-right group specializing in conspiracy theories and in attacks on Greenpeace (“a political instrument of the international oligarchies”), on the Landless Workers’ Movement (MST, “an instrument of war against the Brazilian State”), on the Foro de São Paulo (“it gathers revolutionary groups aiming to destabilize the Armed Forces”), on the Pastoral da Terra, and so on. I went to the organization’s website myself, and their latest effort is a campaign against the Truth Commission, in favor of the military (“Who benefits from a military crisis?”). Anyone who wants to know where these people stand need only check http://www.msia.org.br/
A little more searching and I found Ricardo Felício being quoted (“The UN has found a way to implement its global government, and the world will be run by pseudoscientific panels”) where? On the website http://www.midiasemmascara.org/ of the ultra-right-winger Olavo de Carvalho…
It seems symptomatic that, on the eve of the deadline for vetoing the ruralists’ Code, someone with this kind of affiliation (the MSIa is associated with the UDR) should come out saying that the Amazon can be cleared because it regenerates in twenty years… It is telling that the accusations of an “environmentalist,” “communist” or “international governance” agenda, or whatever other delusion climate change deniers raise when they try to politicize and ideologize the issue, only show where that politicization and ideologization actually comes from, and of what stripe.
As I often say, CO2 molecules have no ideology and absorb infrared radiation regardless of the existence not only of political positions but even of the humans who hold them. The increase in their concentration in Earth’s atmosphere could have no effect other than warming the global climate system. Denying such an obvious scientific truth only makes sense for those whose interests are at stake. And it becomes clear: this gentleman, academically a fraud, is in fact a right-wing militant. To paraphrase those who so admire him, he needs to appear in the media without the mask of “USP professor” or “climatologist,” but with his true face.
Alexandre A. Costa, Ph.D.
Full Professor
Master’s Program in Applied Physical Sciences
Universidade Estadual do Ceará
Climate Change Denial and the Organized Right – Part II: More Revelations (May 13, 2012 – posted on Facebook)
It is not hard to keep connecting the dots after Mr. Ricardo Felício’s appearance on Jô Soares’s show. Why would anyone be willing to expose himself to ridicule like that? How could someone, in the position of a doctor in Geography, USP professor and “climatologist,” murder not only recent scientific knowledge but basic laws of Physics and fundamental facts of Chemistry, Ecology, and so on? What would lead someone to insult so crudely the Brazilian and international academic community, and above all us, climate scientists?
What I intend to show is that to reach that point one needs motivations. And these, my friends, are not mere vanity or a craving for stardom. It is an agenda.
For those who want to keep tracing with me the motivation behind that interview, I ask that you visit, even if it gives them some audience, the video repository of the home-grown pop star of climate change denial at http://www.youtube.com/user/TvFakeClimate. There, the links point to the well-known site http://www.msia.org.br/ of the “Movimento de Solidariedade Íbero-Americana,” whose pompous name conceals LaRouchist neo-fascism, specialized in conspiracy theories and manipulation and, as its own site shows, a visceral enemy of the MST, the feminist movement, the human rights movement, the Truth Commission, etc.; to the no less right-wing http://www.midiaamais.com.br/, whose articles I could not manage to read to the end but which consist of right-wing attacks on Obama, ridicule of the movement of the Pinheirinho residents in São José dos Campos, opposition to the Supreme Court’s decision upholding the constitutionality of racial quotas and, of course, climate change denial and attacks on the IPCC, etc.; to an anti-environmentalist site called http://ecotretas.blogspot.com/, which in turn links to neo-fascist sites such as “vermelhos não” (http://vermelhosnao.blogspot.com.br/search/label/verdismo), which incidentally is running the “Não Veta, Dilma” (“Don’t Veto, Dilma”) campaign, to sites specializing in conspiracy theories such as http://paraummundolivre.blogspot.com.br/, and even to exotic rightists, advocates of restoring the monarchy in Portugal (http://quartarepublica.wordpress.com/) or neo-Salazarists (http://nacionalismo-de-futuro.blogspot.com.br/).
As I have said on several occasions, it is not one’s political or ideological choice that makes someone right or wrong about the climate question. I have colleagues in my research community who sympathize with the most varied political and ideological currents (which by itself would make it rather difficult for us to band together in a “conspiracy”… what was it again… ah, yes, to “achieve UN world governance through climate panels,” the kind of hysteria typical of the most unhinged sectors of the US right). The climate question is objective. The mechanisms controlling the climate are known, including the role of greenhouse gases. The measurements, the model results (dishonestly attacked by the interviewee) and the paleoclimate records all converge. And among all the possible hypotheses for the warming of the climate system, the anthropogenic contribution via greenhouse gas emissions was the only one left standing after every test. Recognizing this does not depend on ideology; one only has to open one’s eyes. The kind of public policy to be adopted to deal with the impacts, to adapt to the changes and to mitigate them, that indeed is ground on which political choices gain degrees of freedom.
The problem is that, for a certain political-ideological fringe, in this case the far right, there really is an incompatibility with any environmental agenda that might mean public control over private capital. There is also a need to win support by stroking the public’s hidden wishes (such as the wish that nothing needs to be done about climate change) and by appealing to nationalism (typical of the Mussolinis, the Hitlers, the Francos, the Salazars and of so many right-wing dictatorships in Latin America), even if that occasionally means adopting a falsely anti-imperialist discourse. With such “higher” goals, which include sabotaging the campaign for a presidential veto of the monstrosity that is the Forest Code approved by the deputies, why bother with a commitment to scientific truth? Why bother with ethics and respectful treatment of one’s colleagues in academia?
It is striking how those who accuse us of “fraud,” “conspiracy” and the like are precisely the ones who practice them. As I have put it in other texts on the subject, the pseudo-arguments presented by the deniers need to be scientifically debunked (and I have done so elsewhere), but as my colleague Michael Mann rightly reminds us, they are like the hydra. They always have more lies up their sleeve to throw around, and they have no concern whatsoever about presenting a coherent whole in opposition to the views of the scientific community. What interests them is sowing confusion, gaining political ground, delaying actions to protect climate stability, and buying time for those who bankroll them (even if there may be deniers not directly tied to the oil and other industries, the link between those industries and the coordinated worldwide campaign against climate science has already become evident). Pseudo-science and intellectual imposture are the hydra’s heads; the monster’s heart is the political-ideological agenda. But the sword of truth is long enough to strike it dead!
Alexandre A. Costa, Ph.D.
Full Professor
Master’s Program in Applied Physical Sciences
Universidade Estadual do Ceará
In Defense of Climate Science (May 10, 2012 – posted on Facebook)
I have been very worried about the recent attacks on climate science, among other reasons because they have formed a strange amalgam bringing together the Tea Party, the petrochemical industry, and people who seem to believe in a great imperialist conspiracy to keep the periphery of capitalism from “developing” by preventing it from burning its fossil fuel reserves, which, if you will forgive the word, is in itself an utterly narrow-minded view of “development.”
But this is not an ideological question, if only because, were it one, I would be far from Al Gore. It is a scientific question, because CO2 molecules have no ideology. What they do have, like other molecules (CH4 and water vapor itself), is a property that the majority gases in our atmosphere lack: a mode of oscillation whose frequency coincides with a region of the electromagnetic spectrum known as the infrared. Heat retention is a consequence of the presence of these gases (minority constituents though they are) in Earth’s atmosphere. Were it not for them, Earth would have a mean temperature of -18 degrees Celsius, in contrast with the moderate 15 we actually enjoy, to say nothing of their role in keeping it within mild limits. Earth is not Mercury, which, having no atmosphere, freely returns the energy absorbed from the Sun on its daytime side, leading to temperature contrasts of 430 degrees during the day and -160 degrees at night. Fortunately, neither is it Venus, whose cloud cover means less solar energy reaches its surface than reaches Earth’s, but whose greenhouse effect, caused by an atmosphere composed almost exclusively of CO2, raises its temperature to a nearly constant 480 degrees.
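The -18 degree figure quoted above is the standard radiative-balance (“effective temperature”) estimate for an Earth without greenhouse gases. As a hedged illustration (the post itself does not show the calculation), it follows from equating absorbed sunlight with emitted thermal radiation:

\[
T_e = \left[\frac{S_0\,(1-\alpha)}{4\sigma}\right]^{1/4}
    \approx \left[\frac{1361 \times (1-0.3)}{4 \times 5.67\times 10^{-8}}\right]^{1/4}
    \approx 255\ \mathrm{K} \approx -18\,^{\circ}\mathrm{C},
\]

where \(S_0 \approx 1361\ \mathrm{W\,m^{-2}}\) is the solar constant, \(\alpha \approx 0.3\) is Earth’s albedo and \(\sigma\) is the Stefan-Boltzmann constant. The roughly 33-degree gap between this value and the observed mean of about 15 degrees is the natural greenhouse effect the post refers to.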
To ignore these simple scientific ideas, namely that CO2 is a greenhouse gas (known and measured by Tyndall, Arrhenius and others since the 19th century), with a mechanism well explained by the physics of its molecular structure, and to ignore the well-known global effect CO2 has on a neighboring planet, well established in astronomy since the late Carl Sagan, makes no sense, especially in academia, where some of the most vocal deniers are found. I would like to remind them of something basic about the scientific method. On the one hand, science has no dogma and no definitive truths; its truths are always, by construction, partial and provisional (a good thing, or it would become something dull and tedious like, say, a religion). On the other hand, scientific knowledge is cumulative and, in that sense, cannot walk backwards! Only when a theory fails is a new one justified, and the new one cannot be merely the negation of the old: it must be able to reproduce all of its successes (as with Classical Mechanics and Relativity, the latter reducing to the former at low speeds).
It is not a matter of belief. “Monotony” aside, this is well-established, well-understood science, as much so as Universal Gravitation (also “just” a theory) or the Evolution of Species.
INJUSTICE, DISRESPECT AND UNDERESTIMATION
Climate scientists have been under attack on the basis of factoids that at no point resemble the reality of our field. No science today is as public and open. Anyone can easily obtain, in most cases directly over the internet, observed climate data demonstrating global warming clearly (www.cru.uea.ac.uk/cru/data/, among others), the modeling data being generated now that will feed into the IPCC’s 5th assessment report (http://cmip-pcmdi.llnl.gov/cmip5/data_portal.html), or paleoclimate proxy data used to analyze past climates (www.ncdc.noaa.gov). Anyone can download the IPCC reports at www.ipcc.ch and follow the references, which are peer-reviewed and published in their overwhelming majority, especially in the case of the Working Group that deals with the Physical Science Basis, in high-impact journals, both general (Science, Nature) and specialized. I doubt that, in our universities, full of laboratories with private contracts, whether in materials engineering or in biochemistry, there is any field so open, with the generosity to sit down at the table, share data, survey the state of the art in its science and collectively write a synthesis report. I doubt it! I challenge anyone to show otherwise!
We scientists who take part in these panels are not “government representatives.” Nothing is created or invented in these panels beyond a synthesis of science that is produced independently and published in the peer-reviewed literature. Those who belong to the academic community could easily inform themselves better, with Brazilian colleagues who have taken part and still take part in the IPCC and PBMC initiatives, about how these panels work before voicing an opinion, so that they do not end up, in practice, defaming what they do not know. Some people, without the slightest critical attitude toward the IPCC’s detractors, simply repeat their verbiage, when they could instead be skeptical of the “skeptics.”
But they are not. At no point do they question the real motivations of the two or three (fortunately, they are that rare) who adopt the lamentable stance of anti-science denial, whether because they are openly corrupt servants of the petrochemical industry or simply because they have a vanity too large for the secondary role they would play if, like us, they were expending enormous energy, generally in near anonymity, to lay brick upon brick in the edifice of climate science. One must learn to distinguish honest, genuine skepticism, which is healthy in science and consistent with sincere doubt and a critical attitude, from religious denial, based on faith and on the blind need to defend a given point of view regardless of whether it has any basis in reality, and above all from plain scoundrelism, which is what some of the deniers promote. The possible “success” of these ideas with the public is, for me, a matter for social psychology, but the best analogy I have is the popularity of religious ideas: in general, comforting lies preferred over unpleasant truths.
True skepticism is what took the Berkeley physicists as far as they went (http://www.berkeleyearth.org/index.php). Initially questioning the results obtained by our community, they armed themselves with an enormous worldwide temperature database, larger than those held by the UK’s Hadley Centre and by NASA. They tested other methodologies and even went so far as to exclude the weather stations used by our research centers. The initial stance of Richard Muller, the initiative’s founder, was so skeptical of our results that he managed to raise funds from the notorious Koch Foundation, openly hostile to climate science. And what did Muller and his partners find? The same result we already knew. The Earth is warming, and this warming accelerated markedly in the last decades of the 20th century. The warming is approaching one degree and is therefore far above all the natural fluctuations recorded since instrumental records began. In fact, it also confirmed something else we knew: that the data from the University of East Anglia (the very data behind the farce mounted under the grandiose name of “Climategate,” whose scientists were persecuted and whose reputations were ignominiously attacked, with repercussions on their careers and personal lives) contain an error… on the low side! The warming suggested by the CRU/UEA data is a tenth of a degree lower than that of the other data sources and, of course, none of us accuses them of dishonesty for that.
Another imposture, and unfortunately, despite the harshness of the term, I think this is a case where it applies, is underestimating our community’s intelligence while remaining ignorant of the material it produces. The IPCC’s 4th assessment report already contains a chapter devoted exclusively to paleoclimatology, that is, to past climates. I have personally devoted great effort to analyzing proxies of past climate and to modeling past climate conditions. There has been a permanent concern, since the very first IPCC report, with discerning the natural signal and separating the anthropogenic signal from it. To that end, the role of variations in solar activity, volcanic emissions, and so on is assessed. We have already evaluated the possible natural influences and ruled them out as the cause of the observed warming.
In this sense, there is no room for sophistry and evasion. Regarding the paleoclimate records, which can retrace the history of temperature and of greenhouse gas concentrations from 800,000 years ago to the present, we all know that, in the past, a small warming of the planet preceded the rise in greenhouse gas concentrations. That happened before the end of every glacial period. But it is obtuse reasoning to conclude from this that CO2 plays no role or, in the deniers’ words, that it is “consequence, not cause.” There are many feedback processes in the climate system, and this is one of the best examples. The subtle variations in insolation, and in its distribution over Earth’s surface, associated with the orbital cycles are, as everyone knows, far too small to explain the large temperature differences between glacial periods (“ice ages”) and interglacials (the shorter warm periods between them). But a subtle warming, after a few centuries, proved sufficient to increase natural emissions of CO2 and methane, which produce a greenhouse effect and amplify the process. Under conditions free of human action, this feedback was only reined in when the orbital conditions changed again, leading to a subtle cooling, which induced CO2 uptake by the Earth system, which in turn amplified the cooling, and so on.
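As a hedged sketch of the amplification just described (this equation is not in the original post), the usual linear-feedback bookkeeping can be written as:

\[
\Delta T = \frac{\Delta T_0}{1 - f}, \qquad 0 < f < 1,
\]

where \(\Delta T_0\) is the small initial, orbitally driven temperature change, \(f\) is the fraction of that change returned as additional forcing (here via the CO2 and CH4 released or absorbed by the ocean and land as temperature shifts), and \(\Delta T\) is the total change once the feedback has played out. A value of \(f = 0.5\), for example, doubles the initial perturbation, and the same algebra applies to the cooling branch of the cycle.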
But it is not because people die of cancer and heart attacks that responsibility cannot be assigned to a murderer! Because people die naturally of strokes, does anyone think it possible to say that “a gunshot cannot kill anyone”? Or that no one should be tried for murder anymore? In the past, a small warming was needed to trigger natural emissions and a rise in CO2 concentration, and only then did the warming accelerate. Today there is an independent source of CO2, foreign to the natural cycles, and that source is the burning of fossil fuels! I should stress, moreover, that even the isotopic analysis (the composition differs between fossil fuels and other sources) is clear: the excess CO2 in Earth’s atmosphere does indeed come mostly from oil, coal and natural gas! A minimum of genuine scientific scrutiny makes clear that, today, the rise in atmospheric CO2 concentrations is eminently anthropogenic and that this is what has been driving the observed climate changes. One can no longer hide the sun, or rather hide the greenhouse gases, behind a sieve! The paleoclimate records show that the current warming is unprecedented in the last 2,500 years. They show that the current CO2 concentration is 110 ppm above pre-industrial levels and nearly 100 ppm above anything seen in the last 800,000 years. They show that this number is larger than the difference in CO2 concentration between the interglacials and the “ice ages,” and that this does indeed make a great difference to the climate.
WHAT THE REAL ERRORS ARE
Some people call themselves skeptical, critical and distrustful of the majority of our community of climate scientists, yet fail to see the fundamental mistake they make: their complete lack of skepticism, critical sense and distrust toward those who attack us. The posture of those who fight climate science with funding from the petrochemical industry, or in association with the most reactionary party and media sectors, is self-explanatory: covering up reality is what serves their interests. But it is not only that. They range from people who receive oil-industry money directly to blowhards who have long ceased any real scientific activity in the field and who, unable to stay in the spotlight by working seriously to advance our science, by tackling the genuine uncertainties, by helping collect data or improve methods and models, attack the rest of the community simply to keep the spotlight on themselves. Strange and showy, like peacock feathers; prosaic, like the evolutionary mechanisms that produced those feathers. Hence the need also to confront the views of those who give this attack a false “left-wing” veneer, since they resort to conspiracy theories, a pathological distortion of critical reasoning. Fighting the wrong target with the wrong weapon is worse than disarming before the fight.
Is the IPCC perfect? Of course not. It has made mistakes. But do you want to know what they actually are? One thing needs to be clear to everyone: the IPCC’s assessments tend to be conservative. The temperature projections made for the years after 2000 are essentially on target, but do you know what has happened with the projections of sea level rise and Arctic melting? They were underestimated. That’s right: the true scenario is more serious than the IPCC’s 4th report indicates. And again, this is not a political matter but a consequence of the limitations, at the time, of the cryosphere models, which were unable to account for important processes driving the melting. Drawing on the papers that have been published in the meantime, the 5th report will probably correct these limitations and present a picture closer to the real gravity of the problem when it is published in 2013-2014.
WHAT IS THE REAL IDEOLOGICAL QUESTION?
It makes no sense to “believe,” or not, in gravity, in evolution or in the greenhouse effect. This is not an “ideological option” (even though, in the US, there is a strong correlation between ideology and attitudes toward science among the most reactionary Republican voters, who listen to the detractors of climate science and who also want Darwin out of the schools).
The real ideological question is that climate change is a process of extreme inequality, from its roots to its impacts. Those who benefited most from greenhouse gas emissions were, and continue to be, the ruling classes of the core capitalist countries. Together with the mega-conglomerates of finance capital, the petrochemical industry, the mining sector (including coal mining), the energy sector and others concentrated wealth while using the atmosphere as their great garbage can. Even more than the current carbon “footprint” (which is itself extremely unequal if we compare Americans, Europeans and Australians on one side with Africans on the other), the “historical footprint” (that is, what has already been emitted, the accumulation of each country’s emissions) is more unequal still, making Europe and, after it, the US the great historical emitters.
Cruelly, by contrast, the impacts of climate change will fall on the poorest countries, on small nations, and above all on the poor of poor countries, on the most vulnerable. Loss of territory in island nations; water and food security problems in semi-arid regions (so vast in the cradle of our species, the African continent); the effects of severe events (which, on very clear physical grounds, should become more frequent on a warmer planet); damage to coastal marine ecosystems and forests, hitting fishing and gathering activities; the collapse of traditional agriculture… and where does all of this land? On those on the lower floors! Those on the upper floors talk about “adaptation” and have far more instruments with which to adapt to the changes. For us, in this case, the interest lies in being conservative about the climate and putting the brakes on this clumsy, disorderly “experiment” of altering the chemical composition of Earth’s atmosphere and the planetary energy balance! For the majority of the 7 billion inhabitants of this sphere, climate stability matters!
Some of the richest, in fact, see global warming as an “opportunity”… Of course: the “opportunity” to expand agribusiness into the future farmland of northern Canada and Siberia and to drill for oil in the ocean that will open up as the Arctic ice retreats.
It is therefore necessary to see that there is a real imposture at large, and that science needs to be defended. A rock is a rock; a tree is a tree; a CO2 molecule is a CO2 molecule, regardless of ideology. But those at the bottom, ourselves included, will only be able to arm themselves to transform society if they are well informed, and for that it is necessary to fight the absurdities uttered by the detractors of climate science.
Alexandre Costa holds a bachelor’s degree and a master’s degree in Physics from the Universidade Federal do Ceará and a Ph.D. in Atmospheric Sciences from Colorado State University, with postdoctoral work at Yale University, and has published in several scientific journals, including Science, the Journal of the Atmospheric Sciences and Atmospheric Research. He is a CNPq research productivity fellow and a member of the Painel Brasileiro de Mudanças Climáticas (Brazilian Panel on Climate Change).
ScienceDaily (Oct. 16, 2009) — Worried about climate change and want to learn more? You probably aren’t watching television then. A new study by George Mason University Communication Professor Xiaoquan Zhao suggests that watching television has no significant impact on viewers’ knowledge about the issue of climate change. Reading newspapers and using the web, however, seem to contribute to people’s knowledge about this issue.
The study, “Media Use and Global Warming Perceptions: A Snapshot of the Reinforcing Spirals,” looked at the relationship between media use and people’s perceptions of global warming. The study asked participants how often they watch TV, surf the Web, and read newspapers. They were also asked about their concern and knowledge of global warming and specifically its impact on the polar regions.
“Unlike many other social issues with which the public may have first-hand experience, global warming is an issue that many come to learn about through the media,” says Zhao. “The primary source of mediated information about global warming is the news.”
The results showed that people who read newspapers and use the Internet more often are more likely to be concerned about global warming and believe they are better educated about the subject. Watching more television, however, did not seem to help.
He also found that individuals concerned about global warming are more likely to seek out information on this issue from a variety of media and nonmedia sources. Other forms of media, such as the Oscar-winning documentary “An Inconvenient Truth” and the blockbuster thriller “The Day After Tomorrow,” have played important roles in advancing the public’s interest in this domain.
Politics also seemed to have an influence on people’s perceptions about the science of global warming. Republicans are more likely to believe that scientists are still debating the existence and human causes of global warming, whereas Democrats are more likely to believe that a scientific consensus has already been achieved on these matters.
“Some media forms have clear influence on people’s perceived knowledge of global warming, and most of it seems positive,” says Zhao. “Future research should focus on how to harness this powerful educational function.”
ScienceDaily (Nov. 21, 2011) — People who believe there is a lot of disagreement among scientists about global warming tend to be less certain that global warming is happening and less supportive of climate policy, researchers at George Mason, San Diego State, and Yale Universities report in a new study published in the journal Nature Climate Change.
A recent survey of climate scientists conducted by researchers at the University of Illinois found near unanimous agreement among climate scientists that human-caused global warming is happening.
This new George Mason University study, however, using results from a national survey of the American public, finds that many Americans believe that most climate scientists actually disagree about the subject.
In the national survey conducted in June 2010, two-thirds of respondents said they either believed there is a lot of disagreement among scientists about whether or not global warming is happening (45 percent), that most scientists think it is not happening (5 percent), or that they did not know enough to say (16 percent). These respondents were less likely to support climate change policies and more likely to view climate change as a lower priority.
By contrast, survey respondents who correctly understood that there is widespread agreement about global warming among scientists were themselves more certain that it is happening, and were more supportive of climate policies.
“Misunderstanding the extent of scientific agreement about climate change is important because it undermines people’s certainty that climate change is happening, which in turn reduces their conviction that America should find ways to deal with the problem,” says Edward Maibach, director of the Center for Climate Change Communication at George Mason University.
Maibach argues that a campaign should be mounted to correct this misperception. “It is no accident that so many Americans misunderstand the widespread scientific agreement about human-caused climate change. A well-financed disinformation campaign deliberately created a myth about there being lack of agreement. The climate science community should take all reasonable measures to put this myth to rest.”
ScienceDaily (Oct. 14, 2010) — Sixty-three percent of Americans believe that global warming is happening, but many do not understand why, according to a national study conducted by researchers at Yale University.
The report titled “Americans’ Knowledge of Climate Change” found that only 57 percent know what the greenhouse effect is, only 45 percent of Americans understand that carbon dioxide traps heat from the Earth’s surface, and just 50 percent understand that global warming is caused mostly by human activities. Large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans cause global warming. Meanwhile, 75 percent of Americans have never heard of the related problems of ocean acidification or coral bleaching.
However, many Americans do understand that emissions from cars and trucks and the burning of fossil fuels contribute to global warming and that a transition to renewable energy sources is an important solution.
Americans also recognize their own limited understanding. Only 1 in 10 say that they are “very well-informed” about climate change, and 75 percent say they would like to know more about the issue. Likewise, 75 percent say that schools should teach children about climate change and 68 percent would welcome a national program to teach Americans more about the issue.
“This study demonstrates that Americans need to learn more about the causes, impacts and potential solutions to global warming,” said study director Anthony Leiserowitz of Yale University. “But it also shows that Americans want to learn more about climate change in order to make up their minds and take action.”
The online survey was conducted by Knowledge Networks from June 24 to July 22, 2010, with 2,030 American adults 18 and older. The margin of sampling error is plus or minus 2 percentage points, with 95 percent confidence.
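As a hedged check of the quoted sampling error (the calculation is not shown in the article), the textbook margin of error for a simple random sample of this size comes out to roughly the same figure:

\[
\mathrm{MOE}_{95\%} \approx 1.96\sqrt{\frac{p(1-p)}{n}}
\le 1.96\sqrt{\frac{0.5 \times 0.5}{2030}} \approx 0.022,
\]

that is, about plus or minus 2 percentage points in the worst case \(p = 0.5\), before any design-effect adjustment for the online panel.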
ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.
“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”
The study showed that high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.
The diminished concern and sense of responsibility flies in the face of awareness campaigns about climate change, such as in the movies An Inconvenient Truth and Ice Age: The Meltdown and in the mainstream media’s escalating emphasis on the trend.
The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.
Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.
“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.
But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.
“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”
Measuring knowledge about global warming is a tricky business, Kellstedt adds.
“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.
“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”
Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.
ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.
Potomac River near Washington DC. The top three issues that the US public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. (Credit: Michele Hogan)
“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”
Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.
A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. In the survey, global warming ranks eighth in importance.
“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.
Additionally, Konisky and his colleagues found that the best predictor of individuals’ environmental preferences is their political attributes. They examined the relationship between party identification and political ideology and support for action to address environmental problems.
“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”
Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.
ScienceDaily (May 8, 2012) — Americans’ support for government action on global warming remains high but has dropped during the past two years, according to a new survey by Stanford researchers in collaboration with Ipsos Public Affairs. Political rhetoric and cooler-than-average weather appear to have influenced the shift, but economics doesn’t appear to have played a role.
The survey directed by Jon Krosnick, a senior fellow at the Stanford Woods Institute for the Environment, shows that support for a range of policies intended to reduce future climate change dropped by an average of 5 percentage points per year between 2010 and 2012.
In a 2010 Stanford survey, more than three-quarters of respondents expressed support for mandating more efficient and less polluting cars, appliances, homes, offices and power plants. Nearly 90 percent of respondents favored federal tax breaks to spur companies to produce more electricity from water, wind and solar energy. On average, 72 percent of respondents supported government action on climate change in 2010. By 2012, that support had dropped to 62 percent.
The drop was concentrated among Americans who distrust climate scientists, even more so among such people who identify themselves as Republicans. Americans who do not trust climate science were especially aware of and influenced by recent shifts in world temperature, and 2011 was tied for the coolest of the last 11 years.
Krosnick pointed out that during the recent campaign, all but one Republican presidential candidate expressed doubt about global warming, and some urged no government action to address the issue. Rick Santorum described belief in climate change as a “pseudo-religion,” while Ron Paul called it a “hoax.” Mitt Romney, the apparent Republican nominee, has said, “I can tell you the right course for America with regard to energy policy is to focus on job creation and not global warming.”
The Stanford-Ipsos study found no evidence that the decline in public support for government action was concentrated among respondents who lived in states struggling the most economically.
The study found that, overall, the majority of Americans continue to support many specific government actions to mitigate global warming’s effect. However, most Americans remain opposed to consumer taxes intended to decrease public use of electricity and gasoline.
Author Uncovers DNA Links Between Members of Tribe
By Jon Entine
Published May 04, 2012, issue of May 11, 2012.
In his new book, “Legacy: A Genetic History of the Jewish People,” Harry Ostrer, a medical geneticist and professor at Albert Einstein College of Medicine in New York, claims that Jews are different, and the differences are not just skin deep. Jews exhibit, he writes, a distinctive genetic signature. Considering that the Nazis tried to exterminate Jews based on their supposed racial distinctiveness, such a conclusion might be a cause for concern. But Ostrer sees it as central to Jewish identity.
“Who is a Jew?” has been a poignant question for Jews throughout our history. It evokes a complex tapestry of Jewish identity made up of different strains of religious beliefs, cultural practices and blood ties to ancient Palestine and modern Israel. But the question, with its echoes of genetic determinism, also has a dark side.
Geneticists have long been aware that certain diseases, from breast cancer to Tay-Sachs, disproportionately affect Jews. Ostrer, who is also director of genetic and genomic testing at Montefiore Medical Center, goes further, maintaining that Jews are a homogeneous group with all the scientific trappings of what we used to call a “race.”
For most of the 3,000-year history of the Jewish people, the notion of what came to be known as “Jewish exceptionalism” was hardly controversial. Because of our history of inmarriage and cultural isolation, imposed or self-selected, Jews were considered by gentiles (and usually referred to themselves) as a “race.” Scholars from Josephus to Disraeli proudly proclaimed their membership in “the tribe.”
Legacy: A Genetic History of the Jewish People
By Harry Ostrer
Oxford University Press, 288 Pages, $24.95
Ostrer explains how this concept took on special meaning in the 20th century, as genetics emerged as a viable scientific enterprise. Jewish distinctiveness might actually be measurable empirically. In “Legacy,” he first introduces us to Maurice Fishberg, an upwardly mobile Russian-Jewish immigrant to New York at the fin de siècle. Fishberg fervently embraced the anthropological fashion of the era, measuring skull sizes to explain why Jews seemed to be afflicted with more diseases than other groups — what he called the “peculiarities of the comparative pathology of the Jews.” It turns out that Fishberg and his contemporary phrenologists were wrong: Skull shape provides limited information about human differences. But his studies ushered in a century of research linking Jews to genetics.
Ostrer divides his book into six chapters representing the various aspects of Jewishness: Looking Jewish, Founders, Genealogies, Tribes, Traits and Identity. Each chapter features a prominent scientist or historical figure who dramatically advanced our understanding of Jewishness. The snippets of biography lighten a dense forest of sometimes-obscure science. The narrative, which consists of a lot of potboiler history, is a slog at times. But for the specialist and anyone touched by the enduring debate over Jewish identity, this book is indispensable.
“Legacy” may cause its readers discomfort. To some Jews, the notion of a genetically related people is an embarrassing remnant of early Zionism that came into vogue at the height of the Western obsession with race, in the late 19th century. Celebrating blood ancestry is divisive, they claim: The authors of “The Bell Curve” were vilified 15 years ago for suggesting that genes play a major role in IQ differences among racial groups.
Furthermore, sociologists and cultural anthropologists, a disproportionate number of whom are Jewish, ridicule the term “race,” claiming there are no meaningful differences between ethnic groups. For Jews, the word still carries the especially odious historical association with Nazism and the Nuremberg Laws. They argue that Judaism has morphed from a tribal cult into a worldwide religion enhanced by thousands of years of cultural traditions.
Is Judaism a people or a religion? Or both? The belief that Jews may be psychologically or physically distinct remains a controversial fixture in the gentile and Jewish consciousness, and Ostrer places himself directly in the line of fire. Yes, he writes, the term “race” carries nefarious associations of inferiority and ranking of people. Anything that marks Jews as essentially different runs the risk of stirring either anti- or philo-Semitism. But that doesn’t mean we can ignore the factual reality of what he calls the “biological basis of Jewishness” and “Jewish genetics.” Acknowledging the distinctiveness of Jews is “fraught with peril,” but we must grapple with the hard evidence of “human differences” if we seek to understand the new age of genetics.
Although he readily acknowledges the formative role of culture and environment, Ostrer believes that Jewish identity has multiple threads, including DNA. He offers a cogent, scientifically based review of the evidence, which serves as a model of scientific restraint.
“On the one hand, the study of Jewish genetics might be viewed as an elitist effort, promoting a certain genetic view of Jewish superiority,” he writes. “On the other, it might provide fodder for anti-Semitism by providing evidence of a genetic basis for undesirable traits that are present among some Jews. These issues will newly challenge the liberal view that humans are created equal but with genetic liabilities.”
Jews, he notes, are one of the most distinctive population groups in the world because of our history of endogamy. Jews — Ashkenazim in particular — are relatively homogeneous despite the fact that they are spread throughout Europe and have since immigrated to the Americas and back to Israel. The Inquisition shattered Sephardi Jewry, leading to far more incidences of intermarriage and to a less distinctive DNA.
In traversing this minefield of the genetics of human differences, Ostrer bolsters his analysis with volumes of genetic data, which are both the book’s greatest strength and its weakness. Two complementary books on this subject — my own “Abraham’s Children: Race, Identity, and the DNA of the Chosen People” and “Jacob’s Legacy: A Genetic View of Jewish History” by Duke University geneticist David Goldstein, who is well quoted in both “Abraham’s Children” and “Legacy” — are more narrative driven, weaving history and genetics, and are consequently much more congenial reads.
The concept of the “Jewish people” remains controversial. The Law of Return, which establishes the right of Jews to come to Israel, is a central tenet of Zionism and a founding legal principle of the State of Israel. The DNA that tightly links Ashkenazi, Sephardi and Mizrahi, three prominent culturally and geographically distinct Jewish groups, could be used to support Zionist territorial claims — except, as Ostrer points out, some of the same markers can be found in Palestinians, our distant genetic cousins, as well. Palestinians, understandably, want their own right of return.
That disagreement over the meaning of DNA also pits Jewish traditionalists against a particular strain of secular Jewish liberals that has joined with Arabs and many non-Jews to argue for an end to Israel as a Jewish nation. Their hero is Shlomo Sand, an Austrian-born Israeli historian who reignited this complex controversy with the 2008 publication of “The Invention of the Jewish People.”
Sand contends that Zionists who claim an ancestral link to ancient Palestine are manipulating history. But he has taken his thesis from novelist Arthur Koestler’s 1976 book, “The Thirteenth Tribe,” which was part of an attempt by post-World War II Jewish liberals to reconfigure Jews not as a biological group, but as a religious ideology and ethnic identity.
The majority of the Ashkenazi Jewish population, as Koestler, and now Sand, writes, are not the children of Abraham but descendants of pagan Eastern Europeans and Eurasians, concentrated mostly in the ancient Kingdom of Khazaria in what is now Ukraine and Western Russia. The Khazarian nobility converted during the early Middle Ages, when European Jewry was forming.
Although scholars challenged Koestler’s and now Sand’s selective manipulation of the facts — the conversion was almost certainly limited to the tiny ruling class and not to the vast pagan population — the historical record has been just fragmentary enough to titillate determined critics of Israel, who turned both Koestler’s and Sand’s books into roaring best-sellers.
Fortunately, re-creating history now depends not only on pottery shards, flaking manuscripts and faded coins, but on something far less ambiguous: DNA. Ostrer’s book is an impressive counterpoint to the dubious historical methodology of Sand and his admirers. And, as a co-founder of the Jewish HapMap — the study of haplotypes, or blocks of genetic markers, that are common to Jews around the world — he is well positioned to write the definitive response.
In accord with most geneticists, Ostrer firmly rejects the fashionable postmodernist dismissal of the concept of race as genetically naive, opting for a more nuanced perspective.
When the human genome was first mapped a decade ago, Francis Collins, then head of the National Human Genome Research Institute, said: “Americans, regardless of ethnic group, are 99.9% genetically identical.” Added J. Craig Venter, who at the time was chief scientist at Celera Genomics, the private firm that helped sequence the genome: “Race has no genetic or scientific basis.” Those declarations appeared to suggest that “race,” or the notion of distinct but overlapping genetic groups, is “meaningless.”
But Collins and Venter have issued clarifications of their much-misrepresented comments. Almost every minority group has faced, at one time or another, being branded as racially inferior based on a superficial understanding of how genes peculiar to its population work. The inclination by politicians, educators and even some scientists to underplay our separateness is certainly understandable. But it’s also misleading. DNA ensures that we differ not only as individuals, but also as groups.
However slight the differences (and geneticists now believe that they are significantly greater than 0.1%), they are defining. That 0.1% contains some 3 million nucleotide pairs in the human genome, and these determine such things as skin or hair color and susceptibility to certain diseases. They contain the map of our family trees back to the first modern humans.
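As a quick back-of-the-envelope check of that figure (a minimal sketch; the roughly 3-billion-base-pair size of the human genome is the standard textbook estimate, not a number drawn from Ostrer’s book):

genome_pairs = 3_000_000_000   # approximate nucleotide pairs in the human genome (standard estimate)
fraction_differing = 0.001     # the oft-cited 0.1% difference between any two people
print(int(genome_pairs * fraction_differing))  # about 3,000,000 nucleotide pairs, as stated above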
Both the human genome project and disease research rest on the premise of finding distinguishable differences between individuals and often among populations. Scientists have ditched the term “race,” with all its normative baggage, and adopted more neutral terms, such as “population” and “cline,” which carry much of the same meaning. Boiled down to its essence, race equates to “region of ancestral origin.”
Ostrer has devoted his career to investigating these extended family trees, which help explain the genetic basis of common and rare disorders. Today, Jews remain identifiable in large measure by the 40 or so diseases we disproportionately carry, the inescapable consequence of inbreeding. He traces the fascinating history of numerous “Jewish diseases,” such as Tay-Sachs, Gaucher, Niemann-Pick, Mucolipidosis IV, as well as breast and ovarian cancer. Indeed, 10 years ago I was diagnosed as carrying one of the three genetic mutations for breast and ovarian cancer that mark my family and me as indelibly Jewish, prompting me to write “Abraham’s Children.”
Like East Asians, the Amish, Icelanders, Aboriginals, the Basque people, African tribes and other groups, Jews have remained isolated for centuries because of geography, religion or cultural practices. It’s stamped on our DNA. As Ostrer explains in fascinating detail, threads of Jewish ancestry link the sizable Jewish communities of North America and Europe to Yemenite and other Middle Eastern Jews who have relocated to Israel, as well as to the black Lemba of southern Africa and to India’s Cochin Jews. But, in a twist, the links include neither the Bene Israel of India nor Ethiopian Jews. Genetic tests show that both groups are converts, contradicting their founding myths.
Why, then, are Jews so different looking, usually sharing the characteristics of the surrounding populations? Think of red-haired Jews, Jews with blue eyes or the black Jews of Africa. Like any cluster — a genetic term Ostrer uses in place of the more inflammatory “race” — Jews throughout history moved around and fooled around, although mixing occurred comparatively infrequently until recent decades. Although there are identifiable gene variations that are common among Jews, we are not a “pure” race. The time machine of our genes may show that most Jews have a shared ancestry that traces back to ancient Palestine but, like all of humanity, Jews are mutts.
About 80% of Jewish males and 50% of Jewish females trace their ancestry back to the Middle East. The rest entered the “Jewish gene pool” through conversion or intermarriage. Those who did intermarry often left the faith in a generation or two, in effect pruning the Jewish genetic tree. But many converts became interwoven into the Jewish genealogical line. Reflect on the iconic convert, the biblical Ruth, who married Boaz and became the great-grandmother of King David. She began as an outsider, but you don’t get much more Jewish than the bloodline of King David!
To his credit, Ostrer also addresses the third rail of discussions about Jewishness and race: the issue of intelligence. Jews were latecomers to the age of freethinking. While the Enlightenment swept through Christian Europe in the 17th century, the Haskalah did not gather strength until the early 19th century. By the beginning of the new millennium, however, Jews were thought of as among the smartest people on earth. The trend is most prominent in America, which has the largest concentration of Jews outside Israel and a history of tolerance.
Although Jews make up less than 3% of the population, they have won more than 25% of the Nobel Prizes awarded to American scientists since 1950. Jews also account for 20% of this country’s chief executives and make up 22% of Ivy League students. Psychologists and educational researchers have pegged their average IQ at 107.5 to 115, with their verbal IQ at more than 120, a stunning standard deviation above the average of 100 found in those of European ancestry. Like it or not, the IQ debate will become an increasingly important issue going forward, as medical geneticists focus on unlocking the mysteries of the brain.
Many liberal Jews maintain, at least in public, that the plethora of Jewish lawyers, doctors and comedians is the product of our cultural heritage, but the science tells a more complex story. Jewish success is a product of Jewish genes as much as of Jewish moms.
Is it “good for the Jews” to be exploring such controversial subjects? We can’t avoid engaging the most challenging questions in the age of genetics. Because of our history of endogamy, Jews are a goldmine for geneticists studying human differences in the quest to cure disease. Because of our cultural commitment to education, Jews are among the top genetic researchers in the world.
As humankind becomes more genetically sophisticated, identity becomes both more fluid and more fixed. Jews in particular can find threads of our ancestry literally anywhere, muddying traditional categories of nationhood, ethnicity, religious belief and “race.” But such discussions, ultimately, are subsumed by the reality of the common shared ancestry of humankind. Ostrer’s “Legacy” points out that — regardless of the pros and cons of being Jewish — we are all, genetically, in it together. And, in doing so, he gets it just right.
Jon Entine is the founder and director of the Genetic Literacy Project at George Mason University, where he is senior research fellow at the Center for Health and Risk Communication. His website is www.jonentine.com.
In the summer of 1816, a young British woman by the name of Mary Godwin and her boyfriend Percy Shelley went to visit Lord Byron in Lake Geneva, Switzerland. They had planned to spend much of the summer outdoors, but the eruption of Mount Tambora in Indonesia the previous year had changed the climate of Europe. The weather was so bad that they spent most of their time indoors, discussing the latest popular writings on science and the supernatural.
After reading a book of German ghost stories, somebody suggested they each write their own. Byron’s physician, John Polidori, came up with the idea for The Vampyre, published in 1819,1 which was the first of the “vampire-as-seducer” novels. Godwin’s story came to her in a dream, during which she saw “the pale student of unhallowed arts kneeling beside the thing he had put together.”2 Soon after that fateful summer, Godwin and Shelley married, and in 1818, Mary Shelley’s horror story was published under the title, Frankenstein, Or, the Modern Prometheus.3
Frankenstein lives on in the popular imagination as a cautionary tale against technology. We use the monster as an all-purpose modifier to denote technological crimes against nature. When we fear genetically modified foods we call them “frankenfoods” and “frankenfish.” It is telling that even as we warn against such hybrids, we confuse the monster with its creator. We now mostly refer to Dr. Frankenstein’s monster as Frankenstein. And just as we have forgotten that Frankenstein was the man, not the monster, we have also forgotten Frankenstein’s real sin.
Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”
Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.4
Let Dr. Frankenstein’s sin serve as a parable for political ecology. At a time when science, technology, and demography make clear that we can never separate ourselves from the nonhuman world — that we, our technologies, and nature can no more be disentangled than we can remember the distinction between Dr. Frankenstein and his monster — this is the moment chosen by millions of well-meaning souls to flagellate themselves for their earlier aspiration to dominion, to repent for their past hubris, to look for ways of diminishing the numbers of their fellow humans, and to swear to make their footprints invisible?
The goal of political ecology must not be to stop innovating, inventing, creating, and intervening. The real goal must be to have the same type of patience and commitment to our creations as God the Creator, Himself. And the comparison is not blasphemous: we have taken the whole of Creation on our shoulders and have become coextensive with the Earth.
What, then, should be the work of political ecology? It is, I believe, to modernize modernization, to borrow an expression proposed by Ulrich Beck.5 This challenge demands more of us than simply embracing technology and innovation. It requires exchanging the modernist notion of modernity for what I have called a “compositionist” one that sees the process of human development as neither liberation from Nature nor as a fall from it, but rather as a process of becoming ever-more attached to, and intimate with, a panoply of nonhuman natures.
1.
At the time of the plough we could only scratch the surface of the soil. Three centuries back, we could only dream, like Cyrano de Bergerac, of traveling to the moon. In the past, my Gallic ancestors were afraid of nothing except that the “sky will fall on their heads.”
Today we can fold ourselves into the molecular machinery of soil bacteria through our sciences and technologies. We run robots on Mars. We photograph and dream of further galaxies. And yet we fear that the climate could destroy us.
Every day in our newspapers we read about more entanglements of all those things that were once imagined to be separable — science, morality, religion, law, technology, finance, and politics. But these things are tangled up together everywhere: in the Intergovernmental Panel on Climate Change, in the space shuttle, and in the Fukushima nuclear power plant.
If you envision a future in which there will be less and less of these entanglements thanks to Science, capital S, you are a modernist. But if you brace yourself for a future in which there will always be more of these imbroglios, mixing many more heterogeneous actors, at a greater and greater scale and at an ever-tinier level of intimacy requiring even more detailed care, then you are… what? A compositionist!
The dominant, peculiar story of modernity is of humankind’s emancipation from Nature. Modernity is the thrusting-forward arrow of time — Progress — characterized by its juvenile enthusiasm, risk taking, frontier spirit, optimism, and indifference to the past. The spirit can be summarized in a single sentence: “Tomorrow, we will be able to separate more accurately what the world is really like from the subjective illusions we used to entertain about it.”
The very forward movement of the arrow of time and the frontier spirit associated with it (the modernizing front) is due to a certain conception of knowledge: “Tomorrow, we will be able to differentiate clearly what in the past was still mixed up, namely facts and values, thanks to Science.”
Science is the shibboleth that defines the right direction of the arrow of time because it, and only it, is able to cut into two well-separated parts what had, in the past, remained hopelessly confused: a morass of ideology, emotions, and values on the one hand, and, on the other, stark and naked matters of fact.
The notion of the past as an archaic and dangerous confusion arises directly from giving Science this role. A modernist, in this great narrative, is the one who expects from Science the revelation that Nature will finally be visible through the veils of subjectivity — and subjection — that hid it from our ancestors.
And here has been the great failure of political ecology. Just when all of the human and nonhuman associations are finally coming to the center of our consciousness, when science and nature and technology and politics become so confused and mixed up as to be impossible to untangle, just as these associations are beginning to be shaped in our political arenas and are triggering our most personal and deepest emotions, this is when a new apartheid is declared: leave Nature alone and let the humans retreat — as the English did on the beaches of Dunkirk in the 1940s.
Just at the moment when this fabulous dissonance inherent in the modernist project between what modernists say (emancipation from all attachments!) and what they do (create ever-more attachments!) is becoming apparent to all, along come those alleging to speak for Nature to say the problem lies in the violations and imbroglios — the attachments!
Instead of deciding that the great narrative of modernism (Emancipation) has always resulted in another history altogether (Attachments), the spirit of the age has interpreted the dissonance in quasi-apocalyptic terms: “We were wrong all along, let’s turn our back to progress, limit ourselves, and return to our narrow human confines, leaving the nonhumans alone in as pristine a Nature as possible, mea culpa, mea maxima culpa…”
Nature, this great shortcut of due political process, is now used to forbid humans to encroach. Instead of realizing at last that the emancipation narrative is bunk, and that modernism was always about attachments, modernist greens have suddenly shifted gears and have begun to oppose the promises of modernization.
Why do we feel so frightened at the moment that our dreams of modernization finally come true? Why do we suddenly turn pale and wish to fall back on the other side of Hercules’s columns, thinking we are being punished for having transgressed the sign: “Thou shall not transgress?” Was not our slogan until now, as Nordhaus and Shellenberger note in Break Through, “We shall overcome!”?6
In the name of indisputable facts portraying a bleak future for the human race, green politics has succeeded in leaving citizens nothing but a gloomy asceticism, a terror of trespassing Nature, and a diffidence toward industry, innovation, technology, and science. No wonder that, while political ecology claims to embody the political power of the future, it is reduced everywhere to a tiny portion of electoral strap-hangers. Even in countries where political ecology is a little more powerful, it contributes only a supporting force.
Political ecology has remained marginal because it has not grasped either its own politics or its own ecology. It thinks it is speaking of Nature, System, a hierarchical totality, a world without man, an assured Science, but it is precisely these overly ordered pronouncements that marginalize it.
Set in contrast to the modernist narrative, this idea of political ecology could not possibly succeed. There is beauty and strength in the modernist story of emancipation. Its picture of the future is so attractive, especially when put against such a repellent past, that it makes one wish to run forward to break all the shackles of ancient existence.
To succeed, an ecological politics must manage to be at least as powerful as the modernizing story of emancipation without imagining that we are emancipating ourselves from Nature. What the emancipation narrative points to as proof of increasing human mastery over and freedom from Nature — agriculture, fossil energy, technology — can be redescribed as the increasing attachments between things and people at an ever-expanding scale. If the older narratives imagined humans either fell from Nature or freed themselves from it, the compositionist narrative describes our ever-increasing degree of intimacy with the new natures we are constantly creating. Only “out of Nature” may ecological politics start again and anew.
2.
The paradox of “the environment” is that it emerged in public parlance just when it was starting to disappear. During the heyday of modernism, no one seemed to care about “the environment” because there existed a huge unknown reserve on which to discharge all bad consequences of collective modernizing actions. The environment is what appeared when unwanted consequences came back to haunt the originators of such actions.
But if the originators are true modernists, they will see the return of “the environment” as incomprehensible since they believed they were finally free of it. The return of consequences, like global warming, is taken as a contradiction, or even as a monstrosity, which it is, of course, but only according to the modernist’s narrative of emancipation. In the compositionist’s narrative of attachments, unintended consequences are quite normal — indeed, the most expected things on earth!
Environmentalists, in the American sense of the word, never managed to extract themselves from the contradiction that the environment is precisely not “what lies beyond and should be left alone” — this was the contrary, the view of their worst enemies! The environment is exactly what should be even more managed, taken up, cared for, stewarded, in brief, integrated and internalized in the very fabric of the polity.
France, for its part, has never believed in the notion of a pristine Nature that has so confused the “defense of the environment” in other countries. What we call a “national park” is a rural ecosystem complete with post offices, well-tended roads, highly subsidized cows, and handsome villages.
Those who wish to protect natural ecosystems learn, to their stupefaction, that they have to work harder and harder — that is, to intervene even more, at always greater levels of detail, with ever more subtle care — to keep them “natural enough” for Nature-intoxicated tourists to remain happy.
Like France’s parks, all of Nature needs our constant care, our undivided attention, our costly instruments, our hundreds of thousands of scientists, our huge institutions, our careful funding. But though we have Nature, and we have nurture, we don’t know what it would mean for Nature itself to be nurtured.7
The word “environmentalism” thus designates this turning point in history when the unwanted consequences are suddenly considered to be such a monstrosity that the only logical step appears to be to abstain and repent: “We should not have committed so many crimes; now we should be good and limit ourselves.” Or at least this is what people felt and thought before the breakthrough, at the time when there was still an “environment.”
But what is the breakthrough itself then? If I am right, the breakthrough involves no longer seeing a contradiction between the spirit of emancipation and its catastrophic outcomes, but accepting it as the normal duty of continuing to care for unwanted consequences, even if this means going further and further down into the imbroglios. Environmentalists say: “From now on we should limit ourselves.” Postenvironmentalists exclaim: “From now on, we should stop flagellating ourselves and take up explicitly and seriously what we have been doing all along at an ever-increasing scale, namely, intervening, acting, wanting, caring.” For environmentalists, the return of unexpected consequences appears as a scandal (which it is for the modernist myth of mastery). For postenvironmentalists, unintended consequences are part and parcel of any action.
3.
One way to seize upon the breakthrough from environmentalism to postenvironmentalism is to reshape the very definition of the “precautionary principle.” This strange moral, legal, epistemological monster has appeared in European and especially French politics after many scandals due to the misplaced belief by state authority in the certainties provided by Science.8
When action is supposed to be nothing but the logical consequence of reason and facts (which the French, of all people, still believe), it is quite normal to wait for the certainty of science before administrators and politicians spring to action. The problem begins when experts fail to agree on the reasons and facts that have been taken as the necessary premises of any action. Then the machinery of decision is stuck until experts come to an agreement. It was in such a situation that the great tainted blood catastrophe of the 1980s ensued: before agreement was produced, hundreds of patients were transfused with blood contaminated by the AIDS virus.9
The precautionary principle was introduced to break this odd connection between scientific certainty and political action, stating that even in the absence of certainty, decisions could be made. But of course, as soon as it was introduced, fierce debates began on its meaning. Is it an environmentalist notion that precludes action or a postenvironmentalist notion that finally follows action through to its consequences?
Not surprisingly, the enemies of the precautionary principle — which President Chirac enshrined in the French Constitution as if the French, having indulged so much in rationalism, had to be protected against it by the highest legal pronouncements — took it as proof that no action was possible any more. As good modernists, they claimed that if you had to take so many precautions in advance, to anticipate so many risks, to include the unexpected consequences even before they arrived, and worse, to be responsible for them, then it was a plea for impotence, despondency, and despair. The only way to innovate, they claimed, is to bounce forward, blissfully ignorant of the consequences or at least unconcerned by what lies outside your range of action. Their opponents largely agreed. Modernist environmentalists argued that the principle of precaution dictated no action, no new technology, no intervention unless it could be proven with certainty that no harm would result. Modernists we were, modernists we shall be!
But for its postenvironmental supporters (of which I am one) the principle of precaution, properly understood, is exactly the change of zeitgeist needed: not a principle of abstention — as many have come to see it — but a change in the way any action is considered, a deep tidal change in the linkage modernism established between science and politics. From now on, thanks to this principle, unexpected consequences are attached to their initiators and have to be followed through all the way.
4.
The link between technology and theology hinges on the notion of mastery. Descartes exclaimed that we should be “maîtres et possesseurs de la nature.”10
But what does it mean to be a master? In the modernist narrative, mastery was supposed to require such total dominance by the master that he was emancipated entirely from any care and worry. This is the myth about mastery that was used to describe the technical, scientific, and economic dominion of Man over Nature.
But if you think about it according to the compositionist narrative, this myth is quite odd: where have we ever seen a master freed from any dependence on his dependents? The Christian God, at least, is not a master who is freed from dependents, but who, on the contrary, gets folded into, involved with, implicated with, and incarnated into His Creation. God is so attached and dependent upon His Creation that he is continually forced (convinced? willing?) to save it. Once again, the sin is not to wish to have dominion over Nature, but to believe that this dominion means emancipation and not attachment.
If God has not abandoned His Creation and has sent His Son to redeem it, why do you, a human, a creature, believe that you can invent, innovate, and proliferate — and then flee away in horror from what you have committed? Oh, you the hypocrite who confesses of one sin to hide a much graver, mortal one! Has God fled in horror after what humans made of His Creation? Then have at least the same forbearance that He has.
The dream of emancipation has not turned into a nightmare. It was simply too limited: it excluded nonhumans. It did not care about unexpected consequences; it was unable to follow through with its responsibilities; it entertained a wholly unrealistic notion of what science and technology had to offer; it relied on a rather impious definition of God, and a totally absurd notion of what creation, innovation, and mastery could provide.
Which God and which Creation should we be for, knowing that, contrary to Dr. Frankenstein, we cannot suddenly stop being involved and “go home?” Incarnated we are, incarnated we will be. In spite of a centuries-old misdirected metaphor, we should, without any blasphemy, reverse the Scripture and exclaim: “What good is it for a man to gain his soul yet forfeit the whole world?”
1. Polidori, John, et al. 1819. The Vampyre: A Tale. Printed for Sherwood, Neely, and Jones.
2. Shelley, Mary W., 1823. Frankenstein: Or, The Modern Prometheus. Printed for G. and W.B. Whittaker.
3. Ibid.
4. This is also the theme of: Latour, Bruno. 1996. Aramis or the Love of Technology. Translated by Catherine Porter. Cambridge, Mass: Harvard University Press.
5. Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.
6. Nordhaus, Ted, and Michael Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston: Houghton Mifflin Harcourt.
7. Descola, Philippe. 2005. Par dela nature et culture. Paris: Gallimard.
8. Sadeleer, Nicolas de, 2006. Implementing the Precautionary Principle: Approaches from Nordic Countries and the EU. Earthscan Publ. Ltd.
9. Hermitte, Marie-Angele. 1996. Le Sang Et Le Droit. Essai Sur La Transfusion Sanguine. Paris: Le Seuil.
10. Descartes, Rene. 1637. Discourse on Method in Discourse on Method and Related Writings. Translated by Desmond M. Clark. 1999. Part 6, 44. New York: Penguin.
Contradicting theories in sociolinguistics, a study suggests that adults who move in a variety of social niches keep pace with the evolution of their language
08/05/2012
By Karina Toledo
Agência FAPESP – Can an adult change his or her grammar over the course of a lifetime? To answer that question, Professor Maria Célia Lima-Hernandes, of the Faculty of Philosophy, Languages and Human Sciences of the University of São Paulo (USP), began the research that resulted in the book Indivíduo, Sociedade e Língua – Cara, tipo assim, fala sério!.
Just released by Edusp with support from FAPESP, the book is a revised version of the doctoral thesis Lima-Hernandes defended in 2005 at the Institute of Language Studies of the State University of Campinas (Unicamp).
The author investigates whether the grammar of a single group of people could change over a span of 20 years. Four words used in comparisons – “como”, “igual”, “feito” and “tipo” (all roughly equivalent to the English “like”) – were chosen to test the hypothesis that more extensive social contacts would trigger changes in the grammar of the language spoken by adults, regardless of age, sex or level of schooling.
“Until then, the prevailing theory in sociolinguistics was that changes in grammar were the result of adolescent rebellion. Young people, finding their parents old-fashioned, would look for innovative uses for words. This has recently been questioned by William Labov, professor at the University of Pennsylvania and a pioneer of quantitative sociolinguistics,” said Lima-Hernandes.
For the theoretical current led by the linguist and philosopher Noam Chomsky, by contrast, it is the child who is the transforming force of language. “The child would interpret constructions in a different way, producing a new grammar,” Lima-Hernandes explained.
But in research she carried out even before beginning her doctorate, the author found evidence of linguistic change during adulthood in several of the world’s languages.
Confirmation came when she compared interviews with a group of 36 residents of Rio de Janeiro’s suburbs who, 20 years earlier, had been studied by the group of her adviser, Maria Luiza Braga, professor at the Federal University of Rio de Janeiro (UFRJ).
Lima-Hernandes initially observed that the incorporation into speech of innovative uses of the word “tipo” could be related to the kind of social life the speakers led.
“Some people had simply stopped using the word ‘tipo’ or used it only in its standard categories and functions. These were the people who kept a restricted social circle. Those who had contact with people of different ages and took part in a variety of social niches used every kind of ‘tipo’ – in other words, they kept pace with the evolution of the language even in adulthood,” she said.
Through the analysis of historical documents dating from the 13th to the 20th century, Lima-Hernandes also reconstructed the evolutionary trajectory of the words “como”, “igual”, “feito” and “tipo”, showing the different uses that emerged over the years.
“It is possible to see that change in the use of words does not go in just any direction; it is not open to random creativity, as is often thought, but respects cognitive principles. The new use has to be connected, in some way, to the word’s resilient etymological core, even if speakers are not in the least aware of it,” she said.
Indivíduo, Sociedade e Língua – Cara, tipo assim, fala sério!
Author: Maria Célia Lima-Hernandes
Published: December 2011
Price: R$ 45
Pages: 232
Methods are systematic, socially agreed upon ways to represent the world. Mixed methods integrate qualitative and quantitative evidence through intentional efforts to focus “on research questions that call for real-life contextual understandings, multi-level perspectives, and cultural influences” (Creswell et al., 2011, Best Practices for Mixed Methods Research in the Health Sciences, p. 4).
Good anthropology will always benefit from the widest variety of data. High quality examples of combining qualitative and quantitative methods abound in anthropology today and have done so throughout our history. Although ethnography and qualitative methods remain central, it has always been true that other methods are commonly used as well in every field of anthropology.
SOME EXAMPLES
Elinor Ochs and colleagues at UCLA assembled what is arguably the richest family database in the world today (combining video, sociolinguistic, ethnographic, questionnaire, daily diary, material possession, stress hormone and other evidence) in their study of the everyday lives of two-parent, middle class working Los Angeles families and their children (www.celf.ucla.edu). Robert LeVine and collaborators combined sociolinguistic, ethnographic, systematic observational, demographic, historical and child assessment methods in their study of the connections between women’s gains in literacy, lower completed family size, improved health and changes in maternal care in communities around the world (Literacy and Mothering: How Women’s Schooling Changes the Lives of the World’s Children, 2012). The New Hope community based work and family support study (Duncan, Huston and Weisner, Higher Ground: New Hope for the Working Poor and their Children, 2007) used a random-assignment social experiment, survey, questionnaire, child assessment and qualitative ethnographic fieldwork to discover why the program was successful in improving the well-being of parents and children, and yet why sometimes only selectively so.
Andrew Fuligni, Nancy Gonzalez and I currently collaborate on a study of the daily activities, family responsibilities and obligations, and academic and behavioral outcomes of 428 Mexican American immigrant teens and parents in Los Angeles (first, second and later generations, documented and not). Methods include 14-day consecutive daily diaries, survey and questionnaire data, and school and behavior assessments. In addition, a 10% nested random sample of parents and teens from this larger sample participates in a qualitative study conducted in their homes. We gave cameras to adolescents in ninth and tenth grades with instructions to take 25 pictures of people, places, events and activities important to them. We plugged the cameras into our laptops and talked with the teens about their photos. We asked questions such as: Who are these friends; oh, you have a boyfriend? Tell me more about your soccer team. That’s your Mom cooking; what do you do for chores? That’s one of your teachers? What class is it; how is school going? Teens take photos of other family members’ photos, such as grandparents they cannot visit in Mexico; one took a photo of the moon, mentioning the film Under the Same Moon (La Misma Luna).
The narratives then can be recorded, transcribed and uploaded to a mixed methods software program such as Dedoose (www.Dedoose.com), a web-based mixed method software tool. Indexing and coding are a matter of dragging and dropping codes on the relevant portions of the text. Quantitative data from the larger study also are uploaded and linked to adolescent and parent narratives and photos. Narratives can be coded; patterns in quantitative data can be enriched qualitatively. The same fieldworkers who went to the homes and did interviews, also often worked on analyses of quantitative data.
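Dedoose itself is a point-and-click web application, so the following is only a rough sketch, in Python, of the idea behind that linkage: coded excerpts from transcripts joined to quantitative variables from the larger survey. The variable names here (family_id, code, daily_obligation_hours, gpa) are invented for illustration and are not the project’s actual measures.

import pandas as pd

# Hypothetical coded excerpts: each row is a transcript passage tagged with a qualitative code.
excerpts = pd.DataFrame([
    {"family_id": 101, "code": "family_obligation", "text": "I help my mom translate at the clinic."},
    {"family_id": 101, "code": "school",            "text": "My teacher says I should apply to college."},
    {"family_id": 102, "code": "family_obligation", "text": "I watch my little brother after school."},
])

# Hypothetical quantitative data from the larger survey (e.g., diary and assessment scores).
survey = pd.DataFrame([
    {"family_id": 101, "daily_obligation_hours": 2.5, "gpa": 3.4},
    {"family_id": 102, "daily_obligation_hours": 1.0, "gpa": 2.9},
])

# Link narratives to survey variables so qualitative codes can be read against quantitative patterns.
linked = excerpts.merge(survey, on="family_id")

# Example query: passages coded "family_obligation" from teens reporting 2+ hours of daily obligations.
print(linked[(linked["code"] == "family_obligation") & (linked["daily_obligation_hours"] >= 2)])

Once the two tables share an identifier, a pattern in the survey data (say, heavy daily obligations) can be read directly against the narratives coded for the corresponding families, which is the kind of back-and-forth described above.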
STRENGTH OF INTEGRATED METHODS
Methods and research designs are languages understood across the social sciences. To the extent that we can speak those languages in our work, we are more likely to draw those in other disciplines into conversation with us. A study that creatively integrates quantitative and qualitative methods sends a positive message to those fluent in only qualitative or quantitative methods that we take their methods (and so their identities and ideas) seriously. The increased believability of our and others’ work which often results is itself a criterion for successful mixed methods research. The use of integrated methods is growing across the social sciences; psychology (eg, Yoshikawa, et al, Developmental Psychology 44[344–54]), sociology (eg, Mario Small in Annual Review of Sociology 37[57–86]), psychiatry (Palinkas, et al, Psychiatric Services 62[3]), public health (Plano-Clark, Qualitative Inquiry 16[6]), political science, education, economics and other fields are benefitting and sometimes looking to anthropology for collaboration. Policy and practice research benefits hugely from integrating qualitative and quantitative methods. Funders increasingly see integrated methods as a strength in grant proposals.
The stark binary contrast of the “two Q’s”—qualitative vs quantitative—is not very useful; it restricts our thinking and limits our conversations. The two Q’s oversimplifies the debates and obscures important shared goals common to all methods. A better narrative and discourse about methods should use a richer conceptual framework. The actual contrast with quantitative levels of measurement (ordinal, interval, ratio scales) should be nominal or categorical levels (words, categories, narratives, themes, patterns); both are useful. The contrast with naturalistic research should not be experimental but research that is contrived or controlled in some systematic way to aid understanding. A useful framework for anthropology should distinguish person and experience-centered, or context-centered and variable-centered methods, not a qualitative/quantitative binary. Such a methods conversation could then focus on the most important Q—our common questions.
Many of us use ethnographic settings, events or activities as our units of analysis to be sure we do not bracket out context that provides essential meaning. However, inquiry across levels of analysis beyond settings and beyond projects often requires mixed methods. We often deal with suspicions about the “bias” of ethnographic and qualitative methods. Mixed methods do not necessarily lead to common findings; there is method variance just as there is expectable heterogeneity, conflict and inconsistency in cultural beliefs and practices themselves. A more useful question is whether our methods have been systematically context-examined or remain context-unexamined—since all methods (whether qualitative or quantitative) entail a context or a set of presumptions and methods effects of some kinds.
Quantitative methods and statistical analyses have guidelines and procedures (not uncontested of course) for deciding if they are done well—if they met accepted standards and should be published and disseminated for example. These include judgments of reliability, validity, sample size and representativeness or generalizability, power, and so forth. Qualitative and ethnographic work can and should have recognized criteria as well, such as breadth, depth, holism, veridicality, specificity of context, meaning centered, narrative and behavioral coherence, shared cognitions, interpretive richness, and others. These are of course more variable, and not so easy to define, yet they are valuable and defensible if carefully described. These should be in addition to explicit descriptions of sampling, setting, and so forth. Reasonable, flexible mixed methods criteria are being developed in these respects (Weisner and Fiese in Journal of Family Psychology 25[6]). Recent NIH guidelines have been developed for the use of mixed methods in health research and in applications for funding (Creswell et al., 2011).
METHODS PLURALISM IN ANTHROPOLOGY
I would guess—or at least hope—that most anthropologists are fairly tolerant pluralists regarding methods. Most of us appreciate the vast range of qualitative and ethnographic methods and their integration, as in Russ Bernard’s Research Methods in Anthropology: Qualitative and Quantitative Approaches (2011). I suspect many if not most of us generally agree with this view or use mixed methods in our own research and teaching, and regularly cite such work even if we don’t do this ourselves. If we don’t do quantitative research, we may have partnered with others who do and are interested in similar questions, or we may have taught courses using books and papers with quantitative evidence. And yet it is fair to say that those who critique quantitative methods, or dismiss systematic methods altogether, including mixed methods, sometimes seek, without justification in my view, to claim the dominant position. To the contrary: the future of our field and the social sciences is far more likely to be characterized by interdisciplinary methodological pluralism, often including integrated mixed methods. Anthropology should be at the forefront of such research and practice, not critiquing from the margins or simply ignoring important methodological and research design innovations.
Donald Campbell long ago described this more modest, pluralist, pragmatic, skeptical, empirically based approach to methods: he argued that all methods are valuable and important, but that all methods are also weak in the sense that they are incomplete representations of the incredibly complex world that we hope to understand. Hence we should use the widest range of methods, so that the weaknesses of one method can be complemented by the strengths of another, and so that phenomena in the world that are holistic qualities best or only to be represented by narrative, text, photos or sound are represented that way, and phenomena best or only to be represented with numbers, variables and models are represented quantitatively. As a result, we will get closer to understanding the world, and then persuading others of the truth of what we discover and believe.
Thomas S Weisner (www.tweisner.com) is anthropology professor in the departments of psychiatry and anthropology at UCLA, and director of Center for Culture & Health. His research and teaching interests are in culture and human development; medical, psychological and cultural studies of families and children at risk; mixed methods; and evidence-informed policy.
For Zygmunt Bauman the world is marked by a division between power and politics. While politics is defined by nations, power no longer recognises national boundaries
Brasília – “We are afraid of Brazil.” It was with this unexpected outburst that the Mozambican novelist Paulina Chiziane caught the attention of the audience at the seminar A Literatura Africana Contemporânea (Contemporary African Literature), part of the program of the 1st Bienal do Livro e da Leitura (Biennial of Books and Reading) in Brasília (DF). She was referring to the effects of the presence in Mozambique of Brazilian churches and temples and of cultural products such as telenovelas, which, in her view, transmit a false image of the country.
“For us Mozambicans, the image of Brazil is that of a white or, at most, mixed-race country. The only successful black Brazilian we recognize as such is Pelé. In the telenovelas, which are what define the image we have of Brazil, we see black people only as porters or domestic servants. At the top [of the social representation] are the whites. This is the image Brazil is selling to the world,” the author said, stressing that these representations help perpetuate the racial and social inequalities that exist in her country.
“From seeing, over and over in the telenovelas, the white man giving orders and the black man sweeping and carrying, Mozambicans come to see that situation as apparently normal,” Paulina argued, pointing to the same social arrangement in her own country.
The presence of Brazilian churches on Mozambican soil also has negative effects on the country’s culture, in the writer’s assessment. “When one or several churches arrive and tell us that our way of believing is not correct, that the best belief is the one they bring, that means destroying a cultural identity. There is no respect for local beliefs. In African culture, a healer is not only the traditional doctor but also the keeper of part of popular history and culture,” Paulina said, criticizing the governments of both countries for allowing these institutions to intervene.
The first woman to publish a book in Mozambique, Paulina seeks to avoid stereotypes in her work, above all those that confine women to the role of dependents, incapable of thinking for themselves, conditioned only to serve.
“I am very fond of my country’s poets, but I have never found, in the literature that men write, the portrait of a whole woman. It is always the mouth, the legs, a single aspect. Never the infinite wisdom that comes from women,” said Paulina, recalling that, until European colonization, it fell to women to carry out the narrative role and to pass on knowledge.
“Before colonialism, art and literature were feminine. It was the women’s task to tell the stories and, in doing so, to socialize the children. With the colonial system and the imposition of imperial schooling, it was the men who learned to write and to tell the stories. For that very reason, even today there are few women writers in Mozambique,” Paulina said.
“Even after independence [in 1975], we went on writing on the basis of the European education we had received, carrying with us the stereotypes and prejudices that had been passed on to us. African wisdom properly speaking, the wisdom known to the women, remains excluded. And that is not to mention that more than half of Mozambique’s population does not speak Portuguese, and few authors write in the other Mozambican languages,” Paulina said.
During the biennial, the Mozambican writer’s book Niketche, uma história de poligamia was reissued.
It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.
What am I talking about? The third rate status of numerical weather prediction in the U.S. It is a huge story, an important story, but one the media has not touched, probably from lack of familiarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.
Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.
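To make “numerical solution of the equations” concrete, here is a toy sketch in Python, not any operational model: a single field (temperature) carried downstream by a constant wind, stepped forward with a simple finite-difference scheme. Real NWP models do the same kind of time-stepping, but for the full three-dimensional equations of motion, thermodynamics, and moisture on a global grid.

import numpy as np

# Toy 1-D advection: dT/dt = -u * dT/dx, integrated with an upwind finite-difference scheme.
nx, dx = 100, 10_000.0      # 100 grid points spaced 10 km apart
u, dt = 10.0, 600.0         # 10 m/s wind, 10-minute time step (u*dt/dx = 0.6, so the scheme is stable)
x = np.arange(nx) * dx
T = 280.0 + 5.0 * np.exp(-((x - 300_000.0) ** 2) / (2 * 50_000.0 ** 2))  # warm anomaly centered at 300 km

for _ in range(36):         # step the field forward 6 hours
    T[1:] -= u * dt / dx * (T[1:] - T[:-1])   # upwind difference (the flow moves left to right)

print("warmest point now near", round(x[np.argmax(T)] / 1000), "km")  # anomaly has drifted downstream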
Supercomputers are used for numerical weather prediction.
U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Center for Medium Range Weather Forecasting (ECMWF). And we have also fallen behind in ensembles (using many models to give probabilistic prediction) and high-resolution operational forecasting. We used to be the world leader decades ago in numerical weather prediction: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong and I will talk about some of the issues here. And our nation needs to fix it.
But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.
In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).
The Europeans banded together decades ago to form the European Center for Medium-Range Weather Forecasting (ECMWF), which runs a very good global model. Several European countries run regional models as well.
The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).
There are other major global NWP centers such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, one in Beijing, among others. All of these centers collect worldwide data and do global NWP.
The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or number 4 in quality, resulting in our forecasts being noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. This figure shows the quality of the 500 hPa forecast (about halfway up in the troposphere, approximately 18,000 ft) for the day 5 forecast. The top graph is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (U.S. GFS: black; ECMWF: red; Canadian CMC: blue; UKMET: green; Navy FNG: orange). The bottom graph shows the difference in skill between the U.S. model and the other centers’ models.
You first notice that forecasts are all getting better. That’s good. But you will notice that the most skillful forecast (closest to one) is clearly the red one…the European Center. The second best is the UKMET office. The U.S. (GFS model) is third…roughly tied with the Canadians.
Here is a global model comparison done by the Canadian Meteorological Center, for various global models from 2009-2012 for the 120 h forecast. This is a plot of error (RMSE, root mean square error) again for 500 hPa, and only for North America. Guess who is best again (lowest error)?–the European Center (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
Let’s look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie and myself comparing various models at 24, 48, and 72 hr for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it, ECMWF.
I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at the short range, but has about one day of additional predictability: their 8-day forecast is about as skillful as our 7-day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.
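For readers who want to know what those verification scores actually measure, here is a minimal sketch, not the NWS or CMC verification code, of the two metrics behind such plots: an (uncentered) anomaly correlation of the kind shown in the first figure, and the root mean square error shown in the second, computed from a forecast field, the verifying analysis, and a climatology. The toy fields below are random numbers, used only to exercise the functions.

import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    # Uncentered anomaly correlation: 1.0 is a perfect forecast; about 0.6 is often
    # taken as the limit of useful skill for 500 hPa height forecasts.
    f_anom = forecast - climatology
    a_anom = analysis - climatology
    return np.sum(f_anom * a_anom) / np.sqrt(np.sum(f_anom ** 2) * np.sum(a_anom ** 2))

def rmse(forecast, analysis):
    # Root mean square error: lower is better.
    return np.sqrt(np.mean((forecast - analysis) ** 2))

rng = np.random.default_rng(0)
climatology = 5500.0 + rng.normal(0.0, 50.0, size=(10, 10))    # made-up 500 hPa heights (meters)
analysis = climatology + rng.normal(0.0, 80.0, size=(10, 10))  # "truth" departs from climatology
forecast = analysis + rng.normal(0.0, 40.0, size=(10, 10))     # a forecast with some error

print("anomaly correlation:", round(float(anomaly_correlation(forecast, analysis, climatology)), 3))
print("RMSE (m):", round(float(rmse(forecast, analysis)), 1))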
Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussion, which is available online, you will frequently read how they often depend not on the U.S. model, but on the ECMWF. And during the January western WA snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate related firm that was moving up to Seattle. I asked him what model they were using: the U.S. GFS? He laughed: of course not, they were using the ECMWF.
A lot of U.S. firms are using the ECMWF, and this is very costly, because the Europeans charge a lot for access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and because the NWS uses its own global predictions to drive the higher-resolution regional models–which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the costs for the nation have to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.
The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:
1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because they have more resources available for NWP. This is simply ridiculous–the U.S. can afford the processors and disk space it would take. We are talking about millions or tens of millions of dollars at most to have the hardware we need. Part of the problem has been NWS procurement, which has not been forward-leaning and has relied on heavy-metal IBM machines at very high cost.
2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate the observations to create a good description of the atmosphere. The European Center, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power. We have used an older, inferior approach (3DVAR). The Europeans have been using 4DVAR for 20 years! Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet. (For the curious, a schematic sketch of the variational cost function behind these acronyms appears after point 5 below.)
3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP’s Environmental Modeling Center (EMC) is well known for its isolation and “not invented here” attitude. While the European Center hosts lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited, and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest). The National Weather Service has invested very little in extramural research, and when its budget is under pressure, university research is the first thing it reduces. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for its needs as well. (Good news… a new building should be available soon).
4. The NWS approach to weather-related research has been ineffective and divided. The government’s weather research is NOT in the NWS, but rather elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the people doing research in support of their mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has only marginal benefit for the NWS.
5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, offering lots of excuses but not making the fundamental changes in organization and priorities that could deal with the problem. Lack of resources for NWP is another issue, but that is a decision made by NOAA/NWS/Department of Commerce leadership.
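As promised above, here is roughly what the 3DVAR/4DVAR distinction amounts to, written as the standard textbook cost functions. This is the generic variational formulation found in data assimilation texts, not the specific NCEP or ECMWF implementation:

```latex
% 3D-Var: fit the analysis x to a background (prior forecast) x_b and to
% observations y valid at (approximately) a single analysis time.
J_{3D}(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
  + \tfrac{1}{2}\,[\mathbf{y}-H(\mathbf{x})]^{\mathsf{T}}\mathbf{R}^{-1}[\mathbf{y}-H(\mathbf{x})]

% 4D-Var: fit the initial state x_0 to observations y_i spread over a time
% window, using the forecast model M to carry x_0 forward to each observation time.
J_{4D}(\mathbf{x}_0) = \tfrac{1}{2}(\mathbf{x}_0-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
  + \tfrac{1}{2}\sum_{i=0}^{N}[\mathbf{y}_i-H_i(M_{0\to i}(\mathbf{x}_0))]^{\mathsf{T}}\mathbf{R}_i^{-1}[\mathbf{y}_i-H_i(M_{0\to i}(\mathbf{x}_0))]
```

Here B and R are the background and observation error covariances, H is the observation operator, and M is the forecast model. Minimizing the 4DVAR cost function requires running the model (and its adjoint) many times within each assimilation window, which is why the approach demands so much more computer power than 3DVAR.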
This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.
I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.
Let me note that the above is about the modeling side of the NWS, NOT the many people in the local forecast offices. That part of the NWS is first-rate; they suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP who have lacked the resources and the organization necessary to push forward effectively.
This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, having gained world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the drive to be the best. We were satisfied to coast. And this attitude must end, in weather prediction, education, and everything else, or we will see our nation sink into mediocrity.
The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.
* * *
Saturday, April 7, 2012
Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)
In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers, such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the UKMET Office, and the Canadian Meteorological Centre (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and to ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research, and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we do not. The cost of inferior weather prediction is huge: lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.
In this blog, I will describe in some detail one major roadblock to giving the U.S. state-of-the-art weather prediction: inadequate computer resources. This situation should clearly have been addressed years ago by the leadership of the National Weather Service, NOAA, and the Department of Commerce, but it has not been, and I am convinced it will not be without outside pressure. It is time for the user community and our congressional representatives to intervene. To quote Samuel L. Jackson, enough is enough. (…)
In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction’s (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high-resolution and ensemble forecasts, plus seasonal climate forecasts. The UKMET Office also does regional NWP (England is not a big country!) and regional air quality. NCEP does all of this plus much, much more (high-resolution rapid-update modeling, hurricane modeling, etc.). And NCEP has to deal with prediction over a continental-size country.
If you expect that the U.S. has a lot more computer power to balance all these responsibilities and tasks, you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4,992 IBM Power6 processors. One computer does the operational work; the other is for backup (research and testing runs are done on the backup). That is about 70 teraflops (trillion floating-point operations per second) for each machine.
NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8,192 much faster processors that achieves 182 teraflops (yes, over twice as fast, and with far fewer tasks to do).
The UKMET Office, serving a far, far smaller country, has two newer IBM machines, each with 7,680 processors delivering 175 teraflops per machine.
Here is a figure, produced at NCEP, that compares the relative computer power of NCEP’s machine with the European Centre’s. The shading indicates computational activity, and the x-axis for each represents a 24-hour period. The relative heights allow you to compare computer resources. Not only does the ECMWF have much more computer power, but it uses that power more efficiently, packing useful computations into every available minute.
Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system. You may not believe this, but the specifications called ONLY for a system at least equal to the one they have. A report in a computer magazine suggests that this new system (IBM got the contract) might be slightly less powerful (around 150 teraflops) than one of the UKMET Office systems, but that is not known at this point.
The Canadians? They have TWO machines like the European Centre’s!
So what kind of system does NCEP require to serve the nation in a reasonable way?
To start, we need to double the resolution of our global model to bring it into line with ECMWF (they now run at 15 km globally). Such resolution allows the global model to resolve regional features (such as our mountains). Doubling horizontal resolution requires 8 times more computer power (twice as many grid points in each horizontal direction, plus roughly twice as many time steps). We need to use better physics (descriptions of things like cloud processes and radiation): double again. And we need better data assimilation (better use of observations to provide an improved starting point for the model): double once more. So we need 32 times more computer power for the high-resolution global runs to catch up with ECMWF. Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in; that is what ECMWF does). There are some potential ways NCEP can work more efficiently as well. Right now NCEP runs our global model out to 384 hours four times a day (every six hours). To many of us this seems excessive; perhaps the longest periods (180 hours and beyond) could be done twice a day. So let’s begin with a computer 32 times faster than the current one.
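To make the arithmetic above explicit, here is a minimal sketch of the scaling. The factors of 8, 2, and 2 are the rough estimates from the text, and the roughly 70-teraflop baseline is the current NCEP machine, so the result is an order-of-magnitude figure, not a procurement specification:

```python
# Back-of-the-envelope scaling for catching up with ECMWF's deterministic runs.
# All of the factors below are the rough estimates given in the text, not benchmarks.

current_tflops = 70            # approximate capacity of one NCEP IBM Power6 machine

resolution_factor = 2 ** 3     # halving the grid spacing: 2x points in each
                               # horizontal direction plus roughly 2x more time steps
physics_factor = 2             # more sophisticated cloud and radiation physics
assimilation_factor = 2        # improved data assimilation

deterministic_factor = resolution_factor * physics_factor * assimilation_factor
print(f"High-resolution global runs need ~{deterministic_factor}x more compute")

# The ensemble system needs a comparable boost; some of it can be squeezed into
# gaps in the deterministic schedule, as ECMWF does.
implied_tflops = current_tflops * deterministic_factor
print(f"Implied machine: roughly {implied_tflops:,} teraflops versus {current_tflops} today")
```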
Many workshops and meteorological meetings (such as one on improvements in model physics held at NCEP last summer, which I chaired) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution. The current national ensemble system has a horizontal resolution of about 32 km, and the NWS plans to get to about 20 km in a few years; both are inadequate. Here is an example of the ensemble output (the mean of the ensemble members) for the NWS and UW (4-km) ensemble systems: the difference is huge. The NWS system does not even get close to modeling the impacts of the mountains, and it is similarly unable to simulate large convective systems.
Current NWS( NCEP) “high resolution” ensembles (32 km)
4 km ensemble mean from UW system
Let me make one thing clear: probabilistic prediction based on ensemble forecasts and reforecasting (running models over past years to build statistics of performance) is the future of weather prediction. The days of giving a single number for, say, temperature at day 5 are over. We need to let people know about uncertainty and probabilities. The NWS needs a massive increase in computer power to do this; it lacks that computer power now and does not seem destined to get it soon.
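As a toy illustration of what replacing a single number with probabilities looks like, an ensemble can be turned directly into an exceedance probability. The member values below are invented for this example; they are not NWS or UW output:

```python
import statistics

# Hypothetical day-5 temperature forecasts (deg F) from a 10-member ensemble.
members = [41, 44, 39, 46, 43, 38, 45, 42, 40, 47]

ensemble_mean = statistics.mean(members)

threshold = 45  # a temperature that matters for some user's decision
# Simple, uncalibrated probability: the fraction of members at or above the threshold.
prob_at_or_above = sum(m >= threshold for m in members) / len(members)

print(f"Ensemble mean: {ensemble_mean:.1f} F")
print(f"Chance of {threshold} F or warmer: {prob_at_or_above:.0%}")
```

In practice the raw member frequencies would be calibrated against reforecasts, but the basic product, a probability rather than one number, is the same.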
A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing. He and colleagues have put together a compelling case for more NWS computer resources for NWP. Read it here.
Back-of-the-envelope calculations indicate that a good first step, 4-km national ensembles, would require about 20,000 processors to run in a timely manner, but it would revolutionize weather prediction in the U.S., including forecasting of convection and of weather in mountainous areas. This high-resolution ensemble effort would meld with data assimilation over the long term.
And then there is running super-high-resolution numerical weather prediction to get fine-scale details right. Here in the NW my group runs a 1.3-km horizontal resolution forecast twice a day out to 48 hours. Such capability is needed for the entire country. It does not exist now because of inadequate computer resources.
The bottom line is that the NWS numerical modeling effort needs a huge increase in computer power to serve the needs of the country, and the potential impacts would be transformative. We could go from a third-place effort that is slipping back into the pack to being the world leader. Furthermore, the added computer power would finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems. The fact that this is barely done today is really amazing, and a potential waste of hundreds of millions of dollars on unnecessary satellite systems.
But to do so will require a major jump in computational power, a jump our nation can easily afford. I would suggest that the NWS’s EMC begin by securing at least a 100,000-processor machine, and down the road something considerably larger. Keep in mind that my department has about 1,000 processors in our computational clusters, so this is not as large a jump as you might think.
For a country that suffers several billion-dollar weather disasters a year, investment in reasonable computer resources for NWP is obvious.
The cost? Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote: using fast AMD chips, you could have such a 100K-core machine for 11 million dollars (and that is without any discount!). OK, this is the U.S. government, and they like expensive, heavy-metal machines, so let’s go for 25 million dollars. The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors, and the cost will be around 25-35 million dollars. NCEP will want two machines, so let’s budget 60 million dollars. We spend this much money on a single jet fighter, but we can’t invest this amount to greatly improve forecasts and public safety in the U.S.? We have machines far larger than this for breaking codes, simulating thermonuclear explosions, and simulating climate change.
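For what it is worth, here is the per-core arithmetic behind those figures. All of the prices above are ballpark quotes, so treat the output as order-of-magnitude only:

```python
# Per-core arithmetic behind the rough price figures above (all are ballpark quotes).
commodity_per_core = 11_000_000 / 100_000     # Silicon Mechanics quote: ~$110 per core
ncar_per_core_low = 25_000_000 / 75_000       # NCAR machine, low estimate: ~$333 per core
ncar_per_core_high = 35_000_000 / 75_000      # NCAR machine, high estimate: ~$467 per core

# Padding for government procurement and buying two machines (operational + backup)
# is what gets you to the ~$60 million figure used in the text.
two_machine_budget = 60_000_000

print(f"Commodity cluster: ~${commodity_per_core:.0f} per core")
print(f"NCAR-class system: ~${ncar_per_core_low:.0f}-{ncar_per_core_high:.0f} per core")
print(f"Budget for two NCEP-class machines: ~${two_machine_budget / 1e6:.0f} million")
```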
Yes, a lot of money, but I suspect the cost of the machine would be paid back within a few months through improved forecasts. Last year we had quite a few (over ten) billion-dollar storms; imagine the benefits of forecasting even a few of them better. Or the benefits to the wind energy and utility industries, or to U.S. aviation, of even modestly improved forecasts. And there is no doubt such computer resources would improve weather prediction. The list of benefits is nearly endless. Recent estimates suggest that routine weather events cost the U.S. economy nearly half a trillion dollars a year. Add to that hurricanes, tornadoes, floods, and other extreme weather. The business case is there.
As someone with an insider’s view of the process, it is clear to me that the current players are not going to move effectively without some external pressure. In fact, the budgetary pressure on the NWS is very intense right now, and they are cutting away muscle and bone at this point (such as reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research). I believe it is time for weather-sensitive industries and local governments, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon. We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction, which can clearly save lives and help the economy. Enough is enough.
Jean-Luc Godard, director: “The so-called ‘digital’ is not a mere technical medium, but a medium of thought. And when modern democracies turn technical thought into a separate domain, those modern democracies incline towards totalitarianism.”
“Best practices” is the worst practice. The idea that we should examine successful organizations and then imitate what they do if we also want to be successful is something that first took hold in the business world but has now unfortunately spread to the field of education. If imitation were the path to excellence, art museums would be filled with paint-by-number works.
The fundamental flaw of a “best practices” approach, as any student in a half-decent research-design course would know, is that it suffers from what is called “selection on the dependent variable.” If you only look at successful organizations, then you have no variation in the dependent variable: they all have good outcomes. When you look at the things that successful organizations are doing, you have no idea whether each one of those things caused the good outcomes, had no effect on success, or was actually an impediment that held organizations back from being even more successful. An appropriate research design would have variation in the dependent variable; some have good outcomes and some have bad ones. To identify factors that contribute to good outcomes, you would, at a minimum, want to see those factors more likely to be present where there was success and less so where there was not.
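A toy simulation makes the flaw concrete. In the sketch below, all numbers are invented for illustration: a fashionable “practice” has no effect whatsoever on success, yet a study that looks only at successful organizations would find it present in a solid majority of them.

```python
import random

random.seed(0)

# Toy model: 10,000 organizations. 60% happen to use a fashionable "practice".
# Success is pure luck (20% of organizations) and is independent of the practice.
orgs = [
    {"uses_practice": random.random() < 0.60, "successful": random.random() < 0.20}
    for _ in range(10_000)
]

def prevalence(group):
    """Share of a group that uses the practice."""
    return sum(o["uses_practice"] for o in group) / len(group)

winners = [o for o in orgs if o["successful"]]
losers = [o for o in orgs if not o["successful"]]

# Looking only at winners, the practice looks like a "best practice"...
print(f"Prevalence among successful orgs:   {prevalence(winners):.0%}")
# ...but it is just as common among the failures, so it explains nothing.
print(f"Prevalence among unsuccessful orgs: {prevalence(losers):.0%}")
```

The practice shows up in roughly 60 percent of the winners and roughly 60 percent of the losers, which is exactly the comparison the best-practices method never makes.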
“Best practices” lacks scientific credibility, but it has been a proven path to fame and fortune for pop-management gurus like Tom Peters, with In Search of Excellence, and Jim Collins, with Good to Great. The fact that many of the “best” companies they featured subsequently went belly-up—like Atari and Wang Computers, lauded by Peters, and Circuit City and Fannie Mae, by Collins—has done nothing to impede their high-fee lecture tours. Sometimes people just want to hear a confident person with shiny teeth tell them appealing stories about the secrets to success.
With Surpassing Shanghai, Marc Tucker hopes to join the ranks of the “best practices” gurus. He, along with a few of his colleagues at the National Center on Education and the Economy, has examined the education systems in some other countries with successful outcomes so that the U.S. can become similarly successful. Tucker coauthors the chapter on Japan, as well as an introductory and two concluding chapters. Tucker’s collaborators write chapters featuring Shanghai, Finland, Singapore, and Canada. Their approach to greatness in American education, as Linda Darling-Hammond phrases it in the foreword, is to ensure that “our strategies must emulate the best of what has been accomplished in public education both from here and abroad.”
But how do we know what those best practices are? The chapters on high-achieving countries describe some of what those countries are doing, but the characteristics they feature may have nothing to do with success or may even be a hindrance to greater success. Since the authors must pick and choose what characteristics they highlight, it is also quite possible that countries have successful education systems because of factors not mentioned at all. Since there is no scientific method to identifying the critical features of success in the best-practices approach, we simply have to trust the authority of the authors that they have correctly identified the relevant factors and have properly perceived the causal relationships.
But Surpassing Shanghai is even worse than the typical best-practices work, because Tucker’s concluding chapters, in which he summarizes the common best practices and draws policy recommendations, have almost no connection to the preceding chapters on each country. That is, the case studies of Shanghai, Finland, Japan, Singapore, and Canada attempt to identify the secrets to success in each country, a dubious-enough enterprise, and then Tucker promptly ignores those chapters when making his general recommendations.
Tucker does claim to be drawing on the insights of his coauthors, but he never actually references the other chapters in detail. He never names his coauthors or specifically draws on them for his conclusions. In fact, much of what Tucker claims as common lessons of what his coauthors have observed from successful countries is contradicted in chapters that appear earlier in the book. And some of the common lessons they do identify, Tucker chooses to ignore.
For example, every country case study in Surpassing Shanghai, with the exception of the one on Japan coauthored by Marc Tucker, emphasizes the importance of decentralization in producing success. In Shanghai the local school system “received permission to create its own higher education entrance examination. This heralded a trend of exam decentralization, which was key to localized curricula.” The chapter on Finland describes the importance of the decision “to devolve increasing levels of authority and responsibility for education from the Ministry of Education to municipalities and schools…. [T]here were no central initiatives that the government was trying to push through the system.” Singapore is similarly described: “Moving away from the centralized top-down system of control, schools were organized into geographic clusters and given more autonomy…. It was felt that no single accountability model could fit all schools. Each school therefore set its own goals and annually assesses its progress toward meeting them…” And the chapter on Canada teaches us that “the most striking feature of the Canadian system is its decentralization.”
Tucker makes no mention of this common decentralization theme in his conclusions and recommendations. Instead, he claims the opposite as the common lesson of successful countries: “students must all meet a common basic education standard aligned to a national or provincial curriculum… Further, in these countries, the materials prepared by textbook publishers and the publishers of supplementary materials are aligned with the national curriculum framework.” And “every high-performing country…has a unit of government that is clearly in charge of elementary and secondary education…In such countries, the ministry has an obligation to concern itself with the design of the system as a whole…”
Conversely, Tucker emphasizes that “the dominant elements of the American education reform agenda” are noticeably absent from high-performing countries, including “the use of market mechanisms, such as charter schools and vouchers….” But if Tucker had read the chapter on Shanghai, he would have found a description of a system by which “students choose schools in other neighborhoods by paying a sponsorship fee. It is the Chinese version of school choice, a hot issue in the United States.” And although the chapter on Canada fails to make any mention of it, Canada has an extensive system of school choice, offering options that vary by language and religious denomination. According to recently published research by David Card, Martin Dooley, and Abigail Payne, competition among these options is a significant contributor to academic achievement in Canada.
There is a reason that promoters of best-practices approaches are called “gurus.” Their expertise must be derived from a mystical sphere, because it cannot be based on a scientific appraisal of the evidence. Marc Tucker makes no apology for his nonscientific approach. In fact, he denounces “the clinical research model used in medical research” when assessing education policies. The problem, he explains, is that no country would consent to “randomly assigning entire national populations to the education systems of another country or to certain features of the education system of another country.” On the contrary, countries, states, and localities can and do randomly assign “certain features of the education system,” and we have learned quite a lot from that scientific process. In the international arena, Tucker may want to familiarize himself with the excellent work being done by Michael Kremer and Karthik Muralidharan utilizing random assignment around the globe.
In addition, social scientists have developed practices to observe and control for differences in the absence of random assignment that have allowed extensive and productive analyses of the effectiveness of educational practices in different countries. In particular, the recent work of Ludger Woessmann, Martin West, and Eric Hanushek has utilized the PISA and TIMSS international test results that Tucker finds so valuable, but they have done so with the scientific methods that Tucker rejects. Even well-constructed case study research, like that done by Charles Glenn, can draw useful lessons across countries. The problem with the best-practices approach is not entirely that it depends on case studies, but that by avoiding variation in the dependent variable it prevents any scientific identification of causation.
Tucker’s hostility to scientific approaches is more understandable, given that his graduate training was in theater rather than a social science. Perhaps that is also why Tucker’s book reminds me so much of The Music Man. Tucker is like “Professor” Harold Hill come to town to sell us a bill of goods. His expertise is self-appointed, and his method, the equivalent of “the think system,” is obvious quackery. And the Gates Foundation, which has for some reason backed Tucker and his organization with millions of dollars, must be playing the residents of River City, because they have bought this pitch and are pouring their savings into a band that can never play music except in a fantasy finale.
Best practices really are the worst.
Jay P. Greene is professor of education reform at the University of Arkansas and a fellow at the George W. Bush Institute.
Surpassing Shanghai: An Agenda for American Education Built on the World’s Leading Systems. Edited by Marc Tucker. Harvard Education Press, 2011, $49.99; 288 pages.
Author of “The Invention of Culture,” Roy Wagner meets indigenous peoples of South America for the first time and takes part in a ritual
Manaus, August 8, 2011
ELAÍZE FARIAS
North American anthropologist talks with indigenous people of the Amazon. PHOTO: ALEXANDRE FONSECA/ACRITICA
“Every understanding of another culture is an experiment with one’s own,” says the American Roy Wagner, one of the leading names in contemporary anthropology worldwide, in the book “The Invention of Culture.”
It was exactly this equivalence between cultures that Roy Wagner experienced on his first visit to the Amazon last week.
In Manaus, Wagner delivered the opening lecture of the academic year, took part in a round table with indigenous undergraduate and graduate students of the Federal University of Amazonas (Ufam), visited two malocas of indigenous groups living in the rural area of the Amazonian capital, and witnessed what he called “multiple perspectives.”
Author of the theory of “the invention and the notion of culture,” which gave rise to the concept of “reverse anthropology,” Wagner became known for the studies he has carried out since the 1960s in Melanesia and New Guinea (Oceania). But only now, at 73, has he had the opportunity to meet the native peoples of South America.
On Saturday (the 6th), his last day in Manaus, Roy Wagner met and took part in a ritual of the Tukano, Tuyuka and Dessana peoples at a maloca located four hours from Manaus by riverboat.
At the maloca, the Tuyuka leader Higino Tuyuka, who came from São Gabriel da Cachoeira (851 kilometres from Manaus), a town where 90% of the population is indigenous, just to take part in the activities and talk with Roy Wagner, demonstrated an initiation ritual and introduced the anthropologist to a traditional drink called kahpí, which has hallucinogenic effects and is reserved for men.
Perspectives
“Many perspectives are meeting here. I do not see this as an encounter between a native culture and an anthropologist, but between cultures sharing the same spaces,” Wagner told the acritica.com portal at the end of his experience with the indigenous people.
This was the first time Wagner had contact with the native peoples of South America since he began his work as an ethnographer and anthropologist.
During the activities in Manaus, he took part in “an exchange of conversations about cosmologies” and identified similarities between the Amerindians and the peoples he studied in Oceania.
The main one concerns the relationship between humans and animals. “In Australia, the Aborigines have a relationship, in their cosmology, with crows. Animals are incorporated into the world of humans. Here, we see that the indigenous people have an association with fish. They are the fish-people,” he said.
In his dialogue with the Brazilian indigenous people, however, Wagner says he found one distinctive trait: a preference for “origins.” “The peoples here talk a great deal about the beginning, about origins, about the morning star, in contrast, for example, with the Aboriginal peoples, who speak more of the setting sun, of death,” he explained.
Intellectuals
Roy Wagner came to Manaus through an initiative of the Instituto Brasil Plural, which links the Federal University of Amazonas and the Federal University of Santa Catarina.
His visit to Amazonas was not originally planned. Invited by professors of the Graduate Program in Social Anthropology (PPGAS) at Ufam, he accepted the invitation in order to talk with indigenous intellectuals, both teachers and students.
The anthropologist’s schedule includes lectures in Florianópolis (SC), Brasília (DF), Rio de Janeiro (RJ) and São Paulo (SP).
“Indigenous intellectuals are those who hold their peoples’ specific forms of knowledge. Some have not necessarily been through university, but they hold deep knowledge,” said Professor Carlos Dias of the PPGAS.
Carlos Dias said that Roy Wagner was deeply impressed by his experience in Amazonas, above all by the exchanges with the indigenous people he had the chance to talk with.
Dias recounted that on Sunday (the 8th) Wagner’s graduate advisee contacted the Ufam professors and reported that “the anthropologist’s great moment in Brazil was his visit to the Amazon.”
According to Carlos, in his contact with the indigenous people Wagner found a great many cosmological parallels between the Amerindians and the peoples he had studied in the past.
“Roy Wagner creates a new theory of the notion of culture when he takes these new ways of thinking seriously, capturing the other through the other’s own knowledge.”
João Paulo Barreto, a Tukano and a master’s student in anthropology at Ufam, commented that Wagner was struck by the presentation of perspectives in the indigenous worldview. This happened when the leader Higino Tuyuka, during the ritual, related the headdress worn by João Paulo to the structure of the maloca.
Estévão Barreto, also Tukano and holder of a master’s degree in Society and Culture in the Amazon, stressed that Roy Wagner’s presence pointed to the need to promote dialogue between “indigenous science and scientific knowledge.”
Equality
Roy Wagner trained in English literature, history, astronomy and anthropology.
His best-known studies were carried out among the Daribi in New Guinea and among the Aborigines in Australia. His best-known work, “The Invention of Culture,” was published in 1975 and revised in 1981. In Brazil, the book was translated only in 2010.
In Brazil, his main interlocutor is the anthropologist Eduardo Viveiros de Castro, author of the concept of perspectivism.
In “The Invention of Culture,” Wagner writes that “the anthropologist uses his own culture to study others, and to study culture in general.” That is, “the idea of culture places the researcher on an equal footing with his subjects of study: each ‘belongs to a culture.’”
For Roy Wagner, “an anthropologist ‘experiences,’ in one way or another, his subject of study; he does so through the universe of his own meanings, and then draws on this meaning-laden experience to communicate an understanding to the members of his own culture.”
* * *
Anthropologist who wrote “The Invention of Culture” delivers the opening lecture at Ufam this Thursday
Roy Wagner is one of the most important anthropologists working today. The American is visiting Brazil for the first time
Manaus, August 3, 2011
ACRITICA.COM
One of the most renowned anthropologists of our time, the American Roy Wagner delivers the opening lecture of the semester for the master’s course in Anthropology at the Federal University of Amazonas (Ufam) this Thursday (the 4th) at 9 a.m., in the Rio Solimões auditorium of the Institute of Human Sciences and Letters (ICHL/Ufam).
Gilton Mendes, a professor in the Graduate Program in Social Anthropology (PPGAS), said that Roy Wagner was drawn to the invitation to come to Manaus by the prospect of talking with “people knowledgeable about Amazonian indigenous anthropology.”
Author of “The Invention of Culture,” Roy Wagner studied astronomy, English literature and history at Harvard University and did his graduate work in anthropology at the University of Chicago.
“The Invention of Culture” was published in 1975 but appeared in a Brazilian edition only last year. It was one of the works most eagerly awaited by the country’s anthropological community in recent years.
On August 5, Roy Wagner will take part in a round table entitled “Melanesian and Amazonian Conversations” with indigenous researchers, organized by the Graduate Program in Social Anthropology together with the Nucleus for Studies of the Indigenous Amazon (Neai).
The event will take place at 3 p.m. at Rua Coronel Sérgio Pessoa, 147, Praça dos Remédios, in central Manaus. The round table will feature the special participation of Justin Shaffner of the University of Cambridge (UK).
Indigenous peoples
Roy Wagner began his fieldwork among the Daribi of Mount Karimui, in New Guinea, about whom he wrote and published a monograph devoted to Daribi principles of clan definition and alliance.
Building on the Daribi ethnography, Wagner developed a general theory of the invention of meaning and of the notion of culture, published in “The Invention of Culture,” which received a revised and expanded edition in 1981.
The book radicalizes the reflection on the controversial concept of culture in anthropology: by taking native modes of conceptualization seriously, it reformulates the anthropological discipline itself.
For Wagner, the point is not to understand what other peoples produce as “culture” against a universal given (“nature”), but rather what other populations themselves take as given. With this, the very notion of “nature” as a universal given, and of “culture,” comes under suspicion.
His visit to Brazil is part of the planned activities of the Instituto Brasil Plural, a network of researchers organized by the graduate programs of the Federal University of Santa Catarina (UFSC) and the Federal University of Amazonas (Ufam), and funded by CNPq, Fapesc and Fapeam.
ScienceDaily (Apr. 17, 2012) — Childhood exposure to lead dust has been linked to lasting physical and behavioral effects, and now lead dust from vehicles using leaded gasoline has been linked to instances of aggravated assault two decades after exposure, says Tulane toxicologist Howard W. Mielke.
Lead dust from vehicles burning leaded gasoline, which contaminated cities’ air decades ago, has increased aggravated assault in urban areas, the researchers say.
The new findings are published in the journal Environment International by Mielke, a research professor in the Department of Pharmacology at the Tulane University School of Medicine, and demographer Sammy Zahran at the Center for Disaster and Risk Analysis at Colorado State University.
The researchers compared the amount of lead released during the years 1950-1985 in six cities: Atlanta, Chicago, Indianapolis, Minneapolis, New Orleans and San Diego. This period saw an increase in airborne lead dust exposure due to the use of leaded gasoline. There were corresponding spikes in the rates of aggravated assault approximately two decades later, after the exposed children grew up.
After controlling for other possible causes such as community and household income, education, policing effort and incarceration rates, Mielke and Zahran found that for every one percent increase in tonnages of environmental lead released 22 years earlier, the present rate of aggravated assault was raised by 0.46 percent.
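The 0.46 percent figure is an elasticity, so the natural reading is through a log-log relationship between lead tonnage and the later assault rate. Here is a small worked example; the functional form below is my illustration of how such an elasticity is conventionally interpreted, while the actual specification and controls are in Mielke and Zahran’s paper:

```python
# Reading the reported elasticity of 0.46 between lead released in year t-22 and the
# aggravated assault rate in year t. Standard log-log interpretation; illustrative only.
ELASTICITY = 0.46

def implied_assault_change(lead_change_pct):
    """Implied % change in the assault rate for a given % change in lead 22 years earlier."""
    return ((1 + lead_change_pct / 100) ** ELASTICITY - 1) * 100

for change in (1, 10, 50, -50):
    print(f"Lead {change:+d}% -> assaults {implied_assault_change(change):+.2f}% (22 years later)")
```

Under this reading, a 1 percent rise in lead maps to roughly a 0.46 percent rise in assaults two decades later, and larger changes compound accordingly (a 10 percent rise implies roughly 4.5 percent, a 50 percent cut roughly a 27 percent decline).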
“Children are extremely sensitive to lead dust, and lead exposure has latent neuroanatomical effects that severely impact future societal behavior and welfare,” says Mielke. “Up to 90 per cent of the variation in aggravated assault across the cities is explained by the amount of lead dust released 22 years earlier.” Tons of lead dust were released between 1950 and 1985 in urban areas by vehicles using leaded gasoline, and improper handling of lead-based paint also has contributed to contamination.