GLOBAL warming isn’t a prediction. It is happening. That is why I was so troubled to read a recent interview with President Obama in Rolling Stone in which he said that Canada would exploit the oil in its vast tar sands reserves “regardless of what we do.”
If Canada proceeds, and we do nothing, it will be game over for the climate.
Canada’s tar sands, deposits of sand saturated with bitumen, contain twice the amount of carbon dioxide emitted by global oil use in our entire history. If we were to fully exploit this new oil source, and continue to burn our conventional oil, gas and coal supplies, concentrations of carbon dioxide in the atmosphere eventually would reach levels higher than in the Pliocene epoch, more than 2.5 million years ago, when sea level was at least 50 feet higher than it is now. That level of heat-trapping gases would assure that the disintegration of the ice sheets would accelerate out of control. Sea levels would rise and destroy coastal cities. Global temperatures would become intolerable. Twenty to 50 percent of the planet’s species would be driven to extinction. Civilization would be at risk.
That is the long-term outlook. But near-term, things will be bad enough. Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.
If this sounds apocalyptic, it is. This is why we need to reduce emissions dramatically. President Obama has the power not only to deny tar sands oil additional access to Gulf Coast refining, which Canada desires in part for export markets, but also to encourage economic incentives to leave tar sands and other dirty fuels in the ground.
The global warming signal is now louder than the noise of random weather, as I predicted would happen by now in the journal Science in 1981. Extremely hot summers have increased noticeably. We can say with high confidence that the recent heat waves in Texas and Russia, and the one in Europe in 2003, which killed tens of thousands, were not natural events — they were caused by human-induced climate change.
We have known since the 1800s that carbon dioxide traps heat in the atmosphere. The right amount keeps the climate conducive to human life. But add too much, as we are doing now, and temperatures will inevitably rise too high. This is not the result of natural variability, as some argue. The earth is currently in the part of its long-term orbit cycle where temperatures would normally be cooling. But they are rising — and it’s because we are forcing them higher with fossil fuel emissions.
The concentration of carbon dioxide in the atmosphere has risen from 280 parts per million to 393 p.p.m. over the last 150 years. The tar sands contain enough carbon — 240 gigatons — to add 120 p.p.m. Tar shale, a close cousin of tar sands found mainly in the United States, contains at least an additional 300 gigatons of carbon. If we turn to these dirtiest of fuels, instead of finding ways to phase out our addiction to fossil fuels, there is no hope of keeping carbon concentrations below 500 p.p.m. — a level that would, as earth’s history shows, leave our children a climate system that is out of their control.
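As a rough check of this arithmetic, here is a minimal sketch assuming the commonly cited conversion of about 2.12 gigatons of carbon (GtC) per ppm of atmospheric CO2, and assuming, as the op-ed’s round numbers implicitly do, that all of the released carbon stays airborne (neither assumption comes from the op-ed itself):

```python
# Rough check of the carbon arithmetic above (illustrative only).
# Assumes ~2.12 GtC per ppm of atmospheric CO2 and no ocean/land uptake.
GTC_PER_PPM = 2.12

def ppm_rise(gigatons_carbon):
    """Approximate CO2 rise in ppm if all the carbon stays in the air."""
    return gigatons_carbon / GTC_PER_PPM

current_ppm = 393  # today's concentration, per the article
print(f"Tar sands, 240 GtC: +{ppm_rise(240):.0f} ppm -> ~{current_ppm + ppm_rise(240):.0f} ppm")
print(f"Tar shale, 300 GtC: +{ppm_rise(300):.0f} ppm on top of that")
```

The result, roughly +113 ppm for the tar sands alone, is consistent with the article’s round figure of 120 ppm.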
We need to start reducing emissions significantly, not create new ways to increase them. We should impose a gradually rising carbon fee, collected from fossil fuel companies, then distribute 100 percent of the collections to all Americans on a per-capita basis every month. The government would not get a penny. This market-based approach would stimulate innovation, jobs and economic growth, and would avoid enlarging government or having it pick winners or losers. Most Americans, except the heaviest energy users, would get more back than they paid in increased prices. Not only that: according to economic models driven by a slowly rising carbon price, the reduction in oil use resulting from the fee would be nearly six times as great as the oil supply from the proposed pipeline from Canada, rendering the pipeline superfluous.
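To make the fee-and-dividend mechanics concrete, here is a minimal sketch; the fee level, emissions total and population below are illustrative assumptions of mine, not figures from the op-ed:

```python
# Fee-and-dividend arithmetic with purely hypothetical numbers.
fee_per_ton_co2 = 15.0          # dollars per ton of CO2, assumed starting fee
annual_emissions_tons = 5.5e9   # rough annual US CO2 emissions, assumed
population = 310e6              # approximate US population, circa 2012

total_collected = fee_per_ton_co2 * annual_emissions_tons
monthly_dividend = total_collected / population / 12
print(f"Monthly per-capita dividend: ${monthly_dividend:.2f}")
# Every dollar collected is returned, so households with below-average
# fossil fuel use come out ahead of the price increases they pay.
```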
But instead of placing a rising fee on carbon emissions to make fossil fuels pay their true costs, leveling the energy playing field, the world’s governments are forcing the public to subsidize fossil fuels with hundreds of billions of dollars per year. This encourages a frantic stampede to extract every fossil fuel through mountaintop removal, longwall mining, hydraulic fracturing, tar sands and tar shale extraction, and deep ocean and Arctic drilling.
President Obama speaks of a “planet in peril,” but he does not provide the leadership needed to change the world’s course. Our leaders must speak candidly to the public — which yearns for open, honest discussion — explaining that our continued technological leadership and economic well-being demand a reasoned change of our energy course. History has shown that the American public can rise to the challenge, but leadership is essential.
The science of the situation is clear — it’s time for the politics to follow. This is a plan that can unify conservatives and liberals, environmentalists and business. Every major national science academy in the world has reported that global warming is real, caused mostly by humans, and requires urgent action. The cost of acting only grows the longer we wait; if we wait much longer, we will fail to avoid the worst and will be judged immoral by coming generations.
James Hansen directs the NASA Goddard Institute for Space Studies and is the author of “Storms of My Grandchildren.”
The study found that areas of high biodiversity also had high linguistic diversity.
The decline of linguistic and cultural diversity is linked to the loss of biodiversity, a study has suggested.
The authors said that 70% of the world’s languages were found within the planet’s biodiversity hotspots.
Data showed that as these important environmental areas were degraded over time, cultures and languages in the area were also being lost.
The results of the study have been published in the Proceedings of the National Academy of Sciences (PNAS).
“Biologists estimate annual loss of species at 1,000 times or more greater than historic rates, and linguists predict that 50-90% of the world’s languages will disappear by the end of the century,” the researchers wrote.
Lead author Larry Gorenflo from Penn State University, in the US, said previous studies had identified a geographical connection between the two, but did not offer the level of detail required.
Dr Gorenflo told BBC News that the limitation of earlier data was that languages were either listed by country or marked with a single dot on a map to indicate their location.
“But what you did not know was if the area extended two kilometres or 200 kilometres, so you really did not get a sense of the extent of the language,” he explained.
“We used improved language data to really get a more solid sense of how languages and biodiversity co-occurred and an understanding of how geographically extensive the language was.”
He said the study achieved this by also looking at smaller areas with high biodiversity, such as national parks or other protected habitats.
“When we did that, not only did we get a sense of co-occurrence at a regional scale, but we also got a sense that co-occurrence was found at a much finer scale,” he said.
“We are not quite sure yet why this happens, but in a lot of cases it may well be that biodiversity evolved as part-and-parcel of cultural diversity, and vice versa.”
In their paper, the researchers pointed out that, out of the 6,900 or more languages spoken on Earth, more than 4,800 occurred in regions containing high biodiversity.
Dr Gorenflo described these locations as “very important landscapes” which were “getting fewer and fewer” but added that the study’s data could help provide long-term security.
“It provides a wonderful opportunity to integrate conservation efforts – you can have people who can get funding for biological conservation, and they can collaborate with people who can get funding for linguistic or cultural conservation,” he suggested.
“In the past, it was hard to get biologists to look at people.
“That has really changed dramatically in the past few years. One thing that a lot of biologists and ecologists are now seeing is that people are part of these ecosystems.”
ScienceDaily (Feb. 22, 2011) — A new paper by George Mason University researchers shows that ‘Climategate’ — the unauthorized release in late 2009 of stolen e-mails between climate scientists in the U.S. and United Kingdom — undermined belief in global warming and possibly also trust in climate scientists among TV meteorologists in the United States, at least temporarily.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication and Center for Social Science Research asked these meteorologists early in 2010, when news stories about the climate e-mails were breaking, several questions about their awareness of the issue, attention to the story and impact of the story on their beliefs about climate change. A large majority (82 percent) of the respondents indicated they had heard of Climategate, and nearly all followed the story at least “a little.”
Among the respondents who indicated that they had followed the story, 42 percent said the story made them somewhat or much more skeptical that global warming is occurring. These results stand in stark contrast to the findings of several independent investigations of the e-mails, conducted later, which concluded that no scientific misconduct had occurred and that nothing in the e-mails should cast doubt on the evidence that global warming is occurring.
The results, which were published in the Bulletin of the American Meteorological Society, also showed that the doubts were most pronounced among politically conservative weathercasters and among those who either do not believe in global warming or are not yet sure. Neither age nor professional credentials were a factor, but men, independent of political ideology and belief in global warming, were more likely than their female counterparts to say that Climategate made them doubt that global warming was happening.
“Our study shows that TV weathercasters — like most people — are motivated consumers of information in that their beliefs influence what information they choose to see, how they evaluate information, and the conclusions they draw from it,” says Ed Maibach, one of the researchers. “Although subsequent investigations showed that the climate scientists had done nothing wrong, the allegation of wrongdoing undermined many weathercasters’ confidence in the conclusions of climate science, at least temporarily.”
The poll of weathercasters was conducted as part of a larger study funded by the National Science Foundation on American television meteorologists. Maibach and others are now working with a team of TV meteorologists to test what audience members learn when weathercasters make efforts to educate their viewers about the relationship between the changing global climate and local weather conditions.
Ultimately, the team hopes to answer key research questions about how to help television meteorologists nationwide become an effective source of informal science education about climate change.
“Most members of the public consider television weather reporters to be a trusted source of information about global warming — only scientists are viewed as more trustworthy,” says Maibach. “Our research here is based on the premise that weathercasters, if given the opportunity and resources, can become an important source of climate change education for a broad cross section of Americans.”
ScienceDaily (Mar. 29, 2010) — In a time when only a handful of TV news stations employ a dedicated science reporter, TV weathercasters may seem like the logical people to fill that role, and in many cases they do.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication shows that two-thirds of weathercasters are interested in reporting on climate change, and many say they are already filling a role as an informal science educator.
“Our surveys of the public have shown that many Americans are looking to their local TV weathercaster for information about global warming,” says Edward Maibach, director of the Center for Climate Change Communication. “The findings of this latest survey show that TV weathercasters play — or can play — an important role as informal climate change educators.”
According to the survey, climate change is already one of the most common science topics TV weathercasters discuss — most commonly at speaking events, but also at the beginning or end of their on-air segments, on blogs and web sites, on the radio and in newspaper columns.
Weathercasters also indicated that they are interested in personalizing the story for their local viewers — reporting on local stories such as potential flooding/drought, extreme heat events, air quality and crops. About one-quarter of respondents said they have already seen evidence of climate change in their local weather patterns.
“Only about 10 percent of TV stations have a dedicated specialist to cover these topics,” says University of Texas journalism professor Kristopher Wilson, a collaborator on the survey. “By default, and in many cases by choice, science stories become the domain of the only scientifically trained person in the newsroom — weathercasters.”
Many of the weathercasters said that having access to resources such as climate scientists to interview and high-quality graphics and animations to use on-air would increase their ability to educate the public about climate change.
However, despite their interest in reporting more on this issue, a majority of weathercasters (61 percent) feel there is a lot of disagreement among scientists about the issue of global warming. While 54 percent indicated that global warming is happening, 25 percent indicated it isn’t, and 21 percent said they don’t know yet.
“A recent survey showed that more than 96 percent of leading climate scientists are convinced that global warming is real and that human activity is a significant cause of the warming,” says Maibach. “Climate scientists may need to make their case directly to America’s weathercasters, because these two groups appear to have a very different understanding about the scientific consensus on climate change.”
This survey is one part of a National Science Foundation-funded research project on meteorologists. Using this data, Maibach and his research team will next conduct a field test of 30-second, broadcast-quality educational segments that TV weathercasters can use in their daily broadcasts to educate viewers about the link between predicted (or current) extreme weather events in that media market and the changing global climate.
Ultimately, the team hopes to answer key research questions supporting efforts to activate TV meteorologists nationwide as an important source of informal science education about climate change.
Commentary by Alexandre A. Costa, one of Brazil’s most respected meteorologists, on the interview:
Climate Change Denial and the Organized Right (May 10, 2012 – posted on Facebook)
Many of you will have watched, or heard about, the recently aired interview on Jô Soares’s talk show with Mr. Ricardo Felício who, despite being a professor of Geography at USP, attacked the community of climate scientists, sketched out a series of conspiracy theories and committed absurdities that make no scientific sense whatsoever, such as the claims that “there is no sea level rise,” “the greenhouse effect does not exist,” “the ozone layer does not exist” and “the Amazon rainforest would regenerate within 20 years of being cleared,” reaching his peak with a senseless explanation of the high temperature of Venus, based on a completely absurd reading of the gas laws.
So what would lead a person who is, in principle, connected to the academic community to so absurd a posture? At first I took it for media climbing: since the man’s CV shows no minimally relevant scientific output, I assumed that bashing the “mainstream” was simply a way to draw attention, attract publicity, gain fame, and so on. How naive of me.
Interviewer: “Do you know of any institution that supports your thinking? How does it work? And what does it do?” Ricardo Felício: “I recommend that people look up, here in Brazil, the MSIa – Movimento de Solidariedade Ibero-Americana (Ibero-American Solidarity Movement).”
But who is this MSIa? A far-right group specializing in conspiracy theories and in attacks on Greenpeace (“a political instrument of the international oligarchies”), on the Landless Workers’ Movement, the MST (“an instrument of war against the Brazilian State”), on the Foro de São Paulo (“it gathers revolutionary groups that aim to destabilize the Armed Forces”), on the Pastoral da Terra, and so on. I went to the organization’s website myself, and their latest effort is a campaign against the Truth Commission and in support of the military (“Who stands to gain from a military crisis?”). Anyone who wants to know where these people stand can check http://www.msia.org.br/
A little more searching, and I found Ricardo Felício being quoted (“The UN has found a way to implement its global government, and the world will be run by pseudoscientific panels”) where? On http://www.midiasemmascara.org/, the website of the ultra-right-winger Olavo de Carvalho…
It seems symptomatic that, on the eve of the deadline for vetoing the ruralist Forest Code, someone with this kind of affiliation (the MSIa is associated with the UDR) should turn up to say that the Amazon can be cleared because it will regenerate in twenty years… Tellingly, the accusations of an “environmentalist,” “communist” or “international governance” agenda, or whatever other delusion the climate change deniers reach for when they try to politicize and ideologize the question, only show where that politicization and ideologization actually comes from, and of what stripe.
As I like to say, CO2 molecules have no ideology: they absorb infrared radiation regardless not only of anyone’s political positions but of the very existence of the humans who hold them. An increase in their concentration in Earth’s atmosphere could have no effect other than the warming of the global climate system. Denying an obvious scientific truth therefore makes sense only for those whose interests are at stake. And it becomes clear: this gentleman, academically a fraud, is in fact a right-wing militant. To paraphrase those who so admire him, he needs to appear in the media without the mask of “USP professor” or “climatologist,” but with his true face.
Alexandre A. Costa, Ph.D.
Full Professor
Master’s Program in Applied Physical Sciences
Universidade Estadual do Ceará
Climate Change Denial and the Organized Right – Part II: Further Revelations (May 13, 2012 – posted on Facebook)
It is not hard to keep connecting the dots after Mr. Ricardo Felício’s appearance on Jô Soares’s show. Why would anyone be willing to expose himself to ridicule in that way? How could someone in the position of a doctor of Geography, a USP professor and a “climatologist” assassinate not only recent scientific knowledge but basic laws of physics and fundamental notions of chemistry, ecology and so on? What would drive someone to insult so crudely the Brazilian and international academic community, and above all us, the climate scientists?
What I intend to show is that reaching that point requires motivations. And these, my friends, are not mere vanity or a craving for stardom. There is an agenda.
For those who want to keep tracing with me the motivation behind that interview, I ask that you visit, even if it gives them some extra traffic, the video repository of the homegrown pop star of climate change denial at http://www.youtube.com/user/TvFakeClimate. There, the links point: to the familiar http://www.msia.org.br/ of the “Movimento de Solidariedade Íbero-Americana,” whose pompous name hides LaRouchist neo-fascism, specialized in conspiracy theories and manipulation and a visceral enemy, as its own site shows, of the MST, the feminist movement, the human rights movement, the Truth Commission, etc.; to the no less right-wing http://www.midiaamais.com.br/, whose articles I could not manage to read to the end, but which consist of right-wing attacks on Obama, ridicule of the movement of the Pinheirinho residents in São José dos Campos, opposition to the Supreme Court’s ruling that affirmative-action quotas are constitutional and, of course, climate change denial and attacks on the IPCC; to an anti-environmentalist site called http://ecotretas.blogspot.com/, which in turn carries links to neo-fascist pages such as “vermelhos não” (http://vermelhosnao.blogspot.com.br/search/label/verdismo), which incidentally is running the “Não Veta, Dilma” (“Don’t Veto It, Dilma”) campaign, to pages specializing in conspiracy theories such as http://paraummundolivre.blogspot.com.br/, and even to exotic rightists who advocate restoring the monarchy in Portugal (http://quartarepublica.wordpress.com/) or neo-Salazarism (http://nacionalismo-de-futuro.blogspot.com.br/).
As I have said on several occasions, it is not one’s political-ideological choice that makes one right or wrong on the climate question. I have colleagues in my research community who sympathize with the most varied political-ideological currents (which by itself would make it hard for us to join together in a “conspiracy”… what was it again… ah, yes! to “achieve UN world governance via climate panels,” the kind of hysteria typical of the most unhinged fringes of the US right). The climate question is objective. The mechanisms that control the climate are known, including the role of the greenhouse gases. The measurements, the model results (attacked dishonestly by the interviewee) and the paleoclimate records all converge. And of all the candidate hypotheses for the warming of the climate system, the anthropogenic contribution via greenhouse gas emissions is the only one left standing after every test. Recognizing this does not depend on ideology; one only has to open one’s eyes. As for the kind of public policy to adopt in dealing with the impacts, with adaptation to the changes and with their mitigation, there, yes, political choices acquire a degree of freedom.
The problem is that for a certain political-ideological fringe, in this case the far right, there really is an incompatibility with any environmental agenda that might imply public control over private capital. There is also a need to win support by stroking the public’s hidden wishes (such as the wish that nothing need be done about climate change) and by appealing to nationalism (typical of the Mussolinis, the Hitlers, the Francos, the Salazars and of so many right-wing dictatorships in Latin America), even when that occasionally means adopting a falsely anti-imperialist discourse. Given such “higher” aims, which include sabotaging the campaign for a presidential veto of the monstrous Forest Code approved by the Chamber of Deputies, why bother with commitment to scientific truth? Why bother with ethics and with respectful treatment of one’s colleagues in the academic world?
It is striking how those who accuse us of “fraud,” “conspiracy” and the like are precisely the ones who practice them. As I have said in other texts on the subject, the pseudo-arguments put forward by the deniers must be scientifically debunked (and I have done this elsewhere), but as my colleague Michael Mann rightly reminds us, they are like the hydra: they always have more lies up their sleeve to throw around, and they feel no obligation whatsoever to present a coherent whole in opposition to the views of the scientific community. What matters to them is to sow confusion, to gain political ground, to delay action to protect climate stability, and to buy time for those who bankroll them at bottom (even though there may be deniers with no direct ties to the oil and allied industries, that industry’s connection to the orchestrated worldwide campaign against climate science is by now plain). Pseudo-science and intellectual imposture are the hydra’s heads. The monster’s heart is the political-ideological agenda. But the sword of truth is long enough to deal it a mortal blow!
Alexandre A. Costa, Ph.D.
Full Professor
Master’s Program in Applied Physical Sciences
Universidade Estadual do Ceará
In Defense of Climate Science (May 10, 2012 – posted on Facebook)
I have been greatly worried by the recent attacks on climate science, among other reasons because they have formed a strange amalgam uniting the Tea Party, the petrochemical industry and people who seem to believe in a great imperialist conspiracy to keep the periphery of capitalism from “developing” by preventing it from burning its fossil fuel reserves, which, if you will pardon the word, is in itself an utterly narrow view of “development.”
But this is not an ideological question, if only because, were it one, I would stand far from Al Gore. It is a scientific question, for CO2 molecules have no ideology. What they are endowed with, like certain other molecules (CH4 and water vapor itself), is a property the majority gases of our atmosphere do not enjoy: a vibrational mode whose frequency falls within the region of the electromagnetic spectrum known as the infrared. Heat retention is a consequence of the presence of these gases (minor constituents though they are) in Earth’s atmosphere. Were it not for them, Earth would have a mean temperature of -18 degrees, in contrast with the moderate 15 we enjoy, to say nothing of their role in keeping it within mild limits. Earth is not Mercury, which, having no atmosphere, freely returns the energy absorbed from the Sun on its day side, producing temperature contrasts of 430 degrees by day and -160 degrees at night. Fortunately, neither is it Venus, whose cloud cover allows less solar energy to reach its surface than reaches Earth’s, but whose greenhouse effect, caused by an atmosphere composed almost exclusively of CO2, raises its temperature to a practically constant 480 degrees.
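The minus-18-degree figure follows from a standard radiative-balance estimate. As a reconstruction (my gloss, not part of the original post): equating the sunlight Earth absorbs with its blackbody emission gives the effective temperature

$$ T_{\mathrm{eff}} = \left[\frac{S_0(1-A)}{4\sigma}\right]^{1/4} \approx \left[\frac{1361 \times 0.7}{4 \times 5.67\times10^{-8}}\right]^{1/4} \approx 255\ \mathrm{K} \approx -18\,^{\circ}\mathrm{C}, $$

where $S_0 \approx 1361\ \mathrm{W\,m^{-2}}$ is the solar constant, $A \approx 0.3$ is Earth’s albedo and $\sigma$ is the Stefan-Boltzmann constant. The roughly 33-degree gap between this and the observed mean of about 15 °C is the greenhouse effect just described.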
To ignore these simple scientific ideas, namely that CO2 is a greenhouse gas (known and measured by Tyndall, Arrhenius and others since the 19th century), with a mechanism well explained by the physics of its molecular structure, and to ignore the well-known global effect that CO2 has on a neighboring planet, well established in astronomy since the late Carl Sagan, makes no sense, especially in academia, where some of the most vocal deniers are found. I would remind them of something basic about the scientific method. On the one hand, science has no dogmas and no final truths. Its truths are always, by construction, partial and provisional (happily so, or it would turn into something dull and tedious like, say, a religion). On the other hand, scientific knowledge is cumulative and, in that sense, cannot walk backwards! Only when a theory fails is a new one justified, and the new theory cannot be the mere negation of the old one: it must be able to reproduce all of its merits (as with classical mechanics and relativity, which reduces to the former at low velocities).
This is not a matter of belief. “Monotony” aside, it is well-established, well-understood science, as solid as universal gravitation (which is also “just” a theory) or the evolution of species.
INJUSTICE, DISRESPECT AND UNDERESTIMATION
We climate scientists have been under attack on the basis of factoids that bear no resemblance whatsoever to the reality of our field. No science today is as public and open. Anyone who wishes can easily obtain, in most cases directly over the internet, observed climate data that clearly demonstrate global warming (www.cru.uea.ac.uk/cru/data/ among others), the modeling output being generated right now that will feed the IPCC’s 5th assessment report (http://cmip-pcmdi.llnl.gov/cmip5/data_portal.html), or paleoclimate proxy data used to analyze past climates (www.ncdc.noaa.gov). Anyone can download the IPCC reports at www.ipcc.ch and follow the references, the overwhelming majority of them peer-reviewed and published, especially in the case of the Working Group that deals with the physical science basis of the climate system, in high-impact journals, whether general (Science, Nature) or field-specific. I doubt that in our universities, full of laboratories with private contracts, whether in materials engineering or in biochemistry, there is any segment so open, so selfless as to sit at one table, share data, survey the state of the art of its science and collectively draft a synthesis report. I doubt it. I challenge anyone to show otherwise!
We scientists who take part in these panels are not “representatives of governments.” Nothing is created or invented in these panels beyond a synthesis of science that is produced independently and published in the peer-reviewed literature. Members of the academic community can easily inform themselves better, with colleagues in the Brazilian scientific community who have taken part in the IPCC and PBMC initiatives, about how these panels work before voicing an opinion, lest they end up, in practice, defaming what they do not know. Some people, without the slightest critical stance toward the IPCC’s detractors, simply repeat their verbiage, when they could instead be skeptical of the “skeptics.”
But they are not. At no point do they question the real motivations of the two or three (happily, they are that rare) who adopt the lamentable posture of anti-science denial, whether because they are openly corrupt servants of the petrochemical industry or simply because they possess a vanity too large for the secondary role they would play if, like us, they were spending enormous energy, generally in near anonymity, laying brick after brick of the edifice of climate science. One must know how to distinguish honest, genuine skepticism, which is healthy in science and consonant with sincere doubt and a critical stance, from religious denial, based on faith and on the blind need to defend a given point of view regardless of whether it has any basis in reality, and, above all, from plain knavery, which is what some of the deniers promote. The possible “success” of these ideas with the public is, for me, a matter for social psychology, but the best analogy I have is the popularity of religious ideas, comforting lies by and large, preferred over unpleasant truths.
True skepticism led the Berkeley physicists as far as they went (http://www.berkeleyearth.org/index.php). Initially questioning the results obtained by our community, they armed themselves with an enormous worldwide temperature database, broader than those held by England’s Hadley Centre or by NASA. They tested other methodologies and went as far as excluding the weather stations used by our research centers. Richard Muller, the initiative’s architect, was initially so skeptical of our results that he even raised money from the notorious Koch Foundation, openly hostile to climate science. And what did Muller and his partners find? The same result we already knew. The Earth is warming, and the warming accelerated markedly in the final decades of the 20th century. It now approaches one degree and therefore stands far above all the natural fluctuations recorded since instrumental records began. Indeed, they also confirmed something else we knew: that the University of East Anglia data (the very data behind the farce staged under the high-sounding name of “Climategate,” whose scientists were hounded and whose reputations were ignominiously attacked, with repercussions for their professional careers and personal lives) contain an error… on the low side! The warming suggested by the CRU/UEA data is a tenth of a degree below that of the other data sources and, of course, none of us accuses them of dishonesty for it.
Another imposture (and unfortunately, harsh as the term is, I think this is a case where it applies) is underestimating our community’s intelligence while ignoring the materials it produces. The IPCC’s 4th assessment report already contains a chapter devoted exclusively to paleoclimatology, that is, to the climate of the past. I have personally devoted great effort to analyzing proxies of past climate and to modeling past climate conditions. Since the very first IPCC report there has been a standing concern with discerning the natural signal and separating the anthropogenic signal from it. To that end, the role of variations in solar activity, volcanic emissions and so on is assessed. We have already evaluated the possible natural influences and ruled them out as the cause of the observed warming.
In this sense there is no room for sophistry or evasion. As for the paleoclimate records, which can retrace the history of temperature and greenhouse gas concentrations from 800,000 years ago to the present, we all know that in the past a small warming of the planet preceded the rise in greenhouse gas concentrations; this happened before the close of every glacial period. But it is obtuse reasoning to conclude from this that CO2 plays no role or, in the deniers’ words, that it “is consequence, not cause.” There are many feedback processes in the climate system, and this is one of the best examples. The subtle variations in insolation, and in its distribution over Earth’s surface, associated with the orbital cycles are, as everyone knows, far too small to explain the large temperature differences between the glacial periods (“ice ages”) and the interglacials (the briefer warm periods between them). But a subtle warming, after a few centuries, proved sufficient to raise natural emissions of CO2 and methane, which cause a greenhouse effect and amplify the process. In conditions free of human action, this feedback was reined in only when the orbital configuration changed again, producing a subtle cooling, which induced uptake of CO2 by the Earth system, which in turn amplified the cooling, and so on.
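One compact way to state that amplification (a schematic of mine, not taken from the post): if an initial orbital warming $\Delta T_0$ triggers gas releases that feed back a fraction $f < 1$ of additional warming in each round, the rounds sum as a geometric series,

$$ \Delta T = \Delta T_0 \left(1 + f + f^2 + \cdots\right) = \frac{\Delta T_0}{1-f}, $$

so with, say, $f = 0.5$, the final warming is double the initial trigger, and the loop remains stable so long as $f$ stays below 1.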
But the fact that people die of cancer and heart attacks does not mean a murderer cannot be held responsible! Because people die naturally of strokes, would anyone find it possible to say that “it is impossible for a gunshot to kill someone”? Or that no one should ever again be tried for murder? In the past, a small warming was needed to trigger natural emissions and a rise in CO2 concentration, after which the warming accelerated. Today there is an independent source of CO2, foreign to the natural cycles, and that source is the burning of fossil fuels! I should stress, moreover, that even the isotopic analysis (the composition differs between fossil fuels and other sources) is clear: the excess CO2 in Earth’s atmosphere does indeed come mostly from oil, coal and natural gas! A minimum of genuine scientific digging makes it clear that today’s increase in atmospheric CO2 concentrations is overwhelmingly anthropogenic, and that this is what has been driving the observed climate change. One can no longer hide the sun, or rather hide the greenhouse gases, behind a sieve! The paleoclimate records show that the current warming is unprecedented in the last 2,500 years. They show that the current CO2 concentration is 110 ppm above what was observed before the industrial era and nearly 100 ppm above anything seen in the last 800,000 years. They show that this number is larger than the difference in CO2 concentration between the interglacials and the “ice ages,” and that such a difference does indeed matter greatly for the climate.
WHAT THE REAL ERRORS ARE
Some people call themselves skeptical, critical and suspicious of the majority of our community of climate scientists, yet fail to notice the fundamental error they commit: an absolute lack of skepticism, critical sense and suspicion toward those who attack us. The posture of those who fight climate science while funded by the petrochemical industry, or in alliance with the most reactionary party and media sectors, is self-explanatory: what matters to them is covering up reality. But not only that. They range from people who receive oil-industry money directly to blowhards who long ago ceased doing real scientific work in the field and who, unable to stay prominent by working seriously to advance our science, by grappling with the genuine uncertainties, by helping to collect data and improve methods and models, simply attack the rest of the community to keep the spotlight on themselves. Strange and showy, like a peacock’s feathers; and as prosaic as the evolutionary mechanisms that brought such feathers into being. It is also necessary to combat the position of those who give this attack a false “left-wing” veneer, for they resort to conspiracy theories, a pathological distortion of critical thinking. Fighting the wrong target with the wrong weapon is worse than disarming before the fight.
Is the IPCC perfect? Of course not. It has made errors. But do you want to know what they really are? One thing must be clear to everyone: the IPCC’s assessments tend to be conservative. Its temperature projections for the years after 2000 have proved essentially correct, but do you know what has happened to its projections of sea level rise and Arctic melt? They were underestimates. That’s right: the true picture is graver than the IPCC’s 4th report indicates. Once again this is not a matter of politics but of the limitations, at the time, of cryosphere models, which could not account for important processes that drive ice loss. Drawing on papers published in the meantime, the 5th report will probably correct these limitations and show a picture closer to the real gravity of the problem when it appears in 2013-2014.
WHAT IS THE REAL IDEOLOGICAL QUESTION?
It makes no sense to “believe” or not in gravity, in evolution or in the greenhouse effect. This is not an “ideological option” (even though in the US there is a strong correlation between ideology and views of science among the most reactionary Republican voters, who lend an ear to the detractors of climate science and who also want Darwin out of the schools).
The real ideological question is that climate change is a process of extreme inequality, from its roots to its impacts. Those who have benefited most from greenhouse gas emissions were, and continue to be, the dominant classes of the core capitalist countries. Together with the mega-conglomerates of finance capital, the petrochemical industry, the mining sector (which includes coal mining), the energy sector and others have concentrated wealth while using the atmosphere as their great garbage can. Beyond the current carbon “footprint” (itself still extremely unequal if one compares Americans, Europeans and Australians on one side with Africans on the other), the “historical footprint” (that is, what has already been emitted, accumulated from each country’s emissions) is more unequal still, making Europe, followed by the US, the great historical emitters.
Cruelly, in return, the impacts of climate change will fall on the poorest countries, on small nations, above all on the poor of the poor countries, on the most vulnerable. Loss of territory in island nations; water and food security problems in semi-arid regions (so vast in the cradle of our species, the African continent); the effects of severe events (which, on very clear physical grounds, are expected to become more frequent on a warming planet); damage to coastal marine ecosystems and to forests, hitting fisheries and gathering activities; the collapse of traditional agriculture… and on whom does all of this fall? On those on the lower floors! Those upstairs speak of “adaptation” and have far more instruments with which to adapt. For the rest of us, what pays is to be conservative about the climate and to halt this clumsy, disorderly “experiment” of altering the chemical composition of Earth’s atmosphere and the planetary energy balance! For most of the 7 billion inhabitants of this sphere, climate stability matters!
Some of the richest, in fact, see global warming as an “opportunity”… Of course: the “opportunity” to expand agribusiness into the future arable lands of northern Canada and Siberia, and to drill for oil in the ocean that will open up as the Arctic melts.
So we must recognize that a genuine imposture is on the loose and that science needs defending. A rock is a rock; a tree is a tree; a CO2 molecule is a CO2 molecule, whatever anyone’s ideology. But those of us on the bottom floors will only be able to arm ourselves to transform society if we are well informed, and for that, the absurdities uttered by the detractors of climate science must be fought.
Alexandre Costa holds a bachelor’s degree and a master’s degree in Physics from the Universidade Federal do Ceará and a Ph.D. in Atmospheric Science from Colorado State University, with postdoctoral work at Yale University, and has published in several scientific journals, including Science, the Journal of the Atmospheric Sciences and Atmospheric Research. He is a CNPq research productivity fellow and a member of the Painel Brasileiro de Mudanças Climáticas (Brazilian Panel on Climate Change).
ScienceDaily (Oct. 16, 2009) — Worried about climate change and want to learn more? You probably aren’t watching television then. A new study by George Mason University Communication Professor Xiaoquan Zhao suggests that watching television has no significant impact on viewers’ knowledge about the issue of climate change. Reading newspapers and using the web, however, seem to contribute to people’s knowledge about this issue.
The study, “Media Use and Global Warming Perceptions: A Snapshot of the Reinforcing Spirals,” looked at the relationship between media use and people’s perceptions of global warming. The study asked participants how often they watch TV, surf the Web, and read newspapers. They were also asked about their concern and knowledge of global warming and specifically its impact on the polar regions.
“Unlike many other social issues with which the public may have first-hand experience, global warming is an issue that many come to learn about through the media,” says Zhao. “The primary source of mediated information about global warming is the news.”
The results showed that people who read newspapers and use the Internet more often are more likely to be concerned about global warming and believe they are better educated about the subject. Watching more television, however, did not seem to help.
He also found that individuals concerned about global warming are more likely to seek out information on this issue from a variety of media and nonmedia sources. Other forms of media, such as the Oscar-winning documentary “An Inconvenient Truth” and the blockbuster thriller “The Day After Tomorrow,” have played important roles in advancing the public’s interest in this domain.
Politics also seemed to have an influence on people’s perceptions about the science of global warming. Republicans are more likely to believe that scientists are still debating the existence and human causes of global warming, whereas Democrats are more likely to believe that a scientific consensus has already been achieved on these matters.
“Some media forms have clear influence on people’s perceived knowledge of global warming, and most of it seems positive,” says Zhao. “Future research should focus on how to harness this powerful educational function.”
ScienceDaily (Nov. 21, 2011) — People who believe there is a lot of disagreement among scientists about global warming tend to be less certain that global warming is happening and less supportive of climate policy, researchers at George Mason, San Diego State, and Yale Universities report in a new study published in the journal Nature Climate Change.
A recent survey of climate scientists conducted by researchers at the University of Illinois found near unanimous agreement among climate scientists that human-caused global warming is happening.
This new George Mason University study, however, using results from a national survey of the American public, finds that many Americans believe that most climate scientists actually disagree about the subject.
In the national survey conducted in June 2010, two-thirds of respondents said either that there is a lot of disagreement among scientists about whether or not global warming is happening (45 percent), that most scientists think it is not happening (5 percent), or that they did not know enough to say (16 percent). These respondents were less likely to support climate change policies and more likely to view climate change as a low priority.
By contrast, survey respondents who correctly understood that there is widespread agreement about global warming among scientists were themselves more certain that it is happening, and were more supportive of climate policies.
“Misunderstanding the extent of scientific agreement about climate change is important because it undermines people’s certainty that climate change is happening, which in turn reduces their conviction that America should find ways to deal with the problem,” says Edward Maibach, director of the Center for Climate Change Communication at George Mason University.
Maibach argues that a campaign should be mounted to correct this misperception. “It is no accident that so many Americans misunderstand the widespread scientific agreement about human-caused climate change. A well-financed disinformation campaign deliberately created a myth about there being lack of agreement. The climate science community should take all reasonable measures to put this myth to rest.”
ScienceDaily (Oct. 14, 2010) — Sixty-three percent of Americans believe that global warming is happening, but many do not understand why, according to a national study conducted by researchers at Yale University.
The report titled “Americans’ Knowledge of Climate Change” found that only 57 percent know what the greenhouse effect is, only 45 percent of Americans understand that carbon dioxide traps heat from the Earth’s surface, and just 50 percent understand that global warming is caused mostly by human activities. Large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans cause global warming. Meanwhile, 75 percent of Americans have never heard of the related problems of ocean acidification or coral bleaching.
However, many Americans do understand that emissions from cars and trucks and the burning of fossil fuels contribute to global warming and that a transition to renewable energy sources is an important solution.
Americans also recognize their own limited understanding. Only 1 in 10 say that they are “very well-informed” about climate change, and 75 percent say they would like to know more about the issue. Likewise, 75 percent say that schools should teach children about climate change and 68 percent would welcome a national program to teach Americans more about the issue.
“This study demonstrates that Americans need to learn more about the causes, impacts and potential solutions to global warming,” said study director Anthony Leiserowitz of Yale University. “But it also shows that Americans want to learn more about climate change in order to make up their minds and take action.”
The online survey was conducted by Knowledge Networks from June 24 to July 22, 2010, with 2,030 American adults 18 and older. The margin of sampling error is plus or minus 2 percentage points, at 95 percent confidence.
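That stated margin is consistent with the standard worst-case formula for a sampled proportion (a check of mine, not part of the article):

$$ \mathrm{MOE} = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{2030}} \approx 0.022, $$

i.e., about plus or minus 2 percentage points at 95 percent confidence for the worst case $p = 0.5$.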
ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.
“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”
The study showed that high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.
The diminished concern and sense of responsibility flies in the face of climate change awareness campaigns, such as the movies An Inconvenient Truth and Ice Age: The Meltdown, and of the mainstream media’s escalating emphasis on the trend.
The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.
Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.
“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.
But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.
“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”
Measuring knowledge about global warming is a tricky business, Kellstedt adds.
“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.
“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”
Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.
ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.
Potomac River near Washington DC. The top three issues that the US public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. (Credit: Michele Hogan)
“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”
Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.
A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. In the survey, global warming ranks eighth in importance.
“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.
Additionally, Konisky and his colleagues found that the best predictor of individuals’ environmental preferences is their political attributes. They examined the relationship between party identification and political ideology and support for action to address environmental problems.
“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”
Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.
ScienceDaily (May 8, 2012) — Americans’ support for government action on global warming remains high but has dropped during the past two years, according to a new survey by Stanford researchers in collaboration with Ipsos Public Affairs. Political rhetoric and cooler-than-average weather appear to have influenced the shift, but economics doesn’t appear to have played a role.
The survey directed by Jon Krosnick, a senior fellow at the Stanford Woods Institute for the Environment, shows that support for a range of policies intended to reduce future climate change dropped by an average of 5 percentage points per year between 2010 and 2012.
In a 2010 Stanford survey, more than three-quarters of respondents expressed support for mandating more efficient and less polluting cars, appliances, homes, offices and power plants. Nearly 90 percent of respondents favored federal tax breaks to spur companies to produce more electricity from water, wind and solar energy. On average, 72 percent of respondents supported government action on climate change in 2010. By 2012, that support had dropped to 62 percent.
The drop was concentrated among Americans who distrust climate scientists, even more so among such people who identify themselves as Republicans. Americans who do not trust climate science were especially aware of and influenced by recent shifts in world temperature, and 2011 was tied for the coolest of the last 11 years.
Krosnick pointed out that during the recent campaign, all but one Republican presidential candidate expressed doubt about global warming, and some urged no government action to address the issue. Rick Santorum described belief in climate change as a “pseudo-religion,” while Ron Paul called it a “hoax.” Mitt Romney, the apparent Republican nominee, has said, “I can tell you the right course for America with regard to energy policy is to focus on job creation and not global warming.”
The Stanford-Ipsos study found no evidence that the decline in public support for government action was concentrated among respondents who lived in states struggling the most economically.
The study found that, overall, the majority of Americans continue to support many specific government actions to mitigate global warming’s effect. However, most Americans remain opposed to consumer taxes intended to decrease public use of electricity and gasoline.
It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.
What am I talking about? The third rate status of numerical weather prediction in the U.S. It is a huge story, an important story, but one the media has not touched, probably from lack of familiarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.
Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.
Supercomputers are used for numerical weather prediction.
U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Center for Medium Range Weather Forecasting (ECMWF). We have also fallen behind in ensembles (using many model runs to give probabilistic predictions) and in high-resolution operational forecasting. Decades ago we were the world leader: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong, and I will talk about some of the issues here. Our nation needs to fix it.
But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.
In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).
The Europeans banded together decades ago to form the European Center for Medium-Range Weather Forecasting (ECMWF), which runs a very good global model. Several European countries run regional models as well.
The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).
There are other major global NWP centers, such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, and one in Beijing. All of these centers collect worldwide data and do global NWP.
The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or 4 in quality, resulting in forecasts that are noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. The figure shows the quality of the day-5 forecast of 500 hPa heights (about halfway up in the troposphere, approximately 18,000 ft). The top graph is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (U.S. GFS: black; ECMWF: red; Canadian CMC: blue; UKMET: green; Navy FNG: orange). The bottom graph shows the difference between the skill of the U.S. model and that of the other nations’ models.
You first notice that the forecasts are all getting better. That’s good. But the most skillful forecast (closest to one) is clearly the red one…the European Center’s. The second best is the UKMET Office’s. The U.S. (GFS model) is third…roughly tied with the Canadians.
Here is a global model comparison done by the Canadian Meteorological Center for various global models from 2009-2012 for the 120-hour forecast. This is a plot of error (RMSE, root mean square error), again for 500 hPa, and only for North America. Guess who is best again (lowest error)? The European Center (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
Let’s look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie, and myself comparing various models at 24, 48, and 72 hr for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it: ECMWF.
I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at short range but has about one day of additional predictability: their 8-day forecast is about as skillful as our 7-day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.
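For readers who want the mechanics behind plots like these, here is a minimal sketch of the two standard verification scores being compared: the anomaly correlation of 500 hPa heights (closer to 1 is better) and RMSE (lower is better). The toy fields and grid sizes are invented for illustration; this is not the centers’ verification code.

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    # Correlate forecast and analysis anomalies relative to climatology;
    # 1.0 would be a perfect forecast.
    fa = forecast - climatology
    aa = analysis - climatology
    return (fa * aa).sum() / np.sqrt((fa ** 2).sum() * (aa ** 2).sum())

def rmse(forecast, analysis):
    # Root mean square error: lower is better.
    return np.sqrt(np.mean((forecast - analysis) ** 2))

# Toy 500 hPa geopotential height fields (meters) on a small grid.
rng = np.random.default_rng(1)
climatology = 5500 + 50 * rng.standard_normal((10, 10))
analysis = climatology + 60 * rng.standard_normal((10, 10))  # verifying "truth"
forecast = analysis + 30 * rng.standard_normal((10, 10))     # imperfect forecast

print(anomaly_correlation(forecast, analysis, climatology))
print(rmse(forecast, analysis))
```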
Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussions, which are available online, you will frequently see that the forecasters depend not on the U.S. model but on the ECMWF. And during the January western Washington snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate-related firm that was moving up to Seattle. I asked him which model the company was using: the U.S. GFS? He laughed…of course not; they were using the ECMWF.
A lot of U.S. firms are using the ECMWF, and this is very costly, because the Europeans charge a lot for access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and the NWS uses its global predictions to drive the higher-resolution regional models, which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the costs for the nation have to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.
The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:
1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because it has more resources available for NWP. This is simply ridiculous: the U.S. can afford the processors and disk space it would take. We are talking about millions or tens of millions of dollars at most to have the hardware we need. Part of the problem has been NWS procurement, which is not forward-leaning, relying on heavy-metal IBM machines at very high cost.
2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate the observations to create a good description of the atmosphere. The European Center, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power. We use an older, inferior approach (3DVAR). The Europeans have been using 4DVAR for 20 years! Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet. (A toy sketch of the variational idea appears after this list.)
3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP’s Environmental Modeling Center (EMC) is well known for its isolation and “not invented here” attitude. While the European Center hosts lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited, and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest.) The National Weather Service has invested very little in extramural research, and when its budget is under pressure, university research is the first thing it cuts. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for its needs as well. (Good news…a new building should be available soon.)
4. The NWS approach to weather-related research has been ineffective and divided. Government weather research is NOT in the NWS, but elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the people doing research in support of their mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has marginal benefit for the NWS.
5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, providing lots of excuses, but not making the fundamental changes in organization and priority that could deal with the problem. Lack of resources for NWP is another issue…but that is a decision made by NOAA/NWS/Dept of Commerce leadership.
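To make item 2 above concrete, here is a toy sketch of the variational data assimilation idea: find the atmospheric state that best compromises between a background forecast and new observations by minimizing a weighted cost function. All matrices and values below are invented for the sketch; operational systems work on states with millions of variables. 4DVAR extends this cost function over a time window using the forecast model itself, which is a big part of why it demands so much more computing.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem, invented for illustration only.
x_b = np.array([1.0, 2.0, 3.0])              # background state (prior forecast)
B = np.diag([0.5, 0.5, 0.5])                 # background error covariance
y = np.array([1.4, 2.9])                     # observations
H = np.array([[1.0, 0.0, 0.0],               # observation operator
              [0.0, 0.0, 1.0]])
R = np.diag([0.2, 0.2])                      # observation error covariance

B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)

def J(x):
    # 3D-Var cost: misfit to the background plus misfit to the observations.
    db = x - x_b
    do = y - H @ x
    return db @ B_inv @ db + do @ R_inv @ do

x_a = minimize(J, x_b).x                     # the "analysis" that starts the model
print("analysis:", x_a)
```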
This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.
I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.
Let me note that the above is about the modeling side of the NWS, NOT the many people in the local forecast offices. That part of the NWS is first-rate. They suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP who have lacked the resources and the organization necessary to push forward effectively.
This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, having gained world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the impetus to be the best. We were satisfied to coast. And this attitude must end…in weather prediction, education, and everything else…or we will see our nation sink into mediocrity.
The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.
* * *
Saturday, April 7, 2012
Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)
In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers such as the European Centre for Medium Range Weather Forecasting (ECMWF), the UKMET Office, and the Canadian Meteorological Center (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and to the use of ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research, and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we do not. The cost of inferior weather prediction is huge: in lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.
In this blog, I will describe in some detail one major roadblock in giving the U.S. state-of-the-art weather prediction: inadequate computer resources. This situation should clearly have been addressed years ago by leadership in the National Weather Service, NOAA, and the Dept of Commerce, but has not, and I am convinced will not without outside pressure. It is time for the user community and our congressional representatives to intervene. To quote Samuel L. Jackson, enough is enough. (…)
In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction’s (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high-resolution and ensemble forecasts, and seasonal climate forecasts. The UKMET Office also does regional NWP (England is not a big country!) and regional air-quality modeling. NCEP does all of this plus much, much more (high-resolution rapid-update modeling, hurricane modeling, etc.). And NCEP has to deal with prediction over a continental-size country.
You might expect the U.S. to have a lot more computer power to balance all these responsibilities and tasks; you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4,992 IBM Power6 processors. One computer does the operational work; the other is the backup (research and testing runs are done on the backup). That is about 70 teraflops (trillion floating-point operations per second) for each machine.
NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8,192 much faster processors that achieves 182 teraflops (yes, over twice as fast, and with far fewer tasks to do).
The UKMET office, serving a far, far smaller country, has two newer IBM machines, each with 7680 processors for 175 teraflops per machine.
Here is a figure, produced at NCEP, that compares the relative computer power of NCEP’s machine with the European Centre’s. The shading indicates computational activity, and the x-axis for each represents a 24-hour period. The relative heights allow you to compare computer resources. Not only does the ECMWF have much more computer power, but it uses that power more efficiently, packing useful computations into every available minute.
Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system. You may not believe this, but the specifications were ONLY for a system at least equal to the one they have. A report in a computer magazine suggests that this new system (IBM got the contract) might be slightly less powerful (around 150 teraflops) than one of the UKMET Office systems…but that is not known at this point.
The Canadians? They have TWO machines like the European Centre’s!
So what kind of system does NCEP require to serve the nation in a reasonable way?
To start, we need to double the resolution of our global model to bring it into line with ECMWF (theirs is now 15 km globally). Such resolution allows a global model to capture regional features (such as our mountains). Doubling horizontal resolution requires 8 times more computer power. We need better physics (the description of things like cloud processes and radiation): double again. And we need better data assimilation (better use of observations to provide an improved starting point for the model): double once more. So we need 32 times more computer power for the high-resolution global runs to catch up with ECMWF. Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in…that is what ECMWF does). There are some ways NCEP could work more efficiently as well. Right now NCEP runs our global model out to 384 hours four times a day (every six hours). To many of us this seems excessive; perhaps the longest runs (180 hours plus) could be done twice a day. So let’s begin with a computer 32 times faster than the current one.
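For the record, here is that arithmetic spelled out, using only the factors stated above:

```python
# Scaling factors as stated in the paragraph above.
resolution = 2 * 2 * 2   # doubling horizontal resolution: 2x in x, 2x in y,
                         # and roughly 2x more time steps, for ~8x total
physics = 2              # better model physics
assimilation = 2         # better data assimilation
high_res_factor = resolution * physics * assimilation
print(high_res_factor)   # 32x for the high-resolution global runs
# The ensembles need roughly the same again, though some of that can be
# fit into gaps in the high-resolution schedule, as ECMWF does.
```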
Many workshops and meteorological meetings (such as one on improvements in model physics held at NCEP last summer, which I chaired) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution. The current national ensemble system has a horizontal resolution of about 32 km, and the NWS plans to get to about 20 km in a few years; both are inadequate. Here is an example of the ensemble output (the mean of the ensemble members) for the NWS and UW (4 km) ensemble systems: the difference is huge. The NWS system does not even come close to modeling the impacts of the mountains, and it is similarly unable to simulate large convective systems.
Current NWS (NCEP) “high resolution” ensembles (32 km)
4 km ensemble mean from UW system
Let me make one thing clear. Probabilistic prediction based on ensemble forecasts and reforecasting (running models over past years to build statistics of performance) is the future of weather prediction. The days of giving a single number for, say, temperature at day 5 are over. We need to let people know about uncertainty and probabilities. The NWS needs a massive increase in computer power to do this. It lacks that computer power now and does not seem destined to get it soon.
A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing. He and colleagues have put together a compelling case for more NWS computer resources for NWP. Read it here.
Back-of-the-envelope calculations indicate that a good first step, 4-km national ensembles, would require about 20,000 processors to run in a timely manner, but it would revolutionize weather prediction in the U.S., including forecasting of convection and in mountainous areas. This high-resolution ensemble effort would meld with data assimilation over the long term.
And then there is running super-high-resolution numerical weather prediction to get the fine-scale details right. Here in the Northwest my group runs a 1.3 km horizontal resolution forecast out to 48 hours twice a day. Such capability is needed for the entire country. It does not exist now due to inadequate computer resources.
The bottom line is that the NWS numerical modeling effort needs a huge increase in computer power to serve the needs of the country, and the potential impacts would be transformative. We could go from having a third-place effort, one slipping further back into the pack, to being a world leader. Furthermore, the added computer power would finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems. That this is barely done today is really amazing, and it risks wasting hundreds of millions of dollars on unnecessary satellite systems.
But to do so will require a major jump in computational power, a jump our nation can easily afford. I would suggest that the NWS’s EMC begin by securing at least a 100,000-processor machine, and down the road something considerably larger. Keep in mind that my department has about 1,000 processors in our computational clusters, so this is not as large a leap as you might think.
For a country with several billion-dollar weather disasters a year, the case for investing in reasonable computer resources for NWP is obvious.
The cost? Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote: using fast AMD chips, you could have such a 100K-core machine for 11 million dollars (and that is without any discount!). OK, this is the U.S. government, and they like expensive, heavy-metal machines…let’s go for 25 million dollars. The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors at a cost of around 25-35 million dollars. NCEP will want two machines, so let’s budget 60 million dollars. We spend this much money on a single jet fighter, but we can’t invest this amount to greatly improve forecasts and public safety in the U.S.? We have machines far larger than this for breaking codes, simulating thermonuclear explosions, and simulating climate change.
Yes, a lot of money, but I suspect the cost of the machine would be paid back in a few months from improved forecasts. Last year we had quite a few (over ten) billion-dollar storms….imagine the benefits of forecasting even a few of them better. Or the benefits to the wind energy and utility industries, or U.S. aviation, of even modestly improved forecasts. And there is no doubt such computer resources would improve weather prediction. The list of benefits is nearly endless. Recent estimates suggest that normal weather events cost the U.S. economy nearly 1/2 trillion dollars a year. Add to that hurricanes, tornadoes, floods, and other extreme weather. The business case is there.
As someone with an insider’s view of the process, it is clear to me that the current players are not going to move effectively without some external pressure. In fact, the budgetary pressure on the NWS is very intense right now, and they are cutting away muscle and bone at this point (like reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research). I believe it is time for weather-sensitive industries and local governments, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon. We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction…which can clearly save lives and help the economy. Enough is enough.
“Best practices” is the worst practice. The idea that we should examine successful organizations and then imitate what they do if we also want to be successful is something that first took hold in the business world but has now unfortunately spread to the field of education. If imitation were the path to excellence, art museums would be filled with paint-by-number works.
The fundamental flaw of a “best practices” approach, as any student in a half-decent research-design course would know, is that it suffers from what is called “selection on the dependent variable.” If you only look at successful organizations, then you have no variation in the dependent variable: they all have good outcomes. When you look at the things that successful organizations are doing, you have no idea whether each one of those things caused the good outcomes, had no effect on success, or was actually an impediment that held organizations back from being even more successful. An appropriate research design would have variation in the dependent variable; some have good outcomes and some have bad ones. To identify factors that contribute to good outcomes, you would, at a minimum, want to see those factors more likely to be present where there was success and less so where there was not.
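A toy simulation, with invented numbers, makes the point: give a “practice” zero causal effect, look only at successful organizations, and it will still look like a shared secret of success. Only the comparison between successes and failures reveals that the practice does nothing.

```python
# Hypothetical illustration of "selection on the dependent variable."
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
practice = rng.random(n) < 0.8    # 80% of organizations adopt the practice
success = rng.random(n) < 0.1     # success is independent of the practice

# A "best practices" study looks only at the successes:
print(practice[success].mean())   # ~0.80: looks like a shared secret of success

# A proper design compares successes with failures:
print(practice[success].mean() - practice[~success].mean())  # ~0.0: no effect
```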
“Best practices” lacks scientific credibility, but it has been a proven path to fame and fortune for pop-management gurus like Tom Peters, with In Search of Excellence, and Jim Collins, with Good to Great. The fact that many of the “best” companies they featured subsequently went belly-up—like Atari and Wang Computers, lauded by Peters, and Circuit City and Fannie Mae, by Collins—has done nothing to impede their high-fee lecture tours. Sometimes people just want to hear a confident person with shiny teeth tell them appealing stories about the secrets to success.
With Surpassing Shanghai, Marc Tucker hopes to join the ranks of the “best practices” gurus. He, along with a few of his colleagues at the National Center on Education and the Economy, has examined the education systems in some other countries with successful outcomes so that the U.S. can become similarly successful. Tucker coauthors the chapter on Japan, as well as an introductory and two concluding chapters. Tucker’s collaborators write chapters featuring Shanghai, Finland, Singapore, and Canada. Their approach to greatness in American education, as Linda Darling-Hammond phrases it in the foreword, is to ensure that “our strategies must emulate the best of what has been accomplished in public education both from here and abroad.”
But how do we know what those best practices are? The chapters on high-achieving countries describe some of what those countries are doing, but the characteristics they feature may have nothing to do with success or may even be a hindrance to greater success. Since the authors must pick and choose what characteristics they highlight, it is also quite possible that countries have successful education systems because of factors not mentioned at all. Since there is no scientific method to identifying the critical features of success in the best-practices approach, we simply have to trust the authority of the authors that they have correctly identified the relevant factors and have properly perceived the causal relationships.
But Surpassing Shanghai is even worse than the typical best-practices work, because Tucker’s concluding chapters, in which he summarizes the common best practices and draws policy recommendations, have almost no connection to the preceding chapters on each country. That is, the case studies of Shanghai, Finland, Japan, Singapore, and Canada attempt to identify the secrets to success in each country, a dubious-enough enterprise, and then Tucker promptly ignores all of the other chapters when making his general recommendations.
Tucker does claim to be drawing on the insights of his coauthors, but he never actually references the other chapters in detail. He never names his coauthors or specifically draws on them for his conclusions. In fact, much of what Tucker claims as common lessons of what his coauthors have observed from successful countries is contradicted in chapters that appear earlier in the book. And some of the common lessons they do identify, Tucker chooses to ignore.
For example, every country case study in Surpassing Shanghai, with the exception of the one on Japan coauthored by Marc Tucker, emphasizes the importance of decentralization in producing success. In Shanghai the local school system “received permission to create its own higher education entrance examination. This heralded a trend of exam decentralization, which was key to localized curricula.” The chapter on Finland describes the importance of the decision “to devolve increasing levels of authority and responsibility for education from the Ministry of Education to municipalities and schools…. [T]here were no central initiatives that the government was trying to push through the system.” Singapore is similarly described: “Moving away from the centralized top-down system of control, schools were organized into geographic clusters and given more autonomy…. It was felt that no single accountability model could fit all schools. Each school therefore set its own goals and annually assesses its progress toward meeting them…” And the chapter on Canada teaches us that “the most striking feature of the Canadian system is its decentralization.”
Tucker makes no mention of this common decentralization theme in his conclusions and recommendations. Instead, he claims the opposite as the common lesson of successful countries: “students must all meet a common basic education standard aligned to a national or provincial curriculum… Further, in these countries, the materials prepared by textbook publishers and the publishers of supplementary materials are aligned with the national curriculum framework.” And “every high-performing country…has a unit of government that is clearly in charge of elementary and secondary education…In such countries, the ministry has an obligation to concern itself with the design of the system as a whole…”
Conversely, Tucker emphasizes that “the dominant elements of the American education reform agenda” are noticeably absent from high-performing countries, including “the use of market mechanisms, such as charter schools and vouchers….” But if Tucker had read the chapter on Shanghai, he would have found a description of a system by which “students choose schools in other neighborhoods by paying a sponsorship fee. It is the Chinese version of school choice, a hot issue in the United States.” And although the chapter on Canada fails to make any mention of it, Canada has an extensive system of school choice, offering options that vary by language and religious denomination. According to recently published research by David Card, Martin Dooley, and Abigail Payne, competition among these options is a significant contributor to academic achievement in Canada.
There is a reason that promoters of best-practices approaches are called “gurus.” Their expertise must be derived from a mystical sphere, because it cannot be based on a scientific appraisal of the evidence. Marc Tucker makes no apology for his nonscientific approach. In fact, he denounces “the clinical research model used in medical research” when assessing education policies. The problem, he explains, is that no country would consent to “randomly assigning entire national populations to the education systems of another country or to certain features of the education system of another country.” On the contrary, countries, states, and localities can and do randomly assign “certain features of the education system,” and we have learned quite a lot from that scientific process. In the international arena, Tucker may want to familiarize himself with the excellent work being done by Michael Kremer and Karthik Muralidharan utilizing random assignment around the globe.
In addition, social scientists have developed practices to observe and control for differences in the absence of random assignment that have allowed extensive and productive analyses of the effectiveness of educational practices in different countries. In particular, the recent work of Ludger Woessmann, Martin West, and Eric Hanushek has utilized the PISA and TIMSS international test results that Tucker finds so valuable, but they have done so with the scientific methods that Tucker rejects. Even well-constructed case study research, like that done by Charles Glenn, can draw useful lessons across countries. The problem with the best-practices approach is not entirely that it depends on case studies, but that by avoiding variation in the dependent variable it prevents any scientific identification of causation.
Tucker’s hostility to scientific approaches is more understandable, given that his graduate training was in theater rather than a social science. Perhaps that is also why Tucker’s book reminds me so much of The Music Man. Tucker is like “Professor” Harold Hill come to town to sell us a bill of goods. His expertise is self-appointed, and his method, the equivalent of “the think system,” is obvious quackery. And the Gates Foundation, which has for some reason backed Tucker and his organization with millions of dollars, must be playing the residents of River City, because they have bought this pitch and are pouring their savings into a band that can never play music except in a fantasy finale.
Best practices really are the worst.
Jay P. Greene is professor of education reform at the University of Arkansas and a fellow at the George W. Bush Institute.
Surpassing Shanghai: An Agenda for American Education Built on the World’s Leading Systems Edited by Marc Tucker Harvard Education Press, 2011, $49.99; 288 pages.
by G M Peter Swann [gmpswann@yahoo.co.uk]
World Economics Association Newsletter 2(2), April 2012, page 6.
In the February issue of this newsletter, Steve Keen (2012) makes some very good points about the use of mathematics in economics. Perhaps we should say that the problem is not so much the use of mathematics as the abuse of mathematics.
A particular issue that worries me is when econometricians make liberal use of assumptions, without realising how strong these are.
Consider the following example. First, you are shown a regression summary of the relationship between Y and X, estimated from 402 observations. The conventional t-statistic for the coefficient on X is 3.0. How would you react to that?
Most economists would remark that t = 3.0 implies significance at the 1% level, which is a strong confirmation of the relationship. Indeed, many researchers mark significance at the 1% level with three stars!
Second, consider the scatter diagram below. This also shows two variables Y and X, and is also based on 402 observations. What does this say about the relationship between Y and X?
Figure 1
I have shown this diagram to several colleagues and students, and typical reactions are either that there is no relationship, or that the relationship could be almost anything.
But the surprising fact is that the data in Figure 1 are exactly the same data as used to estimate the regression summary described earlier. How can such an amorphous scatter of points represent a statistically significant relationship? It is the result of a standard assumption of OLS regression: that the explanatory variable(s) X is/are independent of the noise term u.
So long as this independence assumption is true, we can estimate the relationship with surprising precision. To see this, rewrite the conventional t-statistic as

t = ψ√(N − k),

where ψ is a signal-to-noise ratio (describing the clarity of the scatter-plot) and N − k is the number of degrees of freedom (Swann, 2012). This formula can be used for bivariate and multivariate models.
In Figure 1, ψ is 0.15, which is quite low, but N − k = 400, which is large enough to make t = 3.0. More generally, even if the signal-to-noise ratio is very low, so that the relationship between Y and X is imperceptible in a scatter-plot, we can always obtain a significant t-statistic, so long as we have a large enough number of observations and the independence assumption is true. But there is something doubtful about this ‘significance’.
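A quick simulation illustrates the point. Assuming ψ is the ratio of the signal’s standard deviation to the noise’s (which is consistent with the formula above), a relationship with ψ ≈ 0.15 that is invisible in a scatter-plot of 402 points still yields t ≈ 3:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N = 402
x = rng.standard_normal(N)
u = rng.standard_normal(N)   # noise independent of x: the key assumption
y = 0.15 * x + u             # signal-to-noise ratio psi of about 0.15

res = stats.linregress(x, y)
t = res.slope / res.stderr
print(f"slope = {res.slope:.3f}, t = {t:.2f}, p = {res.pvalue:.4f}")
# With psi of 0.15 and N - k = 400, t is roughly 0.15 * sqrt(400) = 3.0:
# "significant at the 1% level" despite an amorphous-looking scatter.
```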
Is the independence assumption justified? In a context where data are noisy, where rough proxy variables are used, where endogeneity is pervasive, and so on, it does seem an exceptionally strong assumption.
What happens if we relax the independence assumption? When the signal to noise ratio is very low, the estimated relationship depends entirely on the assumption that replaces it. Swann (2012) shows that the relationship in Figure 1 could indeed be almost anything – depending on what we assume about the noise variable(s).
Some have suggested that this is not a problem in practice, because signal to noise ratios are usually large enough to avoid this difficulty. But, on the contrary, some evidence suggests the problem is generally worse than indicated by Figure 1.
Swann (2012) examined 100 econometric studies taken from 20 leading economics journals, yielding a sample of 2220 parameter estimates and the corresponding signal to noise ratios. Focussing on the parameter estimates that are significant (at the 5% level or better), we find that almost 80% of those have a signal to noise ratio even lower than that in Figure 1.
In summary, it appears that the problem of ‘doubtful significance’ is pervasive. The great majority of ‘significant relationships’ in this sample would be imperceptible from the corresponding scatter-plot. The ‘significance’ indicated by a high t-statistic derives from the large number of observations and the (very strong) independence assumption.
References
Keen S. (2012) “Maths for Pluralist Economics”, World Economics Association Newsletter 2 (1), 10-11
[Editor’s note: If you are interested in this topic, you may also wish to read D.A. Hollanders, “Five methodological fallacies in applied econometrics”, real-world economics review, issue no. 57, 6 September 2011, pp. 115-126, http://www.paecon.net/PAEReview/issue57/Hollanders57.pdf]
By Rob Garnett [r.garnett@tcu.edu]
World Economics Association Newsletter 2(2), April 2012, page 4.
In “Why Pluralism?” (2011), Stuart Birks calls for “greater discussion, deliberation, and cross-fertilization of ideas” among schools of economic thought as an antidote to each school’s autarkic tendency to “see itself as owning the ‘truth’ for its area.” As a philosophical postscript, I want to underscore the catholic reach of Birks’s remarks — his genial reminder, properly addressed to all economists, of the minimal requirements for academic inquiry.
The case for academic pluralism in economics is motivated by the ubiquity of “myside bias” (Klein 2011). Whether methodological, ideological, paradigmatic, or all of the above, such groupthink fuels intellectual segregation and bigotry. It turns schools into echo chambers, sealed off from the critical feedback loops that check hubris and propel scholarly progress.
Pluralists know that “The causes of faction cannot be removed . . . Relief is only to be sought in the means of containing its effects” (Hamilton, Madison, and Jay [1788] 2001, 45). So even as they celebrate paradigmatic diversity, they insist that scholars observe two liberal precepts:
1. academic discourse is a commons, no ‘area’ of which can be owned by any school; and
2. within these spaces of inquiry, scholars bear certain ethical duties as academic citizens.
Academic pluralism is the duty to practice “methodological awareness and toleration” (Backhouse 2001, 163) and “to constantly [seek] to learn from those who [do] not share [one’s] ideological or methodological perspective” (Boettke 2004, 379). It is “academic” because it coincides with the epistemological and ethical norms of modern academic freedom (American Association of University Professors 1940). It is “pluralist” because it entails a commitment to conduct one’s scholarly business in a non-sectarian manner.
Could a critical mass of economists ever be persuaded to enact these scholarly virtues? Yes! But admirers of these virtues must be prepared to teach by example. When Warren Samuels passed away last August, he was eulogized as a first-rate scholar who advanced pluralism by enacting it consistently over his long career. As the Austrian economist Peter Boettke recalls:
Prior to meeting Warren, I think it would be accurate to say that I divided the world neatly into those who are stupid, those who are evil, and those who are smart and good enough to agree with me. . . . Warren destroyed that simple intellectual picture of the world. . . . He didn’t overturn my intellectual commitments . . . but he made [me] more self-critical and less self-satisfied, and hopefully a better scholar [and] teacher (Boettke 2011).
The pluralism Warren Samuels personified can be achieved by most economic scholars, teachers, and students to a reasonable degree. If we want economics to regain its standing as a serious and humane social science, we must find more ways to activate these dormant capabilities.
References
American Association of University Professors (1940) Statement of Principles on Academic Freedom and Tenure. Washington, DC.
Backhouse, R. E. (2001) On the Credentials of Methodological Pluralism. In J. E. Biddle, J. B. Davis, and S. G. Medema (Eds.), Economics Broadly Considered: Essays in Honor of Warren J. Samuels, 161-181. London: Routledge.
Boettke, P. J. (2004) Obituary: Don Lavoie (1950-2001). Journal of Economic Methodology 11 (3): 377-379.
Birks, S. (2011) “Why Pluralism?” World Economics Association Newsletter, vol. 1, no. 1.
Hamilton, A., Madison, J., and Jay, J. (2001) [1788] The Federalist. Gideon edition. G. W. Carey and J. McClellan (eds.) Indianapolis, IN: Liberty Fund.
Klein, D. B. (2011) “I Was Wrong, and So Are You.” The Atlantic, December.
[Editor’s note: Readers may also be interested in Garnett, R. F. (Ed.). (1999). What do economists know? London: Routledge]
Today’s guest blog post is by cultural anthropologist and AAA member, Chad Huddleston. He is an Assistant Professor at St. Louis University in the Sociology, Anthropology and Criminal Justice department.
Recently, a host of new shows, such as Doomsday Preppers on NatGeo and Doomsday Bunkers on Discovery Channel, has focused on people with a wide array of concerns about possible events that may threaten their lives. Both of these shows focus on what are called ‘preppers.’ While people who engaged in these behaviors in the past might have been called ‘survivalists,’ many ‘preppers’ have distanced themselves from that term due to its cultural baggage: stereotypically anti-government, gun-loving, racist extremists, most often associated with the fundamentalist (politically and religiously) right side of the spectrum.
I’ve been doing fieldwork with preppers for the past two years, focusing on a group called Zombie Squad. It is ‘the nation’s premier non-stationary cadaver suppression task force,’ as well as a grassroots 501(c)(3) charity organization. Zombie Squad’s story is that while the zombie-removal business is generally slow, there is no reason to be unprepared. So, while it waits for the “zombpocalypse,” it focuses its time on disaster-preparedness education for its membership and community.
The group’s position is that being prepared for zombies means that you are prepared for anything, especially those events that are much more likely than a zombie uprising – tornadoes, an interruption in services, ice storms, flooding, fires, and earthquakes.
For many in this group, Hurricane Katrina was the event that solidified their resolve to prep. They saw what we all saw: a natural disaster in which services were not available for most, leading to violence, death and chaos. Their argument is that the more prepared the public is before a disaster occurs, the fewer resources people will require from first responders and the agencies that come after them.
In fact, instead of being a victim of natural disaster, you can be an active responder yourself, if you are prepared. Prepare they do. Members are active in gaining knowledge of all sorts – first aid, communications, tactical training, self-defense, first responder disaster training, as well as many outdoor survival skills, like making fire, building shelters, hunting and filtering water.
This education happens individually, feeding directly into the online forum the group maintains (which has just under 30,000 active members from all over the world); through monthly local meetings all over the country; and at annual national gatherings in southern Missouri, where members socialize, learn survival skills and practice sharpshooting.
Sound like those survivalists of the past? Emphatically no. Zombie Squad’s message is one of public education and awareness, very successful charity drives for a wide array of organizations, and inclusion of all ethnicities, genders, religions and politics. Yet the group is adamant about leaving politics and religion out of discussions of the group and of prepping. You will not find exclusionary language on its forum or in its media. That is not to say that individuals in the group do not have opinions on one side or the other of these issues, but those issues are not to be discussed within the Zombie Squad community.
Considering that the fears pushed by the shows mentioned above usually involve protecting yourself first from the disaster and then from the other people who have survived it, Zombie Squad is a refreshing twist on the ‘prepper’ discourse. After all, if a natural disaster were to befall your region, whom would you rather have knocking at your door: ‘raiders’ or your neighborhood Zombie Squad member?
And the answer is no: they don’t really believe in zombies.
FORGOTTEN STORIES OF BUENOS AIRES: A MAN CLAIMED TO HAVE INVENTED THE RAIN MACHINE
It happened on January 2, 1939, when an engineer named Juan Baigorri assured the director of Meteorology that he would make it rain over the city. And it rained.
Héctor Gambini. FROM THE CLARIN NEWSROOM.
Monday, June 17, 2002
“In response to the censure of my procedure, I offer, through Crítica, the gift of a rainfall for Buenos Aires on January 2, 1939.” The statement ran in the paper at the end of 1938, a public challenge to the director of the National Meteorological Service, for whom its author was nothing but a fraud: a provocateur of an engineer who claimed to have invented a machine that made rain.
When January 1 arrived, the people of Buenos Aires had the challenge so much on their minds that they clinked their glasses at midnight with their eyes fixed on a clear sky. The day turned out so hot and humid that even sitting under the arbor to watch the scrawny clouds pass over Buenos Aires was a tiring pastime. Night came, and nothing.
On the morning of the 2nd, the city went back to work. Still nothing; no trace of rain. But there was not enough wind to stir a rose petal, and the sickly little white clouds of the previous afternoon were gaining body and color. First leaden gray, then shading toward black, darker and darker, until a breeze as faint as a sigh appeared out of nowhere carrying a breath of suspended moisture. Droplets too light to reach the ground. Then finer drops behind them, already touching the asphalt. Then drops as fat as gnocchi, drawing patterns in the forming puddles. Right away, an electrical storm and a violent downpour, a cataract falling from the sky while Crítica stopped the presses to come out at noon with the lead headline of its fifth edition in catastrophe type: “As Baigorri forecast, it rained today,” under a kicker summing up what had just happened in Buenos Aires: “Baigorri got three million people to turn their eyes to the sky.”
This Baigorri had been born in Entre Ríos at the end of the previous century. The son of a military officer and friend of General Roca, he came to Buenos Aires for secondary school at the Colegio Nacional. After graduating he traveled to Italy to study geophysics, earning his engineering degree at the University of Milan.
In those years, the early 1930s, he began traveling the world under contract to various oil companies. He worked in countries across Europe, Asia and Africa, and also in the United States, from which he returned under contract to YPF.
He settled in Caballito with his wife and son. Along with the family’s belongings, he had a device with extendable antennas brought from the airport, which he kept jealously guarded in a closet. “I have more or less adapted to Buenos Aires, but there is a lot of humidity,” he complained.
One morning he made up his mind. He took some instruments and went around measuring the humidity in the city’s neighborhoods. He stopped in front of a house at Araujo and Falcón, in Villa Luro. The needles indicated it was the highest zone of any he had covered. He bought that house, which had an attic perfect for a laboratory.
There the strange machine went on being “developed”: a device that, according to Baigorri, made the sky break into rain whenever he switched it on. By his account it worked through electromagnetism, concentrating clouds within the device’s area of influence.
It was 1938, and the papers were full of the recent suicides of Leopoldo Lugones and Alfonsina Storni, and of the fraud in the parliamentary elections that had President Roberto Ortiz on the verge of resigning. River Plate was inaugurating the Monumental stadium.
Baigorri wanted to prove he could command the rain, and he sought the sponsorship of the Central Argentine Railway. The English manager heard the proposal and smiled slyly. “And you could do this anywhere?” he asked, stumbling over his Spanish. Baigorri said yes, and the Englishman issued a sarcastic challenge: “Fine, make it rain in Santiago del Estero.”
Off the engineer went with his strange machine, accompanied by an agricultural expert sent along to check on him. A few days later they returned, and the expert certified that on a ranch near a town called Estación Pinto, Baigorri had set to work, and eight hours later it rained.
His fame began to grow, and it arrived with him, by train, in Buenos Aires. Two journalists from The Times of London even traveled over to interview him. In the other corner, the engineer Calmarini, director of Meteorology, declared the whole thing a vile hoax or, at best, a product of chance.
Taking advantage of the controversy, with the subject on everyone’s lips, Crítica went to interview Baigorri. Out of that came the challenge for January 2. When Meteorology kept silent, the engineer raised the stakes: he sent the national official an umbrella as a gift. Attached to the package was a card: “So you can use it on January 2.” That was the day the people of Buenos Aires stayed up watching the sky, waiting for the rain.
Baigorri began traveling through the interior, “making it rain” with his machine in one town after another, with mixed results.
In 1951 he served as an unpaid adviser to the Ministry of Technical Affairs. The following year he dusted off his old invention and traveled to La Pampa. He arrived, switched on the battery, and it began to rain, though by then people doubted his merits: “It was going to rain anyway,” they said.
Baigorri withdrew into a long silence. Widowed by then, he spent hours in the attic in Villa Luro. Leonor, the woman who lives in that house today, told Clarín: “Every time it rained, people would surround the house and stare up at the attic.” In that same attic Baigorri refused to receive an emissary who claimed to come on behalf of an American businessman wanting to buy the formula. “My invention is Argentine and will be for the exclusive benefit of Argentines,” he answered.
Old and alone, he sold the house and moved in with a French friend, who lent him a room in an apartment. He died in the autumn of 1972, exactly 30 years ago. He was 81 and had arrived at the hospital alone, with bronchial trouble.
No one ever heard of the strange antenna machine again. Nor whether Baigorri left behind a secret successor to switch it on as a tribute during his own funeral: as he was being buried in Chacarita cemetery, the rain came pouring down.
Crowds and haze in Shanghai. (Jeremy Vandel via Flickr)
Forty years after its initial publication, a study called The Limits to Growth is looking depressingly prescient. Commissioned by an international think tank called the Club of Rome, the 1972 report found that if civilization continued on its path toward increasing consumption, the global economy would collapse by 2030. Population losses would ensue, and things would generally fall apart.
The study was — and remains — nothing if not controversial, with economists doubting its predictions and decrying the notion of imposing limits on economic growth. Australian researcher Graham Turner has examined its assumptions in great detail during the past several years, and apparently his latest research falls in line with the report’s predictions, according to Smithsonian Magazine. The world is on track for disaster, the magazine says.
The study, initially completed at MIT, relied on several computer models of economic trends and estimated that if things didn’t change much, and humans continued to consume natural resources apace, the world would eventually run out of them. Oil would peak (some argue it already has) before sliding down the other side of the bell curve, while demand for food and services would only continue to rise. Turner says real-world data from 1970 to 2000 track with the study’s draconian predictions: “There is a very clear warning bell being rung here. We are not on a sustainable trajectory,” he tells Smithsonian.
Is this impossible to fix? No, according to both Turner and the original study. If governments enact stricter policies and technologies can be improved to reduce our environmental footprint, economic growth doesn’t have to become a market white dwarf, marching toward inevitable implosion. But just how to do that is another thing entirely.
These geological deposits make the Bighorn Basin area of Wyoming ideal for studying the PETM. (Credit: Aaron Diefendorf)
ScienceDaily (Apr. 2, 2012) — A series of global warming events called hyperthermals that occurred more than 50 million years ago had an origin similar to that of the period’s much larger hyperthermal, the Paleocene-Eocene Thermal Maximum (PETM), new research has found. The findings, published in Nature Geoscience online on April 1, 2012, represent a breakthrough in understanding the major “burp” of carbon, equivalent to burning the entire reservoir of fossil fuels on Earth, that occurred during the PETM.
“As geologists, it unnerves us that we don’t know where this huge amount of carbon released in the PETM comes from,” says Will Clyde, associate professor of Earth sciences at the University of New Hampshire and a co-author on the paper. “This is the first breakthrough we’ve had in a long time. It gives us a new understanding of the PETM.” The work confirms that the PETM was not a unique event – the result, perhaps, of a meteorite strike – but a natural part of Earth’s carbon cycle.
Working in the Bighorn Basin region of Wyoming, a 100-mile-wide area whose semi-arid climate and stratified rocks make it ideal for studying the PETM, Clyde and lead author Hemmo Abels of Utrecht University in the Netherlands found the first evidence of the smaller hyperthermal events on land. Previously, the only evidence of such events came from marine records.
“By finding these smaller hyperthermal events in continental records, it secures their status as global events, not just an ocean process. It means they are atmospheric events,” Clyde says.
Their findings confirm that the carbon released during the PETM had the same origin as the carbon released during the era’s smaller hyperthermals. In addition, the ratio of warming to carbon release is similar across the PETM and the other hyperthermals, which the authors interpret as an indication that the same mechanism of carbon release operated during all hyperthermals, including the PETM.
“It points toward the fact that we’re dealing with the same source of carbon,” Clyde says.
Working in two areas of the Bighorn Basin just east of Yellowstone National Park – Gilmore Hill and Upper Deer Creek – Clyde and Abels sampled rock and soil to measure carbon isotope records. They then compared these continental recordings of carbon release to equivalent marine records already in existence.
During the PETM, temperatures rose between five and seven degrees Celsius in approximately 10,000 years — “a geological instant,” Clyde calls it. This rise in temperature coincided exactly with a massive global change in mammals, as land bridges opened up connecting the continents. Prior to the PETM, North America had no primates, ancient horses, or split-hoofed mammals like deer or cows.
Scientists look to the PETM for clues about the current warming of Earth, although Clyde cautions that “Earth 50 million years ago was very different than it is today, so it’s not a perfect analog.” While scientists still don’t fully understand the causes of these hyperthermal events, “they seem to be triggered by warming,” Clyde says. It’s possible, he says, that less dramatic warming events destabilized these large amounts of carbon, releasing them into the atmosphere where they, in turn, warmed the Earth even more.
“This work indicates that there is some part of the carbon cycle that we don’t understand, and it could accentuate global warming,” Clyde says.