Tag archive: Climate

Security Risks of Extreme Weather and Climate Change (Science Daily)

Feb. 11, 2013 — A Harvard researcher is pointing toward a new reason to worry about the effects of climate change — national security.

Hurricane Katrina. Predicted changes in extremes include more record high temperatures; fewer but stronger tropical cyclones; wider areas of drought and increases in precipitation; increased climate variability; Arctic warming and attendant impacts; and continued sea level rise as greenhouse warming continues and even accelerates. (Credit: NOAA)

A new report co-authored by Michael McElroy, the Gilbert Butler Professor of Environmental Studies, and D. James Baker, a former administrator of the National Oceanic and Atmospheric Administration, connects global climate change, extreme weather, and national security. During the next decade, the report concludes, climate change could have wide-reaching effects on everything from food, water, and energy supplies to critical infrastructure and economic security.

“Over the last century, the trend has been toward urbanization — to concentrate people in smaller areas,” McElroy said. “We’ve built an infrastructure — whether it’s where we build our homes or where we put our roads and bridges — that fits with that trend. If the weather pattern suddenly changes in a serious way, it could create very large problems. Bridges may be in the wrong place, or sea walls may not be high enough.”

Possible effects on critical infrastructure, however, only scratch the surface of the security concerns.

On an international scale, the report points to recent events, such as flooding in Pakistan and sustained drought in eastern Africa, that may be tied to changing weather patterns. How the United States responds to such disasters — whether by delivering humanitarian aid or through technical support — could affect security.

“By recognizing the immediacy of these risks, the U.S. can enhance its own security and help other countries do a better job of preparing for and coping with near-term climate extremes,” Baker said.

The report suggests that climate changes could even have long-reaching political effects.

It’s possible, McElroy said, that climate changes may have contributed to the uprisings of the Arab Spring by causing a rise in food prices, or that the extended drought in northern Mexico has contributed to political instability and a rise in drug trafficking in the region.

“We don’t have definitive answers, but our report raises these questions, because what we are saying is that these conditions are likely to be more normal than they were in the past,” McElroy said. “There are also questions related to sea-level rise. The conventional wisdom is that sea level is rising by a small amount, but observations show it’s rising about twice as fast as the models suggested. Could it actually go up by a large amount in a short period? I don’t think you can rule that out.”

Other potential effects, McElroy said, are tied to changes in an atmospheric circulation pattern called the Hadley circulation, in which warm tropical air rises, resulting in tropical rains. As the air moves to higher latitudes, it descends, causing the now-dry air to heat up. Regions where the hot, dry air returns to the surface are typically dominated by desert.

The problem, he said, is that evidence shows those arid regions are expanding.

“The observational data suggest that the Hadley circulation has expanded by several degrees in latitude,” McElroy said. “That’s a big deal, because if you shift where deserts are by just a few degrees, you’re talking about moving the southwestern desert into the grain-producing region of the country, or moving the Sahara into southern Europe.”

The report is the result of the authors’ involvement with Medea, a group of scientists who support the U.S. government by examining declassified national security data useful for scientific inquiry. In recent decades, the group has worked with officials in the United States and Russia to declassify data on climatic conditions in the Arctic and thousands of spy satellite images. Those images have been used to study ancient settlement patterns in the Middle East and changes in Arctic ice.

“I would be reluctant to say that our report is the last word on short-term climate change,” McElroy said. “Climate change is a moving target. We’ve done an honest, useful assessment of the state of play today, but we will need more information and more hard work to get it right. One of the recommendations in our report is the need for a serious investment in measurement and observation. It’s really important to keep doing that, otherwise we’re going to be flying blind.”

The study was conducted with funds provided by the Central Intelligence Agency. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the view of the CIA or the U.S. government.

Report: Climate Extremes: Recent Trends with Implications for National Security at www.environment.harvard.edu/climate-extremes

According to experts, the pre-salt layer should bring economic and scientific benefits to Brazil (Jornal da Ciência)

JC e-mail 4665, February 15, 2013.

Viviane Monteiro

The country cannot miss the opportunity to use oil royalties to invest in education and in scientific research

Despite the environmental risks, exploiting the oil of the pre-salt layer should, in the long run, secure new levels of economic, scientific, and technological development for the country. That is the prevailing opinion among petroleum specialists and researchers at the Instituto Alberto Luiz Coimbra de Pós-Graduação e Pesquisa de Engenharia of the Universidade Federal do Rio de Janeiro (Coppe-UFRJ), the Universidade Federal do Espírito Santo (UFES), and the Escola Politécnica of the Universidade de São Paulo (Poli-USP).

In their view, Brazil cannot miss the opportunity to exploit the pre-salt, nor the opportunity to use the royalties from the oil extracted from this deep layer to invest in education and in scientific and technological research. One goal of these investments should be the production of clean, renewable energy to replace fossil fuels in the “post-oil” period, expected to arrive within roughly the next five decades.

With the exploitation of the pre-salt, the director of technology and innovation at Coppe/UFRJ, Segen Estefen, says that Brazil should become one of the world leaders in producing cutting-edge technologies, both for oil exploration and for the development of clean, renewable energy. Pre-salt exploitation, he believes, represents a window of opportunity for Brazil to rank among the world’s largest oil producers, joining the leading “platoons” of the Organization of the Petroleum Exporting Countries (OPEC).

Over the last two years, the country moved from 18th to 13th place in the ranking of oil producers, according to the “Statistical Review of World Energy 2011” report by the British company British Petroleum (BP). With the discovery of the pre-salt deposits, estimates for Brazil’s oil reserves grew from 8 billion barrels, around 2006, to between 60 billion and 70 billion barrels today. Running the numbers, Segen calculates that these figures would represent revenue of US$ 4 trillion for the country, taking into account the current price of a barrel of oil (US$ 100). That is, an amount similar to the country’s current Gross Domestic Product (GDP) of R$ 4.143 trillion in 2011.

On the subject of oil reserves, researcher and professor Eustáquio Vinícius de Castro, of the Petroleum Laboratory at UFES, agrees that the pre-salt will place Brazil among the five largest oil producers in the world, alongside Saudi Arabia, the United States, and Venezuela. “The technology to be developed for pre-salt exploitation should also be extended to other areas, especially the metalworking and environmental chemistry industries,” he says.

As an example, Castro cites drilling equipment for ultra-deep areas capable of withstanding high pressures, which could also be used in civil construction; and chemical agents (additives) that must be present in the equipment used to remove impurities and purify pre-salt oil. “These additives can even be used to purify wastewater generated by paint manufacturers and to clean up rivers or urban sewage,” he adds.

Norwegian model – Also a proponent of pre-salt exploitation, professor Ricardo Cabral de Azevedo, of the Department of Mining and Petroleum Engineering at Poli/USP, advises Brazil to adopt Norway’s model for extracting pre-salt oil and to avoid the so-called “Dutch disease.” “Other countries that have had large reserves to explore and produce are examples of what we should or should not do in Brazil,” he explains. “The Netherlands, for example, suffered what became known as ‘Dutch disease,’ because its economy grew excessively dependent on oil. Norway, by contrast, transformed itself radically and today is one of the countries with the highest HDI [Human Development Index] in the world,” he recalls.

Until then, Norway had been one of the poorest countries in Europe, its finances depending mainly on commodity exports such as ores and canned fish. The turnaround in the Norwegian economy began in 1969, when large oil reserves were discovered in the North Sea and the revenue was directed mainly to health and education. Today, that European country has the third-highest per capita income in the world (US$ 59,300) and the highest HDI on the planet.

Royalties for education and ST&I – Thus, to meet the challenges of extracting oil from Brazil’s pre-salt layer, the specialists stress the need to allocate a significant share of the revenue from this activity to education, science, technology, and innovation, following the Norwegian model. This, indeed, is a banner raised by the scientific community, represented by the Sociedade Brasileira para o Progresso da Ciência (SBPC).

The specialists are unanimous in affirming that the country needs to take advantage of the pre-salt riches to reach new levels of development, make a leap in the quality of education, and improve its human capital, remembering that one day the oil reserves will run out. “We must remember that these reserves are very large, but finite,” warns Azevedo. “It is up to us to turn them into a permanent legacy by investing in education and in the development of our country.”

The director of technology and innovation at Coppe/UFRJ, Estefen, adds that the country needs to prepare the ground, in scientific and technological research, for the post-oil period. In this regard, he considers it essential to secure investments to considerably expand research and scientific studies on technologies for producing clean, renewable energy, recalling that several countries are making an effort to reduce emissions in the medium term. It is worth noting that oil is a fossil fuel that contributes significantly to the greenhouse effect.

For UFES researcher Castro, who views favorably the proposal to create a pre-salt fund (a sovereign fund), to which half of the revenue from oil extracted from ultra-deep waters would be allocated for education, pre-salt exploitation must be intelligent, with environmental responsibility and investment in education. “Oil brings great wealth, but it can also bring great poverty and great environmental damage,” he recalls. “That is why exploitation has to be done intelligently, with environmental responsibility and investment in education.” Today, oil wealth is distributed to states, municipalities, and the federal government through royalties. Under current law, the funds must be invested in the country’s social needs, “but city governments make poor use of the money,” he says.

Exploiting the pre-salt requires scientific and technological effort, considering that the reservoirs lie nearly seven thousand meters below sea level, notably in the Santos (SP) and Campos (RJ) Basins. To meet these challenges, Estefen says the country needs to mobilize the national scientific community and its available knowledge, create new laboratories, train human capital, and generate quality jobs. “Extracting pre-salt oil will demand a major technological effort, an effort that will help Brazil reach new levels of development in the future,” he says. “That is, if we use the pre-salt resources well, we will educate our children and develop industry, science, and technology. If it follows this prescription, Brazil should stand out on the international stage as one of the technological leaders, a group that includes the United States, Japan, and the European countries,” he concludes.

Researchers weigh the environmental costs of deep-water exploration
Weighing the benefits that pre-salt riches can bring the country against the potential environmental costs, researchers conclude that Brazil cannot unilaterally renounce deep-water oil exploration, even while recognizing that burning oil contributes to global warming. This does not mean that the pre-salt exploitation process should disregard environmental damage.

The director of technology and innovation at Coppe/UFRJ, Segen Estefen, insists that all ongoing research has environmental protection in view, in an effort to make operations safer. “It makes no sense for Brazil to benefit from oil for three or four decades but leave the country in a bad environmental situation,” he explains.

Today, Coppe researchers, for example, work simultaneously on topics tied to present-day oil production and on other technologies that can be used in the “post-oil” era. Among other things, they study generating electricity from ocean waves (a clean, renewable energy), taking advantage of the very infrastructure built and financed by the oil industry to develop knowledge for the post-oil period.

For the Coppe/UFRJ specialist, Brazil cannot renounce pre-salt oil because it “is an important source of wealth for Brazil,” being a competitive energy source. Extraction of the pre-salt, he adds, should therefore yield positive results for the country. “In Brazil, where there is still so much inequality, we cannot give up these riches,” he says. “If they are not exploited, in 50 years the price of oil may not be worth half its current value.” For now, Estefen adds, no fuel exists that can replace oil, nor is one forecast for roughly the next 20 years. Moreover, demand for this energy tends to grow considerably with population growth and rising demand from other countries, especially in Asia.

Expressing the same view, professor Ricardo Cabral de Azevedo, of the Department of Mining and Petroleum Engineering at the Escola Politécnica of the Universidade de São Paulo (Poli/USP), considers it ideal for the country to invest in the knowledge needed to gradually replace fossil fuels, in an effort to minimize environmental impacts. “The fact is that there will always be risks, in this or any other activity, but human beings still need oil,” he notes. “So the essential thing is to try to reduce those risks as much as possible. Here, too, past experience is fundamental, so that we learn from mistakes already made.”

The potential socioeconomic returns from pre-salt oil exploitation outweigh the environmental risks, in the view of researcher and professor Eustáquio Vinícius de Castro, of the Petroleum Laboratory at the Universidade Federal do Espírito Santo (UFES). “They are worth it as long as things are done intelligently and sustainably, with rationality in the production process,” he says. “Today, oil companies, which in the past were more polluting, operate with greater safety in the oil extraction process, even if problems still occur from time to time.”

Scale – The pre-salt oil layer occupies an area roughly 800 km long by 200 km wide, following the coastline of southeastern Brazil. According to Petrobras, more than 80 wells have been drilled since 2006 in the Santos and Campos Basins, with an “exploratory success” rate above 80%. Another 19 new platforms are expected to enter operation by 2016, and 19 more by 2020. According to its press office, the oil company, the leader in pre-salt exploration, has also ordered 21 production platforms and 28 offshore drilling rigs to be built in Brazil by 2020, in addition to 49 tankers and hundreds of offshore support and service vessels.

Statistical Physics Offers a New Way to Look at Climate (Science Daily)

Mar. 5, 2013 — Statistical physics offers an approach to studying climate change that could dramatically reduce the time and brute-force computing that current simulation techniques require. The new approach focuses on fundamental forces that drive climate rather than on “following every little swirl” of water or air.

Two views, two approaches to simulation. Computer-generated images of a planet’s “zonal velocity” (the west-to-east component of wind) use direct numerical simulation (the traditional approach, left) and direct statistical simulation. The latter has limits, but its development is at a very early stage. (Credit: Marston lab/Brown University)

Scientists are using ever more complex models running on ever more powerful computers to simulate Earth’s climate. But new research suggests that basic physics could offer a simpler and more meaningful way to model key elements of climate.

The research, published in the journal Physical Review Letters, shows that a technique called direct statistical simulation does a good job of modeling fluid jets, fast-moving flows that form naturally in oceans and in the atmosphere. Brad Marston, professor of physics at Brown University and one of the authors of the paper, says the findings are a key step toward bringing powerful statistical models rooted in basic physics to bear on climate science.

In addition to the Physical Review Letters paper, Marston will report on the work at a meeting of the American Physical Society to be held in Baltimore later this month.

The simulation method used in climate science today is useful but cumbersome, Marston said. The method, known as direct numerical simulation, amounts to taking a modified weather model and running it through long periods of time. Moment-to-moment weather — rainfall, temperatures, wind speeds at a given moment, and other variables — is averaged over time to arrive at the climate statistics of interest. Because the simulations need to account for every weather event along the way, they are mind-bogglingly complex, take a long time to run, and require the world’s most powerful computers.
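The time-averaging step described above can be sketched with a toy calculation. The snippet below is purely illustrative and assumes a made-up "weather" signal (Gaussian daily temperatures); it is not a climate model, only a picture of how long simulations get reduced to a single climate statistic:

```python
import random

# Toy sketch of the averaging step in direct numerical simulation:
# "weather" here is a noisy daily temperature series, and the "climate"
# statistic of interest is its long-run mean. All numbers are invented.

random.seed(42)

def simulate_daily_temps(days, mean=15.0, spread=8.0):
    """Generate a fake daily-temperature record fluctuating around a fixed mean."""
    return [random.gauss(mean, spread) for _ in range(days)]

weather = simulate_daily_temps(365 * 100)      # a century of simulated daily "weather"
climate_mean = sum(weather) / len(weather)     # averaging recovers the climate statistic
print(round(climate_mean, 1))                  # close to the underlying mean of 15.0
```

The point of the sketch is the cost structure: every daily value must be generated even though only the final average matters, which is exactly the inefficiency direct statistical simulation tries to avoid.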

Direct statistical simulation, on the other hand, is a new way of looking at climate. “The approach we’re investigating,” Marston said, “is the idea that one can directly find the statistics without having to do these lengthy time integrations.”

It’s a bit like the approach physicists use to describe the behavior of gases.

“Say you wanted to describe the air in a room,” Marston said. “One way to do it would be to run a giant supercomputer simulation of all the positions of all of the molecules bouncing off of each other. But another way would be to develop statistical mechanics and find that the gas actually obeys simple laws you can write down on a piece of paper: PV=nRT, the gas equation. That’s a much more useful description, and that’s the approach we’re trying to take with the climate.”
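Marston's gas-law analogy can be made concrete with a few lines of code. This is only an illustration of the point he is making, that a statistical law replaces molecule-by-molecule bookkeeping; the numerical inputs (one mole, room temperature, 24.5 liters) are chosen for the example:

```python
# The ideal gas law PV = nRT gives the bulk statistics of a gas directly,
# with no need to simulate individual molecules bouncing off each other.

R = 8.314  # universal gas constant, J/(mol*K)

def pressure(n_moles, volume_m3, temp_k):
    """Pressure (Pa) of an ideal gas from the equation of state PV = nRT."""
    return n_moles * R * temp_k / volume_m3

# One mole of gas at 298 K in 24.5 liters:
p = pressure(1.0, 0.0245, 298.0)
print(round(p))  # about 1.0e5 Pa, roughly atmospheric pressure
```

A supercomputer tracking ~10^23 molecular trajectories would arrive at essentially the same number; the statistical law is the "much more useful description" Marston describes.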

Conceptually, the technique focuses attention on fundamental forces driving climate, instead of “following every little swirl,” Marston said. A practical advantage would be the ability to model climate conditions from millions of years ago without having to reconstruct the world’s entire weather history in the process.

The theoretical basis for direct statistical simulation has been around for nearly 50 years. The problem, however, is that the mathematical and computational tools to apply the idea to climate systems aren’t fully developed. That is what Marston and his collaborators have been working on for the last few years, and the results in this new paper show their techniques have good potential.

The paper, which Marston wrote with University of Leeds mathematician Steve Tobias, investigates whether direct statistical simulation is useful in describing the formation and characteristics of fluid jets, narrow bands of fast-moving fluid that move in one direction. Jets form naturally in all kinds of moving fluids, including atmospheres and oceans. On Earth, atmospheric jet streams are major drivers of storm tracks.

For their study, Marston and Tobias simulated the jets that form as a fluid moves on a hypothetical spinning sphere. They modeled the fluid using both the traditional numerical technique and their statistical technique, and then compared the output of the two models. They found that the models generally arrived at similar values for the number of jets that would form and the strength of the airflow, demonstrating that statistical simulation can indeed be used to model jets.

There were limits, however, to what the statistical model could do. The study found that as the pace of adding and removing energy in the fluid system increased, the statistical model started to break down. Marston and Tobias are currently working on an extension of their technique to deal with that problem.

Despite the limitation, Marston is upbeat about the potential for the technique. “We’re very pleased that it works as well as it did here,” he said.

Since completing the study, Marston has integrated the method into a computer program called “GCM” that he has made easily available via Apple’s Mac App Store for other researchers to download. The program allows users to build their own simulations, comparing numerical and statistical models. Marston expects that researchers who are interested in this field will download it and play with the technique on their own, providing new insights along the way. “I’m hoping that citizen-scientists will also explore climate modeling with it as well, and perhaps make a discovery or two,” he said.

There’s much more work to be done on this, Marston stresses, both in solving the energy problem and in scaling the technique to model more realistic climate systems. At this point, the simulations have only been applied to hypothetical atmospheres with one or two layers. Earth’s atmosphere is a bit more complex than that.

“The research is at a very early stage,” Marston said, “but it’s picking up steam.”

Related app: http://www.brown.edu/Research/Environmental_Physics/Environmental_Physics/Code.html

Journal Reference:

  1. S. M. Tobias, J. B. Marston. Direct Statistical Simulation of Out-of-Equilibrium Jets. Physical Review Letters, 2013. DOI: 10.1103/PhysRevLett.110.104502

Challenges of environmental research (Fapesp)

In the inaugural lecture of Unicamp’s Doctoral Program in Environment and Society, Brito Cruz urges researchers to raise the level of originality and sophistication of their research in order to increase Brazilian participation in international climate debates (photo: Luiz Paulo Juttel)

Specials
19/03/2013

By Luiz Paulo Juttel

Agência FAPESP – High-quality science, published in international scientific journals, underpins social debate and public policy on environmental issues around the globe. To take part in that debate, scientists in the State of São Paulo and in Brazil need to raise the originality, sophistication, and impact of their research so that their articles are accepted in large numbers by such journals. That was the challenge posed by FAPESP’s scientific director, Carlos Henrique de Brito Cruz, to the audience at the inaugural lecture of the Doctoral Program in Environment and Society at the Universidade Estadual de Campinas (Unicamp).

The presentation “FAPESP and the Challenges of Environmental Research in São Paulo,” given by Brito Cruz on Wednesday (March 13), included data showing a significant increase in the number of articles on environment and ecology published by researchers working in Brazil. In the 1980s, international journals published about 10 Brazilian articles per year. Today, that number has jumped to 600, reflecting a substantial increase in the number of researchers dedicated to this kind of work.

However, the impact of these studies on the global scientific community, and the number of citations they receive, remains below the world average. “We are still not creating science that matters more than the best science done in the world,” said Brito Cruz.

To change this picture, FAPESP’s scientific director said it is necessary to publish more in English, expand national and international partnerships, and encourage interdisciplinary research. Brito Cruz cited the example of the Interinstitutional Graduate Program in Bioenergy, run by the three São Paulo state universities, Unicamp, the Universidade de São Paulo (USP), and the Universidade Estadual Paulista (Unesp), in which students may freely take courses at all three universities, increasing the chance that their work becomes truly interdisciplinary.

During the inaugural lecture, Brito Cruz showed how FAPESP has contributed to funding research and training qualified human resources in the environmental field through three programs: bioenergy (BIOEN), biodiversity (BIOTA), and climate change (PFPMCG). Together, the three involve more than 500 researchers and 1,700 participating graduate students.

Among the programs’ results, 19 decrees, resolutions, and laws created on the basis of BIOTA research findings stand out. “Having research cited in a law is spectacular, because science has had an impact on improving public policy in a state the size of Argentina or Spain,” said Brito Cruz.

Finally, the scientific director emphasized FAPESP’s role in stimulating research across the most diverse fields of knowledge. According to Brito Cruz, the main research-funding institutions around the globe have been under pressure to approve projects that make industry more competitive, cure the sick, or reduce poverty. FAPESP is interested in projects with these characteristics. On the other hand, it also “values the kind of research that makes humanity wiser,” Brito Cruz declared.

Doctoral Program in Environment and Society

Unicamp’s Doctoral Program in Environment and Society is organized by the Instituto de Filosofia e Ciências Humanas (IFCH) and the Núcleo de Estudos e Pesquisas Ambientais (Nepam). The program’s coordinator, Leila da Costa Ferreira, explained that what sets apart the course, which the university has offered for five years, is the interdisciplinary character of its courses and research projects.

The program received a grade of five in the graduate program evaluation by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (Capes). In 2012, about 50 research projects entered the selection process, which admitted eight incoming students out of the 15 places offered.

Besides Ferreira, the opening panel of the inaugural lecture of the Doctoral Program in Environment and Society included the coordinator of the BIOTA-FAPESP Program, Carlos Alfredo Joly; Unicamp’s dean of graduate studies, Euclides de Mesquita Neto; and the IFCH graduate coordinator, Fátima Regina Évora.

Ten Times More Hurricane Surges in Future, New Research Predicts (Science Daily)

Mar. 18, 2013 — By examining the frequency of extreme storm surges in the past, previous research has shown an increasing tendency toward hurricane storm surges when the climate was warmer. But how much worse will it get as temperatures rise in the future? How many extreme storm surges like the one from Hurricane Katrina, which hit the U.S. coast in 2005, will there be as a result of global warming? New research from the Niels Bohr Institute shows that there will be a tenfold increase in frequency if the climate becomes two degrees Celsius warmer.

The results are published in the scientific journal Proceedings of the National Academy of Sciences (PNAS).

The extreme storm surge from Superstorm Sandy in the autumn of 2012 flooded large sections of New York and other coastal cities in the region. New research shows that such hurricane surges will become more frequent in a warmer climate. (Credit: © Leonard Zhukovsky / Fotolia)

Tropical cyclones arise over warm ocean surfaces with strong evaporation and warming of the air. They typically form in the Atlantic Ocean and move toward the U.S. East Coast and the Gulf of Mexico. To estimate the frequency of tropical cyclones in a future with a warmer global climate, researchers have developed various models. One is based on regional sea temperatures, while another is based on the differences between regional sea temperatures and the average temperatures in the tropical oceans. There is considerable disagreement among researchers about which is best.

New model for predicting cyclones

“Instead of choosing between the two methods, I have chosen to use temperatures from all around the world and combine them into a single model,” explains climate scientist Aslak Grinsted, Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.

He takes into account the individual statistical models and weights them according to how good they are at explaining past storm surges. In this way, he sees that the model reflects the known physical relationships, for example, how the El Niño phenomenon affects the formation of cyclones. The research was performed in collaboration with colleagues from China and England.

The statistical models are used to predict the number of hurricane surges 100 years into the future. How much worse will it be per degree of global warming? How many ‘Katrinas’ will there be per decade?

Since 1923, there has been a ‘Katrina’ magnitude storm surge every 20 years.

10 times as many ‘Katrinas’

“We find that 0.4 degrees Celsius warming of the climate corresponds to a doubling of the frequency of extreme storm surges like the one following Hurricane Katrina. With the global warming we have had during the 20th century, we have already crossed the threshold where more than half of all ‘Katrinas’ are due to global warming,” explains Aslak Grinsted.

“If the temperature rises an additional degree, the frequency will increase by 3-4 times, and if the global climate becomes two degrees warmer, there will be about 10 times as many extreme storm surges. This means that there will be a ‘Katrina’ magnitude storm surge every other year,” says Aslak Grinsted, who points out that in addition to there being more extreme storm surges, the sea level will also rise due to global warming. As a result, the storm surges will become worse and potentially more destructive.
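The multipliers quoted above translate directly into return periods. The sketch below just does that arithmetic against the baseline stated in the article (one Katrina-magnitude surge every 20 years since 1923); the 3.5 midpoint for "3-4 times" is an assumption for illustration:

```python
# Back-of-the-envelope return periods implied by the article's figures.
# Baseline: one Katrina-magnitude surge every 20 years (since 1923).

BASELINE_RETURN_YEARS = 20.0

# Frequency multipliers per warming level, as quoted by Grinsted
# (3.5 is an assumed midpoint of his "3-4 times" range).
multipliers = {
    "+0.4 C": 2.0,   # doubling of frequency
    "+1.0 C": 3.5,   # "3-4 times"
    "+2.0 C": 10.0,  # tenfold increase
}

for warming, factor in multipliers.items():
    years = BASELINE_RETURN_YEARS / factor
    print(f"{warming}: one surge every {years:.1f} years")
```

At +2 degrees the return period drops from 20 years to 2 years, which matches the quote that such a surge would occur "every other year."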

Journal Reference:

  1. Aslak Grinsted, John C. Moore, and Svetlana Jevrejeva. Projected Atlantic hurricane surge threat from rising temperatures. PNAS, March 18, 2013. DOI: 10.1073/pnas.1209980110

Five days to a global wake-up call for environmental awareness (WWF/Envolverde)

March 18, 2013 – 10:50 a.m.

by the WWF Brasil newsroom

This coming Saturday, March 23, at 8:30 p.m. (local time), thousands, perhaps billions, of people will switch off the lights in their homes, shops, public offices, monuments and other landmark sites in a symbolic act of warning against climate change.

This is the ninth edition of Earth Hour, a movement that began modestly in Australia and today involves thousands of cities in more than 152 countries. In Brazil, Earth Hour is promoted by WWF-Brasil, and the goal is to beat last year’s numbers by bringing every state capital and the Federal District into Earth Hour and surpassing the 131 cities that took part in 2012.

Your city can follow the example of the many that have already done so: sign up by sending an e-mail to cidades@wwf.org.br and signing the Terms of Adhesion. Schools and institutions shouldn’t be left out either.

Your personal participation is essential to the success of Earth Hour. We can no longer consume the equivalent of a planet and a half in resources to sustain ourselves on Earth. Adopt sustainable practices now, and keep them after March 23. Recycle. Reduce. Reuse. Simple changes in your lifestyle can have a great global impact. Avoid wasting water and energy. Turn to alternative sources, such as solar and wind, where possible. Use the car less and prefer public transportation, cycling and walking. Eat less red meat. Buy local products, organic whenever possible. Learn more about the issue. Our common future is in danger, and the climate change under way threatens all life on Earth. Your awareness is the greatest weapon against it.

Follow Earth Hour in Brazil on our Facebook, Twitter and YouTube channels. Spread our message. And check out the challenges of “Eu vou se você for” (“I’ll go if you go”): people proposing alternatives so that everyone can adopt a more ecologically sound (and, as a bonus, healthier!) lifestyle.

Join us. Earth Hour has already begun, and it cannot end when the lights come back on on Saturday.

* Originally published on the WWF Brasil website.

No shortage of warnings: watch out for the climate (Envolverde)

Environment

March 18, 2013 – 11:05 a.m.

by Washington Novaes*


It must be said again and again: Brazil’s big cities (though not only them) urgently need climate policies that equip them to deal effectively with “natural disasters,” which are ever more frequent and intense and claim an ever greater number of dead and other victims; they need to pull out of the back of the drawer the projects that would prevent flooding in urban areas; draw up master plans that incorporate the new information in this field; revise building standards, now obsolete, conceived in other eras for much milder climate conditions and increasingly vulnerable to collapse; and bring the universities into this search for scientific and technological solutions.

According to this newspaper (Feb. 21), of 12 spots that flooded in a single week last month in the city of São Paulo, 11 had already been flooding for 20 years, among them some of the points with the heaviest traffic of vehicles and people, such as the Vale do Anhangabaú, Avenida 23 de Maio and Rua Turiaçu. And São Paulo’s city government promises to dust off 79 flood-control projects, some of them shelved for 15 years. Unbelievable. The state government says it will work on 14 stormwater reservoirs (another 30 will fall to public-private partnerships), in addition to spending a further R$ 317 million on dredging the Tietê River, where R$ 1.7 billion has already been spent (it will have to spend much more for as long as it declines to act on the dozens of the river’s tributaries under the asphalt, which carry sediment, garbage, sewage and so on). The people of São Paulo will be very grateful, along with the 1 million people who enter and leave the city every day (Estado, Feb. 27).

As long as there is no vigorous action on climate and on revising building standards everywhere, we will go on as in recent weeks: an irregular construction project causes a building collapse in Liberdade and kills a pedestrian (March 1); a 20-story building collapses in Rio and brings down two more, with 22 dead (Jan. 25); the collapse of slabs at a 13-story construction site in São Bernardo do Campo kills two people (Feb. 6); a flood at a factory kills four in Sorocaba; flooding in Rio kills five people (March 8); a man saves three people and dies along with a student, swept away by a flash flood during a five-hour storm in Ipiranga, when a third of the month’s expected rain fell and the Tamanduateí overflowed (March 11); a landslide on the modern Rodovia dos Imigrantes kills one person and halts traffic (Feb. 22), in a rainfall of 183.4 millimeters, several times the region’s average for an entire month. Even the National Archives in Rio de Janeiro lost more than 130 boxes of historical documents in a downpour in the city center (March 10).

There can be no illusions. Brazil already ranks fifth among the countries that have suffered most from climate disasters. Last October the semiarid Northeast had its driest month in 83 years, according to the National Electric System Operator (Estado, Oct. 31); 10 million people were affected in more than 1,300 municipalities. The Intergovernmental Panel on Climate Change, a body of the Climate Convention, will release only part of its new report this year, but its chairman, Rajendra Pachauri, is already warning that it is necessary to “spread the concern,” since, with a temperature rise of 2 to 2.4 degrees Celsius by 2050, sea level will rise between 0.4 and 1.4 meters, and possibly more as the Arctic melt advances (Guardian, Feb. 28).

It is no accident, then, that the public school system in the United States has this year incorporated climate issues into its curriculum. Or that the Council of the European Union has earmarked 20% of its 960 billion euro budget for policies and actions in this area. Because the information is deeply worrying. Such as the finding of the United Nations Food and Agriculture Organization (March 9) that the area of land affected by drought worldwide has doubled since 1970; or that emissions of carbon dioxide (CO2) from deforestation, agriculture and other sources grew sharply between 1990 and 2010, with Brazil accounting for 25.8 billion tons of CO2 equivalent, followed by Indonesia (13.1 billion tons) and Nigeria (3.8 billion).

Climate problems, says the University of Reading (March 1), mean that agricultural productivity will have to rise by 12% from 2016 onward to offset losses and environmental change. Vegetation in North America’s higher latitudes is changing and beginning to resemble that of areas farther south, according to NASA (UPI, March 12), which analyzed the period 1982-2011 and notes that farming will have to adapt. There are also very strong changes in other regions, such as the Tigris and Euphrates Rivers, which in seven years (2003-2010) lost 144 cubic kilometers of water, equivalent to the volume of the Dead Sea (O Globo, Feb. 14).

Everywhere the news is unsettling. Universities in Florida, for example (Huffpost Miami, March 12), warn that three large sewage treatment plants in the south of the state will have to be relocated to keep them from ending up “confined on islands” within 50 years because of sea level rise. Admiral Samuel J. Locklear III, commander of U.S. forces in the Pacific, says that this rise in ocean levels “is the biggest security threat,” and that China and India must prepare to rescue and evacuate hundreds of thousands, or millions, of people.

To return to the beginning of this article: Brazilian cities cannot postpone confronting climate change, above all floods and landslides (Brazil has more than 5 million people living in risk areas). According to New Scientist magazine (Oct. 20, 2012), 32,000 people died worldwide between 2004 and 2010 in events of this kind (80,000 in earthquakes). There is no shortage of warnings.

Washington Novaes is a journalist.

** Originally published in the newspaper O Estado de S. Paulo.

Obama Will Use Nixon-Era Law to Fight Climate Change (Bloomberg)

By Mark Drajem – Mar 15, 2013 12:50 PM GMT-0300


President Barack Obama is preparing to tell all federal agencies for the first time that they should consider the impact on global warming before approving major projects, from pipelines to highways.

The result could be significant delays for natural gas-export facilities, ports for coal sales to Asia, and even new forest roads, industry lobbyists warn.

“It’s got us very freaked out,” said Ross Eisenberg, vice president of the National Association of Manufacturers, a Washington-based group that represents 11,000 companies such as Exxon Mobil Corp. (XOM) and Southern Co. (SO). The standards, which constitute guidance for agencies and not new regulations, are set to be issued in the coming weeks, according to lawyers briefed by administration officials.

In taking the step, Obama would be fulfilling a vow to act alone in the face of a Republican-run House of Representatives unwilling to pass measures limiting greenhouse gases. He’d expand the scope of a Nixon-era law that was first intended to force agencies to assess the effect of projects on air, water and soil pollution.

“If Congress won’t act soon to protect future generations, I will,” Obama said last month during his State of the Union address. He pledged executive actions “to reduce pollution, prepare our communities for the consequences of climate change, and speed the transition to more sustainable sources of energy.”

Illinois Speech

The president is scheduled to deliver a speech on energy today at the Argonne National Laboratory in Lemont, Illinois. He is pressing Congress to create a $2 billion clean-energy research fund with fees paid by oil and gas producers.

While some U.S. agencies already take climate change into account when assessing projects, the new guidelines would apply across-the-board to all federal reviews. Industry lobbyists say they worry that projects could be tied up in lawsuits or administrative delays.

For example, Ambre Energy Ltd. is seeking a permit from the Army Corps of Engineers to build a coal-export facility at the Port of Morrow in Oregon. Under existing rules, officials weighing approval would consider whether ships in the port would foul the water or generate air pollution locally. The Environmental Protection Agency and activist groups say that review should be broadened to account for the greenhouse gases emitted when exported coal is burned in power plants in Asia.

Keystone Pipeline

Similar analyses could be made for the oil sands that would be transported in TransCanada Corp. (TRP)’s Keystone XL pipeline, and leases to drill for oil, gas and coal on federal lands, such as those for Arch Coal Inc. (ACI) and Peabody Energy Corp. (BTU)

If the new White House guidance is structured correctly, it will require just those kinds of lifecycle reviews, said Bill Snape, senior counsel at the Center for Biological Diversity in Washington. The environmental group has sued to press for this approach, and Snape says lawsuits along this line are certain if the administration approves the Keystone pipeline, which would transport oil from Canada’s tar sands to the U.S. Gulf Coast.

“The real danger is the delays,” said Eisenberg of the manufacturers’ group. “I don’t think the answer is ever going to be ‘no,’ but it can confound things.”

Lawyers and lobbyists are now waiting for the White House’s Council on Environmental Quality to issue the long bottled-up standards for how agencies should address climate change under the National Environmental Policy Act, signed into law by President Richard Nixon in 1970.

Environmental Impact

NEPA requires federal agencies to consider and publish the environmental impact of their actions before making decisions. Those reviews don’t mandate a specific course of action. They do provide a chance for citizens and environmentalists to weigh in before regulators decide on an action — and to challenge those reviews in court once a project is cleared.

“Each agency currently differs in how their NEPA reviews consider the climate change impacts of projects, as well as how climate change impacts such as extreme weather will affect projects,” Taryn Tuss, a Council on Environmental Quality spokeswoman, said in an e-mail. “CEQ is working to incorporate the public input we received on the draft guidance, and will release updated guidance when it is completed.”

‘Major Shakeup’

The new standards will be “a major shakeup in how agencies conduct NEPA” reviews, said Brendan Cummings, senior counsel for the Center for Biological Diversity in San Francisco.

The White House is looking at requiring consideration of both the increase in greenhouse gases and a project’s vulnerability to flooding, drought or other extreme weather that might result from global warming, according to an initial proposal it issued in 2010. Those full reports would be required for projects with 25,000 metric tons of carbon dioxide equivalent emissions or more per year, the equivalent of burning about 100 rail cars of coal.
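The rail-car comparison can be sanity-checked with round numbers. A back-of-envelope sketch; the car capacity (~100 tons of coal) and the emission factor (~2.5 tons of CO2 per ton of coal burned) are my own ballpark assumptions, not figures from the proposal:

```python
# Rough check of "25,000 metric tons of CO2 ~ about 100 rail cars of coal".
TONS_COAL_PER_CAR = 100        # assumed capacity of one coal hopper car
TONS_CO2_PER_TON_COAL = 2.5    # assumed combustion emission factor

cars = 100
tons_co2 = cars * TONS_COAL_PER_CAR * TONS_CO2_PER_TON_COAL
print(f"{cars} rail cars of coal -> about {tons_co2:,.0f} tons of CO2")
```

With these round numbers the comparison lands on 25,000 tons, consistent with the threshold in the 2010 proposal.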

The initial draft exempted federal land and resource decisions from the guidance, although CEQ said it was assessing how to handle those cases. Federal lands could be included in the final standards.

The White House guidance itself won’t force any projects to be stopped outright. Instead, it’s likely to prompt lawsuits against federal projects on these grounds, and increase the probability that courts will step in and order extensive reviews as part of the “adequate analysis” required in the law, said George Mannina, an attorney at Nossaman LLP in Washington.

Next Administration

“The question is: Where does this analysis take us?” he said. “Adequate analysis may be much broader than the agency and applicant might consider.”

While the Obama administration’s guidance could be easily rescinded by the next administration, the court rulings that stem from these cases will live on as precedents, Mannina said.

Lobbying groups such as the U.S. Chamber of Commerce, American Petroleum Institute and the National Mining Association weighed in with the White House against including climate in NEPA, a law initially aimed at chemical leaks or air pollution.

“Not only will this result in additional delay of the NEPA process, but will result in speculative and inaccurate modeling that will have direct impacts on approval of specific projects,” the National Mining Association in Washington wrote in comments to the White House in 2010.

Leases Challenged

The group represents Arch Coal (ACI) and Peabody, both based in St. Louis. Leases that the Department of the Interior issued for those companies to mine for coal in Wyoming are facing lawsuits from environmental groups, arguing that the agency didn’t adequately tally up the effect on global warming from burning that coal.

Given Obama’s pledge to address global warming, “this is a massive contradiction,” said Jeremy Nichols, director of climate at WildEarth Guardians in Denver, which filed lawsuits against the leases.

Arch Coal referred questions to the mining group.

Beth Sutton, a Peabody spokeswoman, said in an e-mail, “We believe the current regulatory approach to surface mine permits is appropriate and protects the environment.”

Since CEQ first announced its proposal, more than three dozen federal approvals were challenged on climate grounds, including a highway project in North Carolina, a methane-venting plan for a coal mine in Colorado, and a research facility in California, according to a chart compiled by the Center for Climate Change Law at Columbia University.

Next Target

The next target is TransCanada (TRP)’s application to build the 1,661-mile (2,673-kilometer) Keystone pipeline. The Sierra Club and 350.org drew 35,000 people to Washington last month to urge Obama to reject the pipeline. Meanwhile, the NEPA review by the State Department included an initial analysis of carbon released when the tar sands are refined into gasoline and used in vehicles.

It stopped short, however, of saying the project would result in an increase in greenhouse gas emissions. With or without the pipeline, the oil sands will be mined and used as fuel, the report said. That finding is likely to be disputed in court if the Obama administration clears the project.

“Keystone is ground zero,” said Snape, of the Center for Biological Diversity. “Clearly this will come into play, and it will be litigated.”

Any actions by the administration now on global warming would pick up on a mixed record over the past four years.

Cap-and-Trade

While Obama failed to get Congress to pass cap-and-trade legislation, the EPA reversed course from the previous administration and ruled that carbon-dioxide emissions endanger public health, opening the way for the agency to regulate it.

Using that finding, the agency raised mileage standards for automobiles and proposed rules for new power plants that would essentially outlaw the construction of new coal-fired power plants that don’t have expensive carbon-capture technology.

Environmentalists such as the Natural Resources Defense Council say the most important action next will be the EPA’s rules for existing power plants, the single biggest source of carbon-dioxide emissions. The NEPA standards are separate from those rules, and will affect how the federal government itself is furthering global warming.

“Agencies do a pretty poor job of looking at climate change impacts,” said Rebecca Judd, a legislative counsel at the environmental legal group Earthjustice in Washington. “A thorough guidance would help alleviate that.”

Poor weather forecasting (Folha de S.Paulo)

March 15, 2013 – 3:01 a.m.

Michel Laub

In an article published in Folha in 2010 (http://goo.gl/fLVDJ), João Moreira Salles discussed the overvaluation of the humanities in Brazil, to the detriment of disciplines such as mathematics, physics and engineering. One effect of this distortion, I would add, is the lack of familiarity (among the public, the intellectuals, the press) with technical and scientific discourse. And, as a consequence, the docility with which fallacies in those areas are accepted.

Examples: government propaganda (numbers to suit every taste), fad diets (studies with every kind of methodology and sponsorship), medical treatments (often of questionable cost-benefit) and even soccer statistics spreadsheets (in which a defensive midfielder who only makes short passes will have a higher completion rate than a long-ball passer).

For my part, I decided to test a scientific discourse that is very much part of everyday life: meteorology. For 28 days this past January, I recorded the hits and misses of “Jornal do Tempo” (http://jornaldotempo.uol.com.br).

A layman’s exercise, to be sure, and one aware that the service in question is not representative of the field in Brazil or worldwide. The “Jornal do Tempo” home page presents data that are an average, a summary (as in TV forecasts) of more detailed records, including on some of its inner pages.

But averages are the public face of meteorology, the discourse, delivered in a confident and cordial tone, that guides us in choosing our clothes in the morning or deciding whether to take an umbrella. And here, just as a little logic is enough to spot holes in statistical work, you don’t need to be an expert to say that there is a great deal of imprecision in the trade.

The temperatures in my little notebook almost always fell within the intervals forecast the day before (23 of 28 occurrences). Compared with the forecast made a week in advance, the rate drops to 17 of 28. And if we set the interval forecast seven days ahead beside the one issued on the day itself, they differ in 28 cases out of 28.
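The tally described above is a simple hit-rate over interval forecasts. A toy sketch with invented numbers (none of these values are from the column):

```python
# Fraction of observed daily temperatures falling inside the (min, max)
# interval forecast the day before. All values here are made up.
forecasts = [(18, 27), (19, 28), (17, 25), (20, 30)]  # deg C intervals
observed = [24, 29, 22, 26]                           # deg C actuals

hits = sum(lo <= t <= hi for (lo, hi), t in zip(forecasts, observed))
print(f"{hits} of {len(observed)} observations within the forecast interval")
```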

As for atmospheric conditions, which are harder to verify (from my home in Pinheiros, I have no way of knowing whether I was betrayed by a drizzle while I slept, or some such thing), there were 15 errors in 28.

These are seemingly unimportant things: a degree or two more, a few hours of sun on a day forecast as “overcast and rainy, with a few thunderstorms.” But there are objective criticisms to be made, if not of the measurement methods, at least of the way the results are presented.

Thus, pinning a single temperature on a city like São Paulo, with its hills and lowlands, green havens and concrete infernos spread over some 1,500 square kilometers, is inexact by definition. Likewise a one-sentence forecast that covers both a quick, harmless cloudburst and deluge and chaos, depending on the infrastructure of the neighborhood you are in (“sun, alternating with rain in the form of isolated showers”).

The issue becomes more complex when it transcends the territory of error, which is human and acceptable. And meteorology itself, cited here only as a symptom. The authority that emanates from scientific discourse is not limited to influencing academic debates about chemistry or astronomy.

It is also a phenomenon of the human sciences, and its political, economic and moral ramifications in society at large are not negligible. It was racialist theories that justified slavery. It was a doctrine of technological competition that created nuclear weapons.

In the case of global warming, today’s great scientific banner, there is first of all an imperative of common sense: it is smarter to live in harmony with nature, with less carbon emission, deforestation and consumerist waste. I also imagine that climatological projections are more precise than, say, those of the young woman who describes conditions across the entire Southeast in ten seconds on “Jornal Nacional.” But it is a fact that Time magazine warned of a “new ice age” in 1974. And ran a famous cover, ten years later, on the now-disputed link between heart attacks and egg yolks.

Both stories reproduced scientific conjectures that were influential at the time. It is advisable to follow today’s: after all, they are the closest thing we have to certainty outside religious or ideological fanaticism. It is just good, as a healthy dose of doubt in any field of knowledge sold as infallible, to remember the poor weather forecast.

When It Rains These Days, Does It Pour? Has the Weather Become Stormier as the Climate Warms? (Science Daily)

Mar. 17, 2013 — There’s little doubt — among scientists at any rate — that the climate has warmed since people began to release massive amounts of greenhouse gases to the atmosphere during the Industrial Revolution.

But ask a scientist if the weather is getting stormier as the climate warms and you’re likely to get a careful response that won’t make for a good quote.

There’s a reason for that.

“Although many people have speculated that the weather will get stormier as the climate warms, nobody has done the quantitative analysis needed to show this is indeed happening,” says Jonathan Katz, PhD, professor of physics at Washington University in St. Louis.

In the March 17 online version of Nature Climate Change, Katz and Thomas Muschinski, a senior in physics who came to Katz looking for an undergraduate thesis project, describe the results of their analysis of more than 70 years of hourly precipitation data from 13 U.S. sites, looking for quantitative evidence of increased storminess.

They found a significant, steady increase in storminess on the Olympic Peninsula in Washington State, which famously suffers from more or less continuous drizzle, a calm climate that lets storm peaks emerge clearly.

“Other sites have always been stormy,” Katz says, “so an increase such as we saw in the Olympic Peninsula data would not have been detectable in their data.”

They may also be getting stormier, he says, but so far they’re doing it under cover.

The difference between wetter and stormier

“We didn’t want to know whether the rainfall had increased or decreased,” Katz says, “but rather whether it was concentrated in violent storm events.”

Studies that look at the largest one-day or few-day precipitation totals recorded in a year, or the number of days in which total precipitation is above a threshold, measure whether locations are getting wetter, not whether they’re getting stormier, says Katz.

To get the statistical power to pick up brief downpours rather than total precipitation, Muschinski and Katz needed to find a large, fine-grained dataset.

“So we poked around,” Katz says, “and we found what we were looking for in the National Oceanic and Atmospheric Administration databases.”

NOAA has hourly precipitation data going back to 1940 or even further for many locations in the United States. Muschinski and Katz chose 13 sites that had long runs of data and represented a broad range of climates, from desert to rain forest.

They then tested the hypothesis that storms are becoming more frequent and intense by taking different measurements of the “shape” formed by the data points for each site.

Measuring these “moments,” as they’re called, is a statistical test commonly used in science, says Katz, but one that hasn’t been applied to this problem before.
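The idea behind a moment test can be illustrated with two invented hourly records that carry similar rain totals but distribute them very differently. This is only an illustrative sketch, not the authors’ method; the fourth normalized moment (kurtosis) is one standard choice for detecting rain concentrated in rare, violent hours:

```python
import random
import statistics

random.seed(0)

# Two made-up hourly precipitation records (mm/hour) with similar totals:
# steady drizzle versus the same water delivered in rare downpours.
drizzle = [0.1 + 0.01 * random.random() for _ in range(1000)]
stormy = [10.0 if i % 100 == 0 else 0.0 for i in range(1000)]

def kurtosis(values):
    """Fourth central moment over sigma**4; large values mean the record
    is dominated by rare, extreme hours."""
    mu = statistics.fmean(values)
    sigma = statistics.pstdev(values)
    return sum((v - mu) ** 4 for v in values) / (len(values) * sigma ** 4)

print(f"drizzle kurtosis: {kurtosis(drizzle):.1f}")
print(f"stormy kurtosis:  {kurtosis(stormy):.1f}")
```

Both records deliver roughly 100 mm in total, yet the stormy record’s kurtosis is orders of magnitude higher; a trend in such moments, rather than in totals, is what separates “stormier” from merely “wetter.”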

“We found a significant steady increase in stormy activity on the Olympic Peninsula,” Katz says. “We know that is real.”

“We found no evidence for an increase in storminess at the other 12 sites,” he said, “but because their weather is intrinsically stormier, it would be more difficult to detect a trend like that at the Olympic Peninsula even if it were occurring.”

The next step, Katz says, is to look at a much larger number of sites that might be regionally averaged to reveal trends too slow to be significant for any one site.

“There are larger databases,” he says, “but they’re also harder to sift through. Any one site might have half a million hourly measurements over the period we’re looking at, and to get good results we have to devise an algorithm tuned to the database to filter out spurious or corrupted data.”

You could call that a rainy-day project.

Journal Reference:

  1. T. Muschinski, J. I. Katz. Trends in hourly rainfall statistics in the United States under a warming climate. Nature Climate Change, 2013. DOI: 10.1038/nclimate1828

Bombshell: Recent Warming Is ‘Amazing And Atypical’ And Poised To Destroy Stable Climate That Enabled Civilization (Climate Progress)

By Joe Romm on Mar 8, 2013 at 12:44 pm

New Science Study Confirms ‘Hockey Stick’: The Rate Of Warming Since 1900 Is 50 Times Greater Than The Rate Of Cooling In Previous 5000 Years

Temperature change over the past 11,300 years (in blue, via Science, 2013) plus projected warming this century on humanity’s current emissions path (in red, via recent literature).

A stable climate enabled the development of modern civilization, global agriculture, and a world that could sustain a vast population. Now, the most comprehensive “Reconstruction of Regional and Global Temperature for the Past 11,300 Years” ever done reveals just how stable the climate has been — and just how destabilizing manmade carbon pollution has been and will continue to be unless we dramatically reverse emissions trends.

Researchers at Oregon State University (OSU) and Harvard University published their findings today in the journal Science. Their funder, the National Science Foundation, explains in a news release:

With data from 73 ice and sediment core monitoring sites around the world, scientists have reconstructed Earth’s temperature history back to the end of the last Ice Age.

The analysis reveals that the planet today is warmer than it’s been during 70 to 80 percent of the last 11,300 years.

… during the last 5,000 years, the Earth on average cooled about 1.3 degrees Fahrenheit–until the last 100 years, when it warmed about 1.3 degrees F.

In short, thanks primarily to carbon pollution, the temperature is changing 50 times faster than it did during the time modern civilization and agriculture developed, a time when humans figured out where the climate conditions — and rivers and sea levels — were most suited for living and farming. We are headed for 7 to 11°F warming this century on our current emissions path — increasing the rate of change 5-fold yet again.
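The “50 times faster” figure is simple arithmetic on the NSF numbers quoted above:

```python
# 1.3 deg F of cooling spread over 5,000 years versus 1.3 deg F of warming
# packed into the last 100 years.
cooling_rate = 1.3 / 5000  # deg F per year
warming_rate = 1.3 / 100   # deg F per year
print(round(warming_rate / cooling_rate))  # ratio of the two rates
```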

By the second half of this century we will have some 9 billion people, a large fraction of whom will be living in places that simply can’t sustain them —  either because it is too hot and/or dry, the land is no longer arable, their glacially fed rivers have dried up, or the seas have risen too much.

We could keep that warming close to 4°F — and avoid the worst consequences — but only with immediate action.

This research vindicates the work of Michael Mann and others showing that recent warming is unprecedented in magnitude, speed, and cause during the past 2000 years — the so-called Hockey Stick — and in fact extends that back to at least 4000 years ago. I should say “vindicates for the umpteenth time” (see “Yet More Studies Back Hockey Stick“).

Lead author Shaun Marcott of OSU told NPR that the paleoclimate data reveal just how unprecedented our current warming is: “It’s really the rates of change here that’s amazing and atypical.” He noted to the AP, “Even in the ice age the global temperature never changed this quickly.”

And the rate of warming is what matters most, as Mann noted in an email to me:

This is an important paper. The key take home conclusion is that the rate and magnitude of recent global warmth appears unprecedented for *at least* the past 4K and the rate *at least* the past 11K. We know that there were periods in the past that were warmer than today, for example the early Cretaceous period 100 million yr ago. The real issue, from a climate change impacts point of view, is the rate of change—because that’s what challenges our adaptive capacity. And this paper suggests that the current rate has no precedent as far back as we can go w/ any confidence—11 kyr arguably, based on this study.

Katharine Hayhoe, an atmospheric scientist at Texas Tech University, told the AP:

We have, through human emissions of carbon dioxide and other heat-trapping gases, indefinitely delayed the onset of the next ice age and are now heading into an unknown future where humans control the thermostat of the planet.

Unfortunately, we have decided to change the setting on the thermostat from “Very Stable, Don’t Adjust” to “Hell and High Water.” It is the single most self-destructive act humanity has ever undertaken, but there is still time to aggressively slash emissions and aim for a setting of “Dangerous, But Probably Not Fatal.”

A Scientist’s Misguided Crusade (N.Y.Times)

OP-ED COLUMNIST

By JOE NOCERA

Published: March 4, 2013 

Last Friday, at 3:40 p.m., the State Department released its “Draft Supplemental Environmental Impact Statement” for the highly contentious Keystone XL pipeline, which Canada hopes to build to move its tar sands oil to refineries in the United States. In effect, the statement said there were no environmental impediments that would prevent President Obama from approving the pipeline.

Two hours and 20 minutes later, I received a blast e-mail containing a statement by James Hansen, the head of the Goddard Institute for Space Studies at NASA — i.e., NASA’s chief climate scientist. “Keystone XL, if the public were to allow our well-oiled government to shepherd it into existence, would be the first step down the wrong road, perpetuating our addiction to dirty fossil fuels, moving to ever dirtier ones,” it began. After claiming that the carbon in the tar sands “exceeds that in all oil burned in human history,” Hansen’s statement concluded: “The public must demand that the government begin serving the public’s interest, not the fossil fuel industry’s interest.”

As a private citizen, Hansen, 71, has the same First Amendment rights as everyone else. He can publicly oppose the Keystone XL pipeline if he so chooses, just as he can be as politically active as he wants to be in the anti-Keystone movement, and even be arrested during protests, something he managed to do recently in front of the White House.

But the blast e-mail didn’t come from James Hansen, private citizen. It specifically identified Hansen as the head of the Goddard Institute, and went on to describe him as someone who “has drawn attention to the danger of passing climate tipping points, producing irreversible climate impacts that would yield a different planet from the one on which civilization developed.” All of which made me wonder whether such apocalyptic pronouncements were the sort of statements a government scientist should be making — and whether they were really helping the cause of reversing climate change.

Let’s acknowledge right here that the morphing of scientists into activists is nothing new. Linus Pauling, the great chemist, was a peace activist who pushed hard for a nuclear test ban treaty. Albert Einstein also became a public opponent of nuclear weapons.

It is also important to acknowledge that Hansen has been a crucial figure in developing modern climate science. In 2009, Eileen Claussen, now the president of the Center for Climate and Energy Solutions, told The New Yorker that Hansen was a “heroic” scientist who “faced all kinds of pressures politically.” Today, his body of work is one of the foundations upon which much climate science is built.

Yet what people hear from Hansen today is not so much his science but his broad, unscientific views on, say, the evils of oil companies. In 2008, he wrote a paper, the thesis of which was that runaway climate change would occur when carbon in the atmosphere reached 350 parts per million — a point it had already exceeded — unless it were quickly reduced. There are many climate change experts who disagree with this judgment — who believe that the 350 number is arbitrary and even meaningless. Yet an entire movement, 350.org, has been built around Hansen’s line in the sand.

Meanwhile, he has a department to run. For a midlevel scientist at the Goddard Institute, what signal is Hansen sending when he takes the day off to get arrested at the White House? Do his colleagues feel unfettered in their own work? There is, in fact, enormous resentment toward Hansen inside NASA, where many officials feel that their solid, analytical work on climate science is being lost in what many of them describe as “the Hansen sideshow.” His activism is not really doing any favors for the science his own subordinates are producing.

Finally, and most important, Hansen has placed all his credibility on one battle: the fight to persuade President Obama to block the Keystone XL pipeline. It is the wrong place for him to make a stand. Even in the unlikely event the pipeline is stopped, the tar sands oil will still be extracted and shipped. It might be harder to do without a pipeline, but it is already happening. And in the grand scheme, as I’ve written before, the tar sands oil is not a game changer. The oil we import from Venezuela today is dirtier than that from the tar sands. Not that the anti-pipeline activists seem to care.

What is particularly depressing is that Hansen has some genuinely important ideas, starting with placing a graduated carbon tax on fossil fuels. Such a tax would undoubtedly do far more to reduce carbon emissions and save the planet than stopping the Keystone XL pipeline.

A carbon tax might be worth getting arrested over. But by allowing himself to be distracted by Keystone, Hansen is hurting the very cause he claims to care so much about.

Big military guy more scared of climate change than enemy guns (Grist)

By Susie Cagle

11 Mar 2013 6:13 PM

Navy Admiral Samuel J. Locklear III, chief of U.S. Pacific Command, doesn’t look like your usual proponent of climate action. Spencer Ackerman writes at Wired that Locklear “is no smelly hippie,” but the guy does believe there will be terrible security threats on a warming planet, which might make him a smelly hippie in the eyes of many American military boosters.


Everyone wants him to be worried about North Korean nukes and Chinese missiles, but in an interview with The Boston Globe, Locklear said that societal upheaval due to climate change “is probably the most likely thing that is going to happen … that will cripple the security environment, probably more likely than the other scenarios we all often talk about.”

“People are surprised sometimes,” he added, describing the reaction to his assessment. “You have the real potential here in the not-too-distant future of nations displaced by rising sea level. Certainly weather patterns are more severe than they have been in the past. We are on super typhoon 27 or 28 this year in the Western Pacific. The average is about 17.”

Locklear said his Hawaii-based headquarters — which is … responsible for operations from California to India — is working with Asian nations to stockpile supplies in strategic locations and planning a major exercise for May with nearly two dozen countries to practice the “what-ifs.”

Locklear isn’t alone in his climate fears. A recent article by Julia Whitty takes an in-depth look at what the military is doing to deal with climate change. A 2008 report by U.S. intelligence agencies warned about national security challenges posed by global warming, as have later reports from the Department of Defense and the Joint Chiefs of Staff. New Defense Secretary Chuck Hagel understands the threat, too. People may be surprised sometimes, Adm. Locklear, but they really shouldn’t be!

Will not-a-dirty-hippie Locklear’s words help to further mainstream the idea that climate change is a serious security problem? And what all has the good admiral got planned for this emergency sea-rising drill in May?

Susie Cagle writes and draws news for Grist. She also writes and draws tweets for Twitter.

Earth Nears Its Highest Temperatures in 11,000 Years; Melting in Canada May Be Irreversible (Folha de São Paulo)

JC e-mail 4680, March 8, 2013.

Salvador Nogueira

The study gathered data from 73 sites around the world to estimate global (and local) temperatures during the geological period known as the Holocene

A new study by researchers at Oregon State University and Harvard University, both in the U.S., has reconstructed Earth’s average temperature over the past 11,300 years in order to compare it with current levels.

The good news: Earth today is cooler than it was at the warmest point of that period. The bad news: if the climatologists’ models are right, we will set a new heat record by the end of the century.

The study, published in the journal Science, gathered data from 73 sites around the world to estimate global (and local) temperatures during the geological period known as the Holocene, which began at the end of the last ice age, 11,000 years ago.

After consolidating all the information, mostly drawn from fossil samples in ocean sediments, into a single picture, and using mathematical techniques to fill the “gaps” in the various sources used to estimate past temperatures, the scientists were able to recreate a “short history of Earth’s climate variation.”

It is called short because the results cannot resolve variations occurring over just a few years. It is as if each data point represented the temperature across a 120-year period.

THE HISTORY

The data confirm a long-standing suspicion among scientists: that Earth went through a warming period beginning about 11,000 years ago. Over 1,500 years, the planet warmed by about 0.6°C and then stabilized, holding steady for about 5,000 years.

Then, 5,500 years ago, a new cooling process began, one that ended 200 years ago with what became known as the “Little Ice Age.” The planet cooled by 0.7°C.

Enter accelerated industrialization and the 20th century. The planet began warming again. It has not yet broken the temperature record set early in the Holocene, but it is already warmer than in 75% of the past 11,000 years.

The study thus confirms that Earth’s temperature has been rising in recent times, and shows that the rise is far faster than previously thought.

“This research shows that we have already experienced almost the same range of temperature change since the start of the Industrial Revolution as was seen over the previous 11,000 years of Earth’s history, but this change happened much faster,” says Candace Major, program director in the Ocean Sciences Division of the U.S. National Science Foundation, which funded the study.

On the other hand, the study’s low temporal resolution (effects spanning only a few years cannot be distinguished) makes comparison with the current warming difficult.

For the current climate change to register on the timescale covered by the 11,000-year reconstruction, it would have to continue through the next century. According to the models of the UN’s IPCC (Intergovernmental Panel on Climate Change), that is exactly what will happen.

Uncertainties remain, however, about the magnitude of the phenomenon. Even so, by even the most optimistic estimates, if nothing is done, by the time we reach 2100 we will probably be living through the warmest period of the past 11,000 years.

* * *

JC e-mail 4680, March 8, 2013.

via Reuters

Canada’s glaciers, the third-largest store of ice after Antarctica and Greenland, may be undergoing an irreversible melt that is expected to raise sea levels, scientists said

About 20% of the glaciers in northern Canada could disappear by the end of the 21st century, a melt that could add 3.5 cm to sea level.

According to an article in the journal Geophysical Research Letters, the melting of white glaciers would expose the dark tundra beneath, which tends to absorb more heat and accelerate the melting.

The UN estimates a sea-level rise of between 18 cm and 59 cm this century, or more if the ice sheets of Antarctica and Greenland begin melting faster.

The projected 20% loss of Canada’s ice volume is based on a scenario in which average temperatures rise 3°C this century, and 8°C in the Canadian Arctic, within the range of the UN projections.

The Crisis in Climate-Change Coverage (Truth Out)

Sunday, 03 March 2013 07:23

By Josh Stearns, Free Press

Climate activist Bill McKibben speaking at the San Francisco Bay Area’s Moving Planet rally. (Photo: 350.org / flickr)

Fifty-thousand people recently marched in Washington, D.C., calling on President Obama to fulfill his recent promises to take immediate and meaningful action to address the looming climate crisis.

And just days before, a group of environmental journalists, scientists and activists came together in a Web chat to discuss the state of climate-change coverage in America.

The event, organized by Free Press and Orion Magazine, featured Kate Sheppard of Mother Jones; Bill McKibben, author and 350.org founder; Wen Stephenson, writer and climate activist; M. Sanjayan, CBS News contributor and Nature Conservancy scientist; Thomas Lovejoy, chief biologist at the Heinz Center and creator of the PBS show Nature; and reporter Susie Cagle of Grist.org.

Here’s what they had to say. (You can listen to the entire discussion here.)

Structure Versus Culture

A complex mix of structural and cultural factors has affected climate-change coverage in the U.S. The forces that shape U.S. media have not been kind to environmental reporting. Years of media consolidation have led to dramatic layoffs in commercial newsrooms, and environment and science desks are often the first to go. In addition, M. Sanjayan noted that media consolidation has had an echo-chamber effect: All climate stories sound the same and they lack depth, specificity and connection to place.

The U.S. also under-funds noncommercial alternatives, like public media, where climate-change reporting should thrive. The best environmental writing is happening at the margins of our media at longtime nonprofit magazines and new online startups. In contrast, mainstream outlets have tended to legitimize climate-change deniers in the face of widespread scientific consensus about the effects of global warming.

Wen Stephenson argued that journalists have been reticent to raise the alarm about climate change. “The mainstream media has failed to cover the climate crisis as a crisis,” he said.

Empathy Versus Objectivity

A repeated theme of the conversation was the line between advocacy and journalism. There was disagreement about where the line should fall. Kate Sheppard said she was disappointed that coverage of the BP oil spill didn’t inspire more sustained activism on climate change, but noted that it wasn’t her job to organize, only to inform.

Stephenson, on the other hand, argued that when it comes to climate change, journalists need to find their moral bearings. Acknowledging the limits of objectivity, Stephenson discussed the value of empathy and the need to understand the true human and natural stakes of this debate.

Telling a More Human Story

The panelists agreed that climate-change reporting needs to get personal. Journalists need to better connect climate change to people’s lives, their homes, their families and their everyday concerns. Susie Cagle said that when she reports on climate change she does so through the lens of cities, rivers and food.

Bill McKibben pointed to the way 350.org activists have shifted the narrative — literally putting their bodies on the line by holding protests and other events around the globe. McKibben also noted the importance of people making their own media — with photos, videos and blogs —especially when there are fewer and fewer local media outlets willing to take on the work.

Sanjayan said we need a better way to frame climate-change reporting. The Keystone XL Pipeline story has gained so much traction in part because there is a clear bad guy, a clear target and clear actions people can take. Those elements aren’t always present, so journalists need to find different ways to reach their audiences. We need to be aware of who is telling the story. Sanjayan noted that all too often, climate-change reporting is too U.S.-centric and doesn’t tell the full global story.

Quality and Quantity Versus Reach and Impact

Thirty-two years ago television offered nothing of substance about the natural world or the threats it faced. This was the inspiration for Thomas Lovejoy, the scientist who coined the term “biodiversity,” to pitch a new kind of show to New York public TV station WNET.

Since PBS’ Nature first aired, a lot has changed. Now, Sheppard said, there is a ton of great environmental reporting, but it’s not always easy to find and it’s not always seen by the people who need to see it. One way to foster better coverage, Sheppard said, is to support what’s already out there by sharing it, funding it and subscribing to those doing it.

Panelists acknowledged that many publications — like this Web chat itself — end up speaking to the choir when we desperately need to get beyond it. For Sheppard, one way of doing that is through journalism collaborations that help get content out to new audiences and on different platforms.

For Cagle, the platform piece is key. She talked about the need to get beyond the “wall of text” and tell more immersive stories about climate. For her, the use of audio and illustrations helps bring readers into the story. “Art can make stories more accessible and personal,” said Cagle.

Sanjayan discussed the potential for cable TV to be a powerful messenger. For example, he is working on an in-depth series for Showtime on climate change.

Next Steps

The discussion offered few cut-and-dried prescriptions for concrete changes that need to happen to embolden and expand climate coverage. Panelists agreed that we need a journalism of solutions, not just a journalism of problems. For newsrooms and journalists, the first step is to begin to understand the scope and scale of this crisis, and write as if your life depended on it.

Chemicals, Risk And The Public (Chicago Tribune)

April 29, 1989|By Earon S. Davis

The public is increasingly uncomfortable with both the processes and the results of government and industry decision-making about chemical hazards.

Decisions that expose people to uncertain and potentially catastrophic risks from chemicals seem to be made without adequate scientific information and without an appreciation of what makes a risk acceptable to the public.

The history of environmental and occupational health provides myriad examples in which entire industries have acted in complete disregard of public health risks and in which government failed to act until well after disasters were apparent.

It is not necessary to name each chemical, each debacle, in which the public was once told the risks were insignificant, but these include DDT, asbestos, Kepone, tobacco smoke, dioxin, PCBs, vinyl chloride, flame retardants in children’s sleepwear, Chlordane, Alar and urea formaldehyde foam. These chemicals were banned or severely restricted, and virtually no chemical has been found to be safer than originally claimed by industry and government.

It is no wonder that government and industry efforts to characterize so many uncertain risks as “insignificant” are met with great skepticism. In a pluralistic, democratic society, acceptance of uncertainty is a complex matter that requires far more than statistical models. Depending upon cultural and ethical factors, some risks are simply more acceptable than others.

When it comes to chemical risks to human health, many factors combine to place a relatively higher burden on government and industry to show social benefits. Not the least of these is the unsatisfactory track record of industry and its regulatory agencies.

Equally important are the tremendous gaps in scientific knowledge about chemically induced health effects, as well as the specific characteristics of these risks.

Chemical risks differ from many other kinds because, not only are the victims struck largely at random, but there is usually no way to know which illnesses are eventually caused by a chemical. There are so many poorly understood illnesses and so many chemical exposures which take many years to develop that most chemical victims will not even be identified, let alone properly compensated.

To the public, this difference is significant, but to industry it poses few problems. Rather, it presents the opportunity to create risks and yet remain free of liability for the bulk of the costs imposed on society, except in the rare instance where a chemical produces a disease which does not otherwise appear in humans.

Statutes of limitations, corporate litigiousness, inability or unwillingness of physicians to testify on causation and the sheer passage of time pose major obstacles to chemical victims attempting to receive compensation.

The delayed effects of chemical exposures also make it impossible to fully document the risks until decades after the Pandora’s box has been opened. The public is increasingly afraid that regulators are using the lack of immediately identified victims as evidence of chemical safety, which it simply is not.

Chemical risks are different because they strike people who have given no consent, who may be completely unaware of danger and who may not even have been born at the time of the decision that led to their exposure. They are unusual, too, because we don’t know enough about the causes of cancer, birth defects and neurological and immunologic disorders to understand the real risks posed by most chemicals.

The National Academy of Sciences has found that most chemicals in commerce have not even been tested for many of these potential health effects. In fact, there are growing concerns of new neurologic and chemical sensitivity disorders of which almost nothing is known.

We are exposed to so many chemicals that there is literally no way of estimating the cumulative risks. Many chemicals also present synergistic effects in which exposure to two or more substances produces risks many times greater than the simple sum of the risks. Society has begun to see that the thousands of acceptable risks could add up to one unacceptable generic chemical danger.

The major justification for chemical risks, given all of the unknowns and uncertainties, is an overriding benefit to society. One might justify taking a one-in-a-million risk for a product that would make the nation more economically competitive or prevent many serious cases of illness. But such a risk may not be acceptable if it is to make plastic seats last a little longer, to make laundry 5 percent brighter or lawns a bit greener, or to allow apples to ripen more uniformly.

These are some of the reasons the public is unwilling to accept many of the risks being forced upon it by government and industry. There is no “mass hysteria” or “chemophobia.” There is growing awareness of the preciousness of human life, the banal nature of much of what industry is producing and the gross inadequacy of efforts to protect the public from long-term chemical hazards.

If the public is to regain confidence in the risk management process, industry and government must open up their own decision-making to public inquiry and input. The specific hazards and benefits of any chemical product or byproduct should be explained in plain language. Uncertainties that cannot be quantified must also be explained and given full consideration. And the process must include ethical and moral considerations such as those addressed above. These are issues to be decided by the public, not bureaucrats or corporate interests.

For industry and government to regain public support, they must stop blaming “ignorance” and overzealous public interest groups for the concern of the public and the media.

Rather, they should begin by better appreciating the tremendous responsibility they bear to our current and future generations, and by paying more attention to the real bottom line in our democracy: the honest, rational concerns of the average American taxpayer.

Why Are Environmentalists Taking Anti-Science Positions? (Yale e360)

22 OCT 2012

On issues ranging from genetically modified crops to nuclear power, environmentalists are increasingly refusing to listen to scientific arguments that challenge standard green positions. This approach risks weakening the environmental movement and empowering climate contrarians.

By Fred Pearce

From Rachel Carson’s Silent Spring to James Hansen’s modern-day tales of climate apocalypse, environmentalists have long looked to good science and good scientists and embraced their findings. Often we have had to run hard to keep up with the crescendo of warnings coming out of academia about the perils facing the world. A generation ago, biologist Paul Ehrlich’s The Population Bomb and systems analysts Dennis and Donella Meadows’ The Limits to Growth shocked us with their stark visions of where the world was headed. No wide-eyed greenie had predicted the opening of an ozone hole before the pipe-smoking boffins of the British Antarctic Survey spotted it when looking skyward back in 1985. On issues ranging from ocean acidification and tipping points in the Arctic to the dangers of nanotechnology, the scientists have always gotten there first — and the environmentalists have followed.

And yet, recently, the environment movement seems to have been turning up on the wrong side of the scientific argument. We have been making claims that simply do not stand up. We are accused of being anti-science — and not without reason. A few, even close friends, have begun to compare this casual contempt for science with the tactics of climate contrarians.

That should hurt.

Three current issues suggest that the risks of myopic adherence to ideology over rational debate are real: genetically modified (GM) crops, nuclear power, and shale gas development. The conventional green position is that we should be opposed to all three. Yet the voices of those with genuine environmental credentials, but who take a different view, are being drowned out by sometimes abusive and irrational argument.

In each instance, the issue is not so much which side environmentalists should be on, but rather the mind-set behind those positions and the tactics adopted to make the case. The wider political danger is that by taking anti-scientific positions, environmentalists end up helping the anti-environmental sirens of the new right.

The issue is not which side environmentalists should be on, but rather the mind-set behind their positions.

Most major environmental groups — from Friends of the Earth to Greenpeace to the Sierra Club — want a ban or moratorium on GM crops, especially for food. They fear the toxicity of these “Frankenfoods,” are concerned the introduced genes will pollute wild strains of the crops, and worry that GM seeds are a weapon in the takeover of the world’s food supply by agribusiness.

For myself, I am deeply concerned about the power of business over the world’s seeds and food supply. But GM crops are an insignificant part of that control, which is based on money and control of trading networks. Clearly there are issues about gene pollution, though research suggesting there is a problem is still very thin. Let’s do the research, rather than trash the test fields, which has been the default response of groups such as Greenpeace, particularly in my home country of Britain.

As for the Frankenfoods argument, the evidence is just not there. As the British former campaigner against GMs, Mark Lynas, points out: “Hundreds of millions of people have eaten GM-originated food without a single substantiated case of any harm done whatsoever.”

The most recent claim, published in September in the journal Food and Chemical Toxicology, that GM corn can produce tumors in rats, has been attacked as flawed in execution and conclusion by a wide range of experts with no axe to grind. In any event, the controversial study was primarily about the potential impact of Roundup, a herbicide widely used with GM corn, and not the GM technology itself.

Nonetheless, the reaction of some in the environment community to the reasoned critical responses of scientists to the paper has been to claim a global conspiracy among researchers to hide the terrible truth. One scientist was dismissed on the Web site GM Watch for being “a longtime member of the European Food Safety Authority, i.e. the very body that approved the GM corn in question.” That’s like dismissing the findings of a climate scientist because he sits on the Intergovernmental Panel on Climate Change — the “very body” that warned us about climate change. See what I mean about aping the worst and most hysterical tactics of the climate contrarians?

Stewart Brand wrote in his 2009 book Whole Earth Discipline: “I dare say the environmental movement has done more harm with its opposition to genetic engineering than any other thing we’ve been wrong about.” He will see nods of assent from members of a nascent “green genes” movement — among them environmentalist scientists, such as Pamela Ronald of the University of California at Davis — who say GM crops can advance the cause of sustainable agriculture by improving resilience to changing climate and reducing applications of agrochemicals.

Yet such people are routinely condemned as apologists for an industrial conspiracy to poison the world. Thus, Greenpeace in East Asia claims that children eating nutrient-fortified GM “golden rice” are being used as “guinea pigs.” And its UK Web site’s introduction to its global campaigns says, “The introduction of genetically modified food and crops has been a disaster, posing a serious threat to biodiversity and our own health.” Where, ask their critics, is the evidence for such claims?

The problem is the same in the energy debate. Many environmentalists who argue, as I do, that climate change is probably the big overarching issue facing humanity in the 21st century, nonetheless often refuse to recognize that nuclear power could have a role in saving us from the worst.

For environmentalists to fan the flames of fear of nuclear power seems reckless and anti-scientific.

Nuclear power is the only large-scale source of low-carbon electricity that is fully developed and ready for major expansion.

Yes, we need to expand renewables as fast as we can. Yes, we need to reduce further the already small risks of nuclear accidents and of leakage of fissile material into weapons manufacturing. But as George Monbiot, Britain’s most prominent environment columnist, puts it: “To abandon our primary current source of low carbon energy during a climate change emergency is madness.”

Monbiot attacks the gratuitous misrepresentation of the risks of radiation from nuclear plants. It is widely suggested, on the basis of a thoroughly discredited piece of Russian head-counting, that up to a million people were killed by the Chernobyl nuclear accident in 1986. In fact, it is far from clear that many people at all — beyond the 28 workers who received fatal doses while trying to douse the flames at the stricken reactor — actually died from Chernobyl radiation. Certainly, the death toll was nothing remotely on the scale claimed.

“We have a moral duty,” Monbiot says, “not to spread unnecessary and unfounded fears. If we persuade people that they or their children are likely to suffer from horrible and dangerous health problems, and if these fears are baseless, we cause great distress and anxiety, needlessly damaging the quality of people’s lives.”

Many people have a visceral fear of nuclear power and its invisible radiation. But for environmentalists to fan the flames — especially when it gets in the way of fighting a far more real threat, from climate change — seems reckless, anti-scientific and deeply damaging to the world’s climate future.

One sure result of Germany deciding to abandon nuclear power in the wake of last year’s Fukushima nuclear accident (calamitous, but any death toll will be tiny compared to that from the tsunami that caused it) will be rising carbon emissions from a revived coal industry. By one estimate, the end of nuclear power in Germany will result in an extra 300 million tons of carbon dioxide reaching the atmosphere between now and 2020 — more than the annual emissions of Italy and Spain combined.

Last, let’s look at the latest source of green angst: shale gas and the drilling technique of hydraulic fracturing, or fracking, used to extract it. There are probably good reasons for not developing shale gas in many places. Its extraction can pollute water and cause minor earth tremors, for instance. But at root this is an argument about carbon — a genuinely double-edged issue that needs debating. For there is a good environmental case to be made that shale gas, like nuclear energy, can be part of the solution to climate change. That case should be heard and not shouted down.

Opponents of shale gas rightly say it is a carbon-based fossil fuel. But it is a much less dangerous fossil fuel than coal. Carbon emissions from burning natural gas are roughly half those from burning coal. A switch from coal to shale gas is the main reason why, in 2011, U.S. CO2 emissions fell by almost 2 percent.

We cannot ignore that. With coal’s share of the world’s energy supply rising from 25 to 30 percent in the past half decade, a good argument can be made that a dash to exploit cheap shale gas and undercut this surge in coal would do more to cut carbon emissions than almost anything else. The noted environmental economist Dieter Helm of the University of Oxford argues just this in a new book, The Carbon Crunch, out this month.

But this is an unpopular argument. Carl Pope, executive director of the Sierra Club, was pilloried by activists for making the case that gas could be a “bridge fuel” to a low-carbon future. And when he stepped down, his successor condemned him for taking cash from the gas industry to fund the Sierra Club’s Beyond Coal campaign. Pope was probably wrong to take donations of that type, though some environment groups do such things all the time. But his real crime to those in the green movement seems to have been to side with the gas lobby at all.

Many environmentalists are imbued with a sense of their own exceptionalism and original virtue. But we have been dangerously wrong before. When Rachel Carson’s sound case against the mass application of DDT as an agricultural pesticide morphed into blanket opposition to much smaller indoor applications to fight malaria, it arguably resulted in millions of deaths as the diseases resurged.

And more recently, remember the confusion over biofuels? They were a new green energy source we could all support. I remember, when the biofuels craze began about 2005, I reported on a few voices urging caution. They warned that the huge land take of crops like corn and sugar cane for biofuels might threaten food supplies; that the crops would add to the destruction of rainforests; and that the carbon gains were often small to non-existent. But Friends of the Earth and others trashed them as traitors to the cause of green energy.

Well, today most greens are against most biofuels. Not least Friends of the Earth, which calls them a “big green con.” In fact, we may have swung too far in the other direction, undermining research into second-generation biofuels that could be both land- and carbon-efficient.

We don’t have to be slaves to science. There is plenty of room for raising questions about ethics and priorities that challenge the world view of the average lab grunt. And we should blow the whistle on bad science. But to indulge in hysterical attacks on any new technology that does not excite our prejudices, or to accuse genuine researchers of being part of a global conspiracy, is dishonest and self-defeating.

We environmentalists should learn to be more humble about our policy prescriptions, more willing to hear competing arguments, and less keen to engage in hectoring and bullying.

Forecasting Climate With A Chance Of Backlash (NPR)

by JENNIFER LUDDEN

February 19, 2013 3:14 AM

When it comes to climate change, Americans place great trust in their local TV weathercaster, which has led climate experts to see huge potential for public education.

The only problem? Polls show most weather presenters don’t know much about climate science, and many who do are fearful of talking about something so polarizing.

In fact, if you have heard a weathercaster speak on climate change, it’s likely been to deny it. John Coleman in San Diego and Anthony Watts of Watts Up With That? are among a group of vocal die-hards, cranking out blog posts and videos countering climate science. But even many meteorologists who don’t think it’s all a hoax still profoundly distrust climate models.

“They get reminded each and every day anytime their models don’t prove to be correct,” says Ed Maibach, who directs the Center for Climate Change Communication at George Mason University, and has carried out several surveys of TV weathercasters. “For them, the whole notion of projecting what the climate will be 30, 50, a hundred years from now, they’ve got a fairly high degree of skepticism.”

And yet, Maibach has found that many meteorologists would like to learn more and would like to educate their viewers. A few years back, he hatched a plan and found a willing partner in an unlikely place.

Prepared For Backlash

“I loved it. That’s exactly what I wanted to do,” says Jim Gandy, chief meteorologist at WLTX in Columbia, S.C.

Gandy had actually begun reading up on climate change several years earlier, when — to his surprise — a couple of geology professors at a party asked whether he thought global warming was real. Gandy was disturbed by what he learned and was game to go on air with it, even in what he calls a “dark red” state with a lot of “resistance” to the idea of climate change.

“We talked about it at length,” he says, “and we were prepared for a backlash.”

Researchers at George Mason University, with the help of Climate Central, tracked down information specific to Columbia, something many local meteorologists, with multiple weather reports a day, simply have no time to do. They also created graphics for Gandy to use whenever the local weather gave him a peg. And Gandy’s local TV station gave him something precious: 90 seconds of air time in the evening newscast.

The segment was called “Climate Matters,” and Gandy kicked it off in late July 2010. He dove in deep, packing his limited time with tidbits usually buried in scientific papers. One segment looked at what Columbia’s summer temperatures are projected to be in 40 years. Another explained how scientists can track man-made global warming, since carbon dioxide from fossil fuels has a specific chemical footprint.

Gandy also made the issue personal for viewers, reporting on how climate change will make pollen and poison ivy grow faster and more potent. He says people stopped him on the street about that.

And the backlash? There were a few cranky comments. “To my knowledge,” Gandy says, “there was at least one phone call from someone saying they needed to fire me.” But generally, the series went over well.

Better Informed Viewers

Meanwhile, researcher Ed Maibach polled people before Climate Matters began, then again a year into it. He says that, compared with viewers of other local stations, those who watched Jim Gandy gained a more scientifically grounded understanding of climate change: that it is largely caused by humans, that it is happening here and now, and that it is harmful.

“All of this is the kind of information that will help people, and help communities, make better decisions about how to adapt to a changing climate,” Maibach says.

Maibach hopes to expand the experiment, eventually making localized climate research and graphs available to meteorologists across the country. And there are other efforts to help weather forecasters become climate educators.

Last March, longtime Minnesota meteorologist Paul Douglas, founder of WeatherNationTV, posted an impassioned letter online urging his fellow Republicans to acknowledge that climate change is real.

“Other meteorologists actually emailed me and said, ‘Thanks for giving voice to something I’ve been thinking but was too afraid to say publicly,’ ” he says.

Douglas is part of a group pushing to tighten certification standards for meteorologists.

“If you’re going to talk about climate science on the air,” he says, you would “need to learn about the real science, and not get it off a talk show radio program or a website.”

After all, both he and Gandy say it’s becoming harder to separate weather from climate. That means TV weathercasters will be busier, and more closely followed, than ever.

Richard A. Muller: The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.
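The curve-matching exercise can be illustrated with a toy version. The series below is synthetic and the numbers are invented (this is not the Berkeley Earth code); the point is only the method of fitting the same temperature record with competing predictors and comparing residuals.

```python
import numpy as np

rng = np.random.default_rng(42)

years = np.arange(1850, 2013)

# Stylized CO2 history in ppm: slow rise before 1950, rapid rise after.
co2 = np.where(years < 1950, 280.0 + 0.1 * (years - 1850),
               290.0 + 1.2 * (years - 1950))

# "Observed" anomalies: a logarithmic greenhouse response plus weather noise.
temp = 3.0 * np.log(co2 / 280.0) + rng.normal(0, 0.02, years.size)

def rms_residual(design, y):
    """Least-squares fit of y on the design matrix; RMS of the residuals."""
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return float(np.sqrt(np.mean((y - design @ coef) ** 2)))

ones = np.ones(years.size)
rms_line = rms_residual(np.column_stack([ones, years]), temp)
rms_co2 = rms_residual(np.column_stack([ones, np.log(co2 / 280.0)]), temp)

print(f"straight line in time: RMS residual {rms_line:.3f} deg C")
print(f"log CO2 record:        RMS residual {rms_co2:.3f} deg C")
```

The CO2-based fit wins here by construction; Muller's point is that the same comparison, run against the real temperature record with solar activity, population, and simple functions as rivals, also favors the greenhouse-gas record.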

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

New Research Shows Complexity of Global Warming (Science Daily)

Jan. 30, 2013 — Global warming from greenhouse gases affects rainfall patterns in the world differently than that from solar heating, according to a study by an international team of scientists in the January 31 issue of Nature. Using computer model simulations, the scientists, led by Jian Liu (Chinese Academy of Sciences) and Bin Wang (International Pacific Research Center, University of Hawaii at Manoa), showed that global rainfall has increased less over the present-day warming period than during the Medieval Warm Period, even though temperatures are higher today than they were then.

Clouds over the Pacific Ocean. (Credit: Shang-Ping Xie)

The team examined global precipitation changes over the last millennium, along with projections to the end of the 21st century, comparing natural changes from solar heating and volcanism with changes from human-made greenhouse gas emissions. Using an atmosphere-ocean coupled climate model that realistically simulates both past and present-day climate conditions, the scientists found that, for every degree rise in global temperature, the global rainfall rate has increased about 40 percent less since the Industrial Revolution than it did during past warming phases of Earth.
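The quantity being compared here, the percent increase in rainfall per degree of warming, can be estimated from paired temperature and precipitation series with a simple regression. Below is a minimal sketch with entirely invented numbers, chosen only to echo the roughly 40 percent difference; it is not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(1)

def rainfall_sensitivity(temp, precip):
    """Slope of relative precipitation change (%) against temperature (deg C),
    estimated with an ordinary least-squares line."""
    return np.polyfit(temp, precip / precip.mean() * 100.0, 1)[0]

# Invented series for illustration (not the paper's model output):
# a solar-forced era where rainfall rises about 2.4% per degree, and a
# greenhouse-forced era where it rises about 1.4% per degree (roughly 40% less).
t = np.linspace(0.0, 1.0, 200) + 0.01 * rng.normal(0, 0.3, 200).cumsum()
p_solar = 100.0 * (1 + 0.024 * t) + rng.normal(0, 0.2, 200)
p_ghg = 100.0 * (1 + 0.014 * t) + rng.normal(0, 0.2, 200)

print(f"solar-forced era:      {rainfall_sensitivity(t, p_solar):.2f} % per deg C")
print(f"greenhouse-forced era: {rainfall_sensitivity(t, p_ghg):.2f} % per deg C")
```

The actual study derives these sensitivities from coupled-model simulations; the regression above only illustrates what "rainfall gain per degree" means as a statistic.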

Why does warming from solar heating and from greenhouse gases have such different effects on global precipitation?

“Our climate model simulations show that this difference results from different sea surface temperature patterns. When warming is due to increased greenhouse gases, the gradient of sea surface temperature (SST) across the tropical Pacific weakens, but when it is due to increased solar radiation, the gradient increases. For the same average global surface temperature increase, the weaker SST gradient produces less rainfall, especially over tropical land,” says co-author Bin Wang, professor of meteorology.

But why does warming from greenhouse gases and from solar heating affect the tropical Pacific SST gradient differently?

“Adding long-wave absorbers, that is heat-trapping greenhouse gases, to the atmosphere decreases the usual temperature difference between the surface and the top of the atmosphere, making the atmosphere more stable,” explains lead-author Jian Liu. “The increased atmospheric stability weakens the trade winds, resulting in stronger warming in the eastern than the western Pacific, thus reducing the usual SST gradient — a situation similar to El Niño.”

Solar radiation, on the other hand, heats Earth’s surface, increasing the usual temperature difference between the surface and the top of the atmosphere without weakening the trade winds. The result is that heating warms the western Pacific, while the eastern Pacific remains cool from the usual ocean upwelling.

“While during past global warming from solar heating the steeper tropical east-west SST pattern has won out, we suggest that with future warming from greenhouse gases, the weaker gradient and smaller increase in yearly rainfall rate will win out,” concludes Wang.

Journal Reference:

  1. Jian Liu, Bin Wang, Mark A. Cane, So-Young Yim, June-Yi Lee. Divergent global precipitation changes induced by natural versus anthropogenic forcing. Nature, 2013; 493 (7434): 656. DOI: 10.1038/nature11784

Understanding the Historical Probability of Drought (Science Daily)

Jan. 30, 2013 — Droughts can severely limit crop growth, causing yearly losses of around $8 billion in the United States. But it may be possible to minimize those losses if farmers can synchronize the growth of crops with periods of time when drought is less likely to occur. Researchers from Oklahoma State University are working to create a reliable “calendar” of seasonal drought patterns that could help farmers optimize crop production by avoiding days prone to drought.

Historical probabilities of drought, which can point to days on which crop water stress is likely, are often calculated using atmospheric data such as rainfall and temperatures. However, those measurements do not consider the soil properties of individual fields or sites.

“Atmospheric variables do not take into account soil moisture,” explains Tyson Ochsner, lead author of the study. “And soil moisture can provide an important buffer against short-term precipitation deficits.”

In an attempt to more accurately assess drought probabilities, Ochsner and co-authors Guilherme Torres and Romulo Lollato used 15 years of soil moisture measurements from eight locations across Oklahoma to calculate soil water deficits and determine the days on which dry conditions would be likely. Results of the study, which began as a student-led class research project, were published online Jan. 29 in Agronomy Journal. The researchers found that soil water deficits identified periods of likely plant water stress more successfully than the traditional atmospheric measurements proposed by previous research.

Soil water deficit is defined in the study as the difference between the capacity of the soil to hold water and the actual water content calculated from long-term soil moisture measurements. Researchers then compared that soil water deficit to a threshold at which plants would experience water stress and, therefore, drought conditions. The threshold was determined for each study site since available water, a factor used to calculate threshold, is affected by specific soil characteristics.
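The deficit-and-threshold logic just described can be sketched in a few lines. Everything below is invented for illustration (the field capacity, stress fraction, and soil moisture data); it is not the study's code or values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical site parameters, for illustration only.
FIELD_CAPACITY_MM = 150.0   # water the root-zone soil can hold at this site
STRESS_FRACTION = 0.5       # plants assumed stressed once the deficit exceeds
                            # half of capacity (in reality, site-specific)

def drought_probability(soil_water_mm):
    """soil_water_mm: (years, days) array of measured root-zone soil water.
    Returns, for each calendar day, the fraction of years in which the
    soil water deficit exceeded the stress threshold."""
    deficit = FIELD_CAPACITY_MM - soil_water_mm       # mm of missing water
    threshold = STRESS_FRACTION * FIELD_CAPACITY_MM
    return (deficit > threshold).mean(axis=0)

# 15 years x 365 days of made-up soil water: wet in winter and spring,
# driest around midsummer.
days = np.arange(365)
seasonal = 110.0 - 50.0 * np.sin((days - 80) / 365.0 * 2 * np.pi)
soil_water = seasonal + rng.normal(0, 20, (15, 365))

prob = drought_probability(soil_water)
print("highest drought risk around day", int(prob.argmax()))
```

Aggregating the exceedances year by year is what turns a soil moisture record into the "calendar" of drought probability the researchers describe.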

“The soil water contents differ across sites and depths depending on the sand, silt, and clay contents,” says Ochsner. “Readily available water is a site- and depth-specific parameter.”

Upon calculating soil water deficits and stress thresholds for the study sites, the research team compared their assessment of drought probability to assessments made using atmospheric data. They found that a previously developed method using atmospheric data often underestimated drought conditions, while soil water deficit measurements assessed drought probabilities more accurately and consistently. Therefore, the researchers suggest that soil water data be used whenever available to build a picture of the days on which drought conditions are likely.

If soil measurements are not available, however, the researchers recommend that the calculations used for atmospheric assessments be reconfigured to be more accurate. The authors made two such changes in their study. First, they decreased the threshold at which plants were deemed stressed, thus allowing a smaller deficit to be considered a drought condition. They also increased the number of days over which atmospheric deficits were summed. Those two changes provided estimates that better agreed with soil water deficit probabilities.

Further research is needed, says Ochsner, to optimize atmospheric calculations and provide accurate estimations for those without soil water data. “We are in a time of rapid increase in the availability of soil moisture data, but many users will still have to rely on the atmospheric water deficit method for locations where soil moisture data are insufficient.”

Regardless of the method used, Ochsner and his team hope that their research will help farmers better plan the cultivation of their crops and avoid costly losses to drought conditions.

Journal Reference:

  1. Guilherme M. Torres, Romulo P. Lollato, Tyson E. Ochsner. Comparison of Drought Probability Assessments Based on Atmospheric Water Deficit and Soil Water Deficit. Agronomy Journal, 2013; DOI: 10.2134/agronj2012.0295

Make climate change a priority (Washington Post)

Graphic: A new report prepared for the World Bank finds that the planet is on a path to warming 4 degrees by the end of the century, with devastating consequences.

By Jim Yong Kim, Published: January 24

Jim Yong Kim is president of the World Bank.

The weather in Washington has been like a roller coaster this January. Yes, there has been a deep freeze this week, but it was the sudden warmth earlier in the month that was truly alarming. Flocks of birds — robins, wrens, cardinals and even blue jays — swarmed bushes with berries, eating as much as they could. Runners and bikers wore shorts and T-shirts. People worked in their gardens as if it were spring.

The signs of global warming are becoming more obvious and more frequent. A glut of extreme weather conditions is appearing globally. And the average temperature in the United States last year was the highest ever recorded.

As economic leaders gathered in Davos this week for the World Economic Forum, much of the conversation was about finances. But climate change should also be at the top of our agendas, because global warming imperils all of the development gains we have made. If there is no action soon, the future will become bleak. The World Bank Group released a report in November that concluded that the world could warm by 7.2 degrees Fahrenheit (4 degrees Celsius) by the end of this century if concerted action is not taken now.

A world that warm means seas would rise 1.5 to 3 feet, putting at risk hundreds of millions of city dwellers globally. It would mean that storms once dubbed “once in a century” would become common, perhaps occurring every year. And it would mean that much of the United States, from Los Angeles to Kansas to the nation’s capital, would feel like an unbearable oven in the summer.

My wife and I have two sons, ages 12 and 3. When they grow old, this could be the world they inherit. That thought alone makes me want to be part of a global movement that acts now.

Even as global climate negotiations continue, there is a need for urgent action outside the conventions. People everywhere must focus on where we will get the most impact to reduce emissions and build resilience in cities, communities and countries.

Strong leadership must come from the six big economies that account for two-thirds of the energy sector’s global carbon dioxide emissions. President Obama’s reference in his inaugural address this week to addressing climate and energy could help reignite this critical conversation domestically and abroad.

The world’s top priority must be to get finance flowing and get prices right on all aspects of energy costs to support low-carbon growth. Achieving a predictable price on carbon that accurately reflects real environmental costs is key to delivering emission reductions at scale. Correct energy pricing can also provide incentives for investments in energy efficiency and cleaner energy technologies.

A second immediate step is to end harmful fuel subsidies globally, which could lead to a 5 percent fall in emissions by 2020. Countries spend more than $500 billion annually in fossil-fuel subsidies and an additional $500 billion in other subsidies, often related to agriculture and water, that are, ultimately, environmentally harmful. That trillion dollars could be put to better use for the jobs of the future, social safety nets or vaccines.

A third focus is on cities. The 100 largest cities, which contribute 67 percent of energy-related emissions, are both centers of innovation for green growth and among the places most vulnerable to climate change. We have seen great leadership, for example, in New York and Rio de Janeiro on low-carbon growth and tackling practices that fuel climate change.

At the World Bank Group, through the $7 billion-plus Climate Investment Funds, we are managing forests, spreading solar energy and promoting green expansion for cities, all with a goal of stopping global warming. We also are in the midst of a major reexamination of our own practices and policies.

Just as the Bretton Woods institutions were created to prevent a third world war, the world needs a bold global approach to help avoid the climate catastrophe it faces today. The World Bank Group is ready to work with others to meet this challenge. With every investment we make and every action we take, we should have in mind the threat of an even warmer world and the opportunity of inclusive green growth.

After the hottest year on record in the United States — a year in which Hurricane Sandy caused billions of dollars in damage, record droughts scorched farmland in the Midwest, and our organization reported that the planet could become more than 7 degrees warmer — what are we waiting for? We need to get serious fast. The planet, our home, can’t wait.

Climate Change Beliefs of Independent Voters Shift With the Weather (Science Daily)

Jan. 24, 2013 — There’s a well-known saying in New England that if you don’t like the weather here, wait a minute. When it comes to independent voters, those weather changes can just as quickly shift beliefs about climate change.

Predicted probability of “climate change is happening now, caused mainly by human activities” response as a function of temperature anomaly and political party. (Credit: Lawrence Hamilton and Mary Stampone/UNH)

New research from the University of New Hampshire finds that the climate change beliefs of independent voters are dramatically swayed by short-term weather conditions. The research was conducted by Lawrence Hamilton, professor of sociology and senior fellow at the Carsey Institute, and Mary Stampone, assistant professor of geography and the New Hampshire state climatologist.

“We find that over 10 surveys, Republicans and Democrats remain far apart and firm in their beliefs about climate change. Independents fall in between these extremes, but their beliefs appear weakly held — literally blowing in the wind. Interviewed on unseasonably warm days, independents tend to agree with the scientific consensus on human-caused climate change. On unseasonably cool days, they tend not to,” Hamilton and Stampone say.

Hamilton and Stampone used statewide data from about 5,000 random-sample telephone interviews conducted on 99 days over two and a half years (2010 to 2012) by the Granite State Poll. They combined the survey data with temperature and precipitation indicators derived from New Hampshire’s U.S. Historical Climatology Network (USHCN) station records. Survey respondents were asked whether they thought climate change is happening now, caused mainly by human activities. Alternatively, respondents could state that climate change is not happening, or that it is happening but mainly for natural reasons.

Unseasonably warm or cool temperatures on the interview day and previous day seemed to shift the odds of respondents believing that humans are changing the climate. However, when researchers broke these responses down by political affiliation (Democrat, Republican or independent), they found that temperature had a substantial effect on climate change views mainly among independent voters.

“Independent voters were less likely to believe that climate change was caused by humans on unseasonably cool days and more likely to believe that climate change was caused by humans on unseasonably warm days. The shift was dramatic. On the coolest days, belief in human-caused climate change dropped below 40 percent among independents. On the hottest days, it increased above 70 percent,” Hamilton says.
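A relationship of this shape is what a logistic regression of belief on the interview-day temperature anomaly would capture. Here is a sketch on synthetic survey data, with coefficients chosen only to echo the reported swing (below 40 percent on the coolest days, above 70 percent on the hottest); this is not the UNH analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_logistic(x, y, iters=25):
    """Plain Newton-Raphson logistic regression: P(y=1) = 1/(1+exp(-(a+b*x)))."""
    X = np.column_stack([np.ones_like(x), x])
    w = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (y - p)                        # gradient of log-likelihood
        hess = (X * (p * (1 - p))[:, None]).T @ X   # Fisher information
        w += np.linalg.solve(hess, grad)
    return w

# Hypothetical stand-in for the survey: 1,500 "independents", an interview-day
# temperature anomaly (deg F), and a yes/no belief response whose true odds
# were chosen to echo the reported swing.
anomaly = rng.uniform(-10.0, 10.0, 1500)
true_p = 1.0 / (1.0 + np.exp(-(0.2 + 0.13 * anomaly)))
belief = (rng.random(1500) < true_p).astype(float)

a, b = fit_logistic(anomaly, belief)
cool, warm = (1.0 / (1.0 + np.exp(-(a + b * t))) for t in (-8.0, 8.0))
print(f"predicted belief on an unseasonably cool day: {cool:.0%}")
print(f"predicted belief on an unseasonably warm day: {warm:.0%}")
```

The study's actual model also included party affiliation, demographics, and seasonal adjustments; the single-predictor fit above only shows the mechanics.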

New Hampshire’s self-identified independents generally resemble their counterparts on a nationwide survey that asked the same questions, according to the researchers. Independents comprise 18 percent of the New Hampshire estimation sample, compared with 17 percent nationally. They are similar with respect to education, but slightly older, and more balanced with respect to gender.

In conducting their analysis, the researchers took into account other factors such as education, age, and sex. They also made adjustments for the seasons, and for random variation between surveys that might be caused by nontemperature events.

Journal Reference:

  1. Lawrence C. Hamilton, Mary D. Stampone. Blowin’ in the wind: Short-term weather and belief in anthropogenic climate change. Weather, Climate, and Society, 2013; DOI: 10.1175/WCAS-D-12-00048.1

The Storm That Never Was: Why Meteorologists Are Often Wrong (Science Daily)

Jan. 24, 2013 — Have you ever woken up to a sunny forecast only to get soaked on your way to the office? On days like that it’s easy to blame the weatherman.

BYU engineering professor Julie Crockett studies waves in the ocean and the atmosphere. (Credit: Image courtesy of Brigham Young University)

But BYU mechanical engineering professor Julie Crockett doesn’t get mad at meteorologists. She understands something that very few people know: it’s not the weatherman’s fault he’s wrong so often.

According to Crockett, forecasters make mistakes because the models they use for predicting weather can’t accurately track highly influential elements called internal waves.

Atmospheric internal waves are waves that propagate between layers of low-density and high-density air. Though they are hard to describe, almost everyone has seen or felt them: cloud patterns made up of repeating lines are the result of internal waves, and airplane turbulence happens when internal waves run into each other and break.
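A bit of textbook background helps here. In a stably stratified atmosphere, internal waves oscillate at or below the buoyancy (Brunt-Vaisala) frequency N, which measures how strongly the stratification resists vertical displacement; a wave propagating at an angle from the horizontal oscillates at N times the cosine of that angle. The sketch below uses an idealized potential-temperature profile; the numbers are illustrative, not from Crockett's lab.

```python
import numpy as np

# Textbook relations (not the BYU model):
#   N = sqrt((g / theta) * d(theta)/dz),  omega = N * cos(angle)
# where theta is potential temperature and `angle` is measured from the
# horizontal, so no internal wave oscillates faster than N.
g = 9.81                                    # gravitational acceleration, m/s^2

z = np.linspace(0.0, 10_000.0, 101)         # height, m
theta = 290.0 + 0.003 * z                   # idealized potential temperature, K
                                            # (stable: rises about 3 K per km)

dtheta_dz = np.gradient(theta, z)
N = np.sqrt(g / theta * dtheta_dz)          # buoyancy frequency, rad/s

angle = np.deg2rad(60.0)                    # a steeply propagating wave
omega = N * np.cos(angle)

print(f"buoyancy period:       {2 * np.pi / N[0] / 60:.1f} minutes")
print(f"wave period at 60 deg: {2 * np.pi / omega[0] / 60:.1f} minutes")
```

With this profile the buoyancy period comes out near ten minutes, typical of the lower atmosphere; oscillations that fast and that local are exactly what coarse forecast models struggle to resolve.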

“Internal waves are difficult to capture and quantify as they propagate, deposit energy and move energy around,” Crockett said. “When forecasters don’t account for them on a small scale, then the large scale picture becomes a little bit off, and sometimes being just a bit off is enough to be completely wrong about the weather.”

One such example may have happened in 2011, when Utah meteorologists predicted an enormous winter storm prior to Thanksgiving. Schools across the state cancelled classes and sent people home early to avoid the storm. Though it’s impossible to say for sure, internal waves may have driven stronger circulations that broke the storm apart before it could materialize.

“When internal waves deposit their energy it can force the wind faster or slow the wind down such that it can enhance large scale weather patterns or extreme kinds of events,” Crockett said. “We are trying to get a better feel for where that wave energy is going.”

Internal waves also exist in the ocean, between layers of low-density and high-density water. Often visible from space, these waves affect the ocean’s general circulation and currents such as the Gulf Stream, much as their atmospheric counterparts interact with the jet stream.

Both oceanic and atmospheric internal waves carry a significant amount of energy that can alter climates.

Crockett’s latest wave research, which appears in a recent issue of the International Journal of Geophysics, details how the relationship between large-scale and small-scale internal waves influences the altitude where wave energy is ultimately deposited.

To track wave energy, Crockett and her students generate waves in a tank in her lab and study every aspect of their behavior. She and her colleagues are trying to pinpoint exactly how climate changes affect waves and how those waves then affect weather.

Based on this work, Crockett aims to develop a better linear wave model, incorporating both 2D and 3D modeling, that forecasters can use to improve their predictions.

“Understanding how waves move energy around is very important to large scale climate events,” Crockett said. “Our research is very important to this problem, but it hasn’t solved it completely.”

Journal Reference:

  1. B. Casaday, J. Crockett. Investigation of High-Frequency Internal Wave Interactions with an Enveloped Inertia Wave. International Journal of Geophysics, 2012; 2012: 1. DOI: 10.1155/2012/863792