Tag archive: Forecast

Germany Has Created An Accidental Empire (Social Europe)

25/03/2013 BY ULRICH BECK

Are we now living in a German Europe? In an interview with EUROPP editors Stuart A Brown and Chris Gilson, Ulrich Beck discusses German dominance of the European Union, the divisive effects of austerity policies, and the relevance of his concept of the ‘risk society’ to the current problems being experienced in the Eurozone.

How has Germany come to dominate the European Union?

Well it happened somehow by accident. Germany has actually created an ‘accidental empire’. There is no master plan; no intention to occupy Europe. It doesn’t have a military basis, so all the talk about a ‘Fourth Reich’ is misplaced. Rather it has an economic basis – it’s about economic power – and it’s interesting to see how in the anticipation of a European catastrophe, with fears that the Eurozone and maybe even the European Union might break down, the landscape of power in Europe has changed fundamentally.

First of all there’s a split between the Eurozone countries and the non-Eurozone countries. Suddenly for example the UK, which is only a member of the EU and not a member of the Eurozone, is losing its veto power. It’s a tragic comedy how the British Prime Minister is trying to tell us that he is still the one who is in charge of changing the European situation. The second split is that among the Eurozone countries there is an important division of power between the lender countries and the debtor countries. As a result Germany, the strongest economic country, has become the most powerful EU state.

Are austerity policies dividing Europe?

Indeed they are, in many ways. First of all we have a new line of division between northern European and southern European countries. Of course this is very evident, but the background from a sociological point of view is that we are experiencing the redistribution of risk from the banks, through the states, to the poor, the unemployed and the elderly. This is an amazing new inequality, but we are still thinking in national terms and trying to locate this redistribution of risk in terms of national categories.

At the same time there are two leading ideologies in relation to austerity policies. The first is pretty much based on what I call the ‘Merkiavelli’ model – by this I mean a combination of Niccolò Machiavelli and Angela Merkel. On a personal level, Merkel takes a long time to make decisions: she’s always waiting until some kind of consensus appears. But this kind of waiting makes the countries that depend on Germany’s decision realise that Germany actually holds the power. This deliberate hesitation is quite an interesting strategy in terms of the way that Germany has taken over economically.

The second element is that Germany’s austerity policies are not based simply on pragmatism, but also on underlying values. The German objection to countries spending more money than they have is a moral issue which, from a sociological point of view, ties in with the ‘Protestant Ethic’. It’s a perspective which has Martin Luther and Max Weber in the background. But this is not seen as a moral issue in Germany; instead it’s viewed as economic rationality. They don’t see it as a German way of resolving the crisis; they see it as if they are the teachers instructing southern European countries on how to manage their economies.

This creates another ideological split because the strategy doesn’t seem to be working so far and we see many forms of protest, of which Cyprus is the latest example. But on the other hand there is still a very important and powerful neo-liberal faction in Europe which continues to believe that austerity policies are the answer to the crisis.

Is the Eurozone crisis proof that we live in a risk society?

Yes, this is the way I see it. My idea of the risk society could easily be misunderstood because the term ‘risk’ actually signifies that we are in a situation to cope with uncertainty, but to me the risk society is a situation in which we are not able to cope with the uncertainty and consequences that we produce in society.

I make a distinction between ‘first modernity’ and our current situation. First modernity, which lasted from around the 18th century until perhaps the 1960s or 1970s, was a period where there was a great deal of space for experimentation and we had a lot of answers for the uncertainties that we produced: probability models, insurance mechanisms, and so on. But then because of the success of modernity we are now producing consequences for which we don’t have any answers, such as climate change and the financial crisis. The financial crisis is an example of the victory of a specific interpretation of modernity: neo-liberal modernity after the breakdown of the Communist system, which dictates that the market is the solution and that the more we increase the role of the market, the better. But now we see that this model is failing and we don’t have any answers.

We have to make a distinction between a risk society and a catastrophe society. A catastrophe society would be one in which the motto is ‘too late’: where we give in to the panic of desperation. A risk society in contrast is about the anticipation of future catastrophes in order to prevent them from happening. But because these potential catastrophes are not supposed to happen – the financial system could collapse, or nuclear technology could be a threat to the whole world – we don’t have the basis for experimentation. The rationality of calculating risk doesn’t work anymore. We are trying to anticipate something that is not supposed to happen, which is an entirely new situation.

Take Germany as an example. If we look at Angela Merkel, a few years ago she didn’t believe that Greece posed a major problem, or that she needed to engage with it as an issue. Yet now we are in a completely different situation because she has learned that if you look into the eyes of a potential catastrophe, suddenly new things become possible. Suddenly you think about new institutions, or about the fiscal compact, or about a banking union, because you anticipate a catastrophe which is not supposed to happen. This is a huge mobilising force, but it’s highly ambivalent because it can be used in different ways. It could be used to develop a new vision for Europe, or it could be used to justify leaving the European Union.

How should Europe solve its problems?

I would say that the first thing we have to think about is what the purpose of the European Union actually is. Is there any purpose? Why Europe and not the whole world? Why not do it alone in Germany, or the UK, or France?

I think there are four answers in this respect. First, the European Union is about enemies becoming neighbours. In the context of European history this actually constitutes something of a miracle. The second purpose of the European Union is that it can prevent countries from being lost in world politics. A post-European Britain, or a post-European Germany, is a lost Britain, and a lost Germany. Europe is part of what makes these countries important from a global perspective.

The third point is that we should not only think about a new Europe, we also have to think about how the European nations have to change. They are part of the process and I would say that Europe is about redefining the national interest in a European way. Europe is not an obstacle to national sovereignty; it is the necessary means to improve national sovereignty. Nationalism is now the enemy of the nation because only through the European Union can these countries have genuine sovereignty.

The fourth point is that European modernity, which has been distributed all over the world, is a suicidal project. It’s producing all kinds of basic problems, such as climate change and the financial crisis. It’s a bit like if a car company created a car without any brakes and it started to cause accidents: the company would take these cars back to redesign them and that’s exactly what Europe should do with modernity. Reinventing modernity could be a specific purpose for Europe.

Taken together these four points form what you could say is a grand narrative of Europe, but one basic issue is missing in the whole design. So far we’ve thought about things like institutions, law, and economics, but we haven’t asked what the European Union means for individuals. What do individuals gain from the European project? First of all I would say that, particularly in terms of the younger generation, more Europe is producing more freedom. It’s not only about the free movement of people across Europe; it’s also about opening up your own perspective and living in a space which is essentially grounded on law.

Second, European workers, but also students as well, are now confronted with the kind of existential uncertainty which needs an answer. Half of the best educated generation in Spanish and Greek history lack any future prospects. So what we need is a vision for a social Europe in the sense that the individual can see that there is not necessarily social security, but that there is less uncertainty. Finally we need to redefine democracy from the bottom up. We need to ask how an individual can become engaged with the European project. In that respect I have made a manifesto, along with Daniel Cohn-Bendit, called “We Are Europe”, arguing that we need a free year for everyone to do a project in another country with other Europeans in order to start a European civil society.

A more detailed discussion of the topics covered in this article is available in Ulrich Beck’s latest book, German Europe (Polity 2013). This interview was first published on EUROPP@LSE

Security Risks of Extreme Weather and Climate Change (Science Daily)

Feb. 11, 2013 — A Harvard researcher is pointing toward a new reason to worry about the effects of climate change — national security.

Hurricane Katrina. Predicted changes in extremes include more record high temperatures; fewer but stronger tropical cyclones; wider areas of drought and increases in precipitation; increased climate variability; Arctic warming and attendant impacts; and continued sea level rise as greenhouse warming continues and even accelerates. (Credit: NOAA)

A new report co-authored by Michael McElroy, the Gilbert Butler Professor of Environmental Studies, and D. James Baker, a former administrator of the National Oceanic and Atmospheric Administration, connects global climate change, extreme weather, and national security. During the next decade, the report concludes, climate change could have wide-reaching effects on everything from food, water, and energy supplies to critical infrastructure and economic security.

“Over the last century, the trend has been toward urbanization — to concentrate people in smaller areas,” McElroy said. “We’ve built an infrastructure — whether it’s where we build our homes or where we put our roads and bridges — that fits with that trend. If the weather pattern suddenly changes in a serious way, it could create very large problems. Bridges may be in the wrong place, or sea walls may not be high enough.”

Possible effects on critical infrastructure, however, only scratch the surface of the security concerns.

On an international scale, the report points to recent events, such as flooding in Pakistan and sustained drought in eastern Africa, that may be tied to changing weather patterns. How the United States responds to such disasters — whether by delivering humanitarian aid or through technical support — could affect security.

“By recognizing the immediacy of these risks, the U.S. can enhance its own security and help other countries do a better job of preparing for and coping with near-term climate extremes,” Baker said.

The report suggests that climate changes could even have long-reaching political effects.

It’s possible, McElroy said, that climate changes may have contributed to the uprisings of the Arab Spring by causing a rise in food prices, or that the extended drought in northern Mexico has contributed to political instability and a rise in drug trafficking in the region.

“We don’t have definitive answers, but our report raises these questions, because what we are saying is that these conditions are likely to be more normal than they were in the past,” McElroy said. “There are also questions related to sea-level rise. The conventional wisdom is that sea level is rising by a small amount, but observations show it’s rising about twice as fast as the models suggested. Could it actually go up by a large amount in a short period? I don’t think you can rule that out.”

Other potential effects, McElroy said, are tied to changes in an atmospheric circulation pattern called the Hadley circulation, in which warm tropical air rises, resulting in tropical rains. As the air moves to higher latitudes, it descends, causing the now-dry air to heat up. Regions where the hot, dry air returns to the surface are typically dominated by desert.

The problem, he said, is that evidence shows those arid regions are expanding.

“The observational data suggest that the Hadley circulation has expanded by several degrees in latitude,” McElroy said. “That’s a big deal, because if you shift where deserts are by just a few degrees, you’re talking about moving the southwestern desert into the grain-producing region of the country, or moving the Sahara into southern Europe.”

The report is the result of the authors’ involvement with Medea, a group of scientists who support the U.S. government by examining declassified national security data useful for scientific inquiry. In recent decades, the group has worked with officials in the United States and Russia to declassify data on climatic conditions in the Arctic and thousands of spy satellite images. Those images have been used to study ancient settlement patterns in the Middle East and changes in Arctic ice.

“I would be reluctant to say that our report is the last word on short-term climate change,” McElroy said. “Climate change is a moving target. We’ve done an honest, useful assessment of the state of play today, but we will need more information and more hard work to get it right. One of the recommendations in our report is the need for a serious investment in measurement and observation. It’s really important to keep doing that, otherwise we’re going to be flying blind.”

The study was conducted with funds provided by the Central Intelligence Agency. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the view of the CIA or the U.S. government.

Report: Climate Extremes: Recent Trends with Implications for National Security at www.environment.harvard.edu/climate-extremes

Experts say the pre-salt should bring economic and scientific benefits to Brazil (Jornal da Ciência)

JC e-mail 4665, 15 February 2013.

Viviane Monteiro

The country cannot miss the opportunity to use oil royalties to invest in education and scientific research

Despite the environmental risks, exploiting the oil in the pre-salt layer should, over the long term, secure new levels of economic, scientific and technological development for the country. That is the prevailing view among petroleum specialists and researchers at the Instituto Alberto Luiz Coimbra de Pós-Graduação e Pesquisa de Engenharia of the Universidade Federal do Rio de Janeiro (Coppe-UFRJ), the Universidade Federal do Espírito Santo (UFES) and the Escola Politécnica of the Universidade de São Paulo (Poli-USP).

In their view, Brazil cannot miss the opportunity to exploit the pre-salt, nor the opportunity to use the royalties from the oil extracted from this deep layer to invest in education and in scientific and technological research. One of the goals of these investments should be to produce clean, renewable energy to replace fossil fuel in the “post-oil” period, which is expected to arrive within roughly the next five decades.

With pre-salt exploitation in view, the director of technology and innovation at Coppe/UFRJ, Segen Estefen, says Brazil should become one of the world leaders in producing cutting-edge technologies both for oil exploration and for the development of clean, renewable energy. Pre-salt exploitation, he believes, represents a window of opportunity for Brazil to rank among the world’s largest oil producers, placing it in the leading “pack” of the Organization of the Petroleum Exporting Countries (OPEC).

Over the last two years the country has climbed from 18th to 13th place in the ranking of oil producers, according to the “Statistical Review of World Energy 2011” report by the British company British Petroleum (BP). With the discovery of the pre-salt deposits, estimates of national oil reserves have grown from 8 billion barrels, around 2006, to between 60 billion and 70 billion barrels today. Running the numbers, Segen calculates that these figures would represent revenue of US$ 4 trillion for the country, given the current price of oil (US$ 100 a barrel) – an amount similar to the country’s current Gross Domestic Product (GDP) of R$ 4.143 trillion in 2011.

In terms of oil reserves, researcher and professor Eustáquio Vinícius de Castro, of the Petroleum Laboratory at UFES, agrees that the pre-salt will place Brazil among the world’s five largest oil producers, alongside Saudi Arabia, the United States and Venezuela. “The technology to be developed for pre-salt exploitation should also be extended to other areas, above all the metal-mechanical and environmental chemistry industries,” he says.

As an example, Castro cites drilling equipment for ultra-deep areas capable of withstanding high pressures, which could also be used in civil construction, and the chemical agents (additives) that must be present in the equipment used to remove impurities and purify pre-salt oil. “These additives can also be used to purify wastewater generated by paint manufacturers and to clean up rivers or urban sewage,” he adds.

The Norwegian model – Another advocate of pre-salt exploitation, professor Ricardo Cabral de Azevedo, of the Department of Mining and Petroleum Engineering at Poli/USP, advises Brazil to adopt Norway’s model for extracting oil from the pre-salt layer and to avoid the so-called “Dutch disease”. “Other countries that have had large reserves to explore and produce are examples of what we should or should not do in Brazil,” he explains. “The Netherlands, for example, suffered what became known as ‘Dutch disease’ because its economy grew excessively dependent on oil. Norway, by contrast, transformed itself radically and is today one of the countries with the highest HDI [Human Development Index] in the world,” he recalls.

Until then, Norway had been one of the poorest countries in Europe, its finances depending mainly on commodity exports such as ores and canned fish. The turnaround in the Norwegian economy began in 1969, when large oil reserves were discovered in the North Sea and the revenue was channelled mainly into health and education. Today that European country has the third-highest per capita income in the world (US$ 59,300) and the highest HDI on the planet.

Royalties for education and ST&I – To meet the challenges of extracting oil from Brazil’s pre-salt layer, the specialists therefore stress the need to earmark a significant share of the revenue from this activity for education, science, technology and innovation, following the Norwegian model. This, indeed, is a cause championed by the scientific community, represented by the Sociedade Brasileira para o Progresso da Ciência (SBPC).

The specialists are unanimous in stating that the country needs to use the wealth of the pre-salt to reach new levels of development, make a leap in the quality of education and improve its human capital – remembering that one day the oil reserves will run out. “These are very large reserves, but they are finite,” warns Azevedo. “It is up to us to turn them into a permanent legacy by investing in education and in the development of our country,” he argues.

Estefen, Coppe/UFRJ’s director of technology and innovation, adds that the country needs to prepare the ground, in scientific and technological research, for the post-oil period. He considers it essential to secure investment that considerably expands research and scientific studies aimed at developing clean, renewable energy technologies, recalling that several countries are working to reduce emissions over the medium term. It is worth noting that oil is a fossil fuel that contributes significantly to the intensification of the greenhouse effect.

For the UFES researcher, Castro, who views favourably the proposal to create a pre-salt fund (a sovereign fund) – under which half of the revenue from the oil extracted from ultra-deep waters would be earmarked for education – pre-salt exploitation must be intelligent, with environmental responsibility and investment in education. “Oil brings a great deal of wealth, but it can also bring a great deal of poverty and environmental damage,” he notes. “That is why exploitation has to be carried out intelligently, with environmental responsibility and investment in education.” Today the oil wealth is distributed to states, municipalities and the federal government through royalties. Under current law the funds must be invested in the country’s social programmes, “but city governments make poor use of the money,” he says.

Exploiting the pre-salt requires scientific and technological effort, given that the reservoirs lie nearly seven thousand metres below sea level, most notably in the Santos (SP) and Campos (RJ) basins. To meet these challenges, Estefen says the country needs to mobilise the national scientific community and its available knowledge, create new laboratories, train human capital and generate quality jobs. “Extracting pre-salt oil will demand a major technological effort, an effort that will help Brazil reach new levels of development in the future,” he says. “That is, if we use the pre-salt resources well, we will educate our children and develop industry, science and technology. If it follows this prescription, Brazil should stand out on the international stage as one of the technological leaders, alongside the United States, Japan and the European countries,” he concludes.

Researchers weigh the environmental costs of deep-water exploitation

Weighing the benefits that pre-salt wealth could bring to the country against the possible environmental costs, researchers conclude that Brazil cannot unilaterally renounce deep-water oil exploitation, even while recognising that burning oil contributes to global warming. That does not mean the pre-salt exploitation process should disregard environmental damage.

Coppe/UFRJ’s director of technology and innovation, Segen Estefen, insists that all the research under way has environmental protection in view, in an effort to make operations safer. “It makes no sense for Brazil to benefit from oil for three or four decades and then leave the country in a bad situation environmentally,” he explains.

Today Coppe researchers, for example, work simultaneously on topics related to present-day oil production and on other technologies that could be used in the “post-oil” era. Among other things, they are studying electricity generation from ocean waves – a clean, renewable energy – taking advantage of the very infrastructure built and financed by the oil industry to develop knowledge for the post-oil period.

For the Coppe/UFRJ specialist, Brazil cannot renounce pre-salt oil because it “is an important source of wealth for Brazil”, being a competitive source of energy. Pre-salt extraction, he adds, should therefore bear positive fruit for the country. “In Brazil, which still has so much inequality, we cannot give up this wealth,” he says. “If it is not exploited, perhaps in 50 years’ time oil will not be worth half of what it is today.” For now, Estefen adds, there is no fuel capable of replacing oil, nor is one foreseen for roughly the next 20 years. Moreover, demand for this energy is likely to grow sharply as populations grow and as demand from other countries, especially in Asia, increases.

Expressing the same view, professor Ricardo Cabral de Azevedo, of the Department of Mining and Petroleum Engineering at the Escola Politécnica of the Universidade de São Paulo (Poli/USP), considers it ideal for the country to invest in the knowledge needed to replace fossil fuel use gradually, in an effort to minimise environmental impacts. “The fact is that there will always be risks, in this or any other activity, but human beings still need oil,” he notes. “So the essential thing is to try to reduce those risks as much as possible. Here, too, past experience is fundamental, so that we learn from the mistakes already made.”

The potential socioeconomic returns from oil exploitation in the pre-salt layer outweigh the environmental risks, in the view of researcher and professor Eustáquio Vinícius de Castro, of the Petroleum Laboratory at the Universidade Federal do Espírito Santo (UFES). “They are worth it, provided things are done intelligently and sustainably, with rationality in the production process,” he says. “Today the oil companies, which were more polluting in the past, build more safety into the oil extraction process, even if some problems still occur from time to time.”

Scale – The pre-salt oil layer covers an area roughly 800 km long by 200 km wide, following Brazil’s southeastern coastline. According to Petrobras, more than 80 wells have been drilled since 2006, in both the Santos and Campos basins, with an “exploratory success rate” above 80%. The estimate is that 19 new platforms will come into operation by 2016 and another 19 by 2020. According to its press office, the oil company, the leader in pre-salt exploitation, has also ordered 21 production platforms and 28 offshore drilling rigs to be built in the country by 2020, in addition to 49 tankers and hundreds of offshore support and service vessels.

Statistical Physics Offers a New Way to Look at Climate (Science Daily)

Mar. 5, 2013 — Statistical physics offers an approach to studying climate change that could dramatically reduce the time and brute-force computing that current simulation techniques require. The new approach focuses on fundamental forces that drive climate rather than on “following every little swirl” of water or air.

Two views, two approaches to simulation. Computer-generated images of a planet’s “zonal velocity” (the west-to-east component of wind) use direct numerical simulation (the traditional approach, left) and direct statistical simulation. The latter has limits, but its development is at a very early stage. (Credit: Marston lab/Brown University)

Scientists are using ever more complex models running on ever more powerful computers to simulate Earth’s climate. But new research suggests that basic physics could offer a simpler and more meaningful way to model key elements of climate.

The research, published in the journal Physical Review Letters, shows that a technique called direct statistical simulation does a good job of modeling fluid jets, fast-moving flows that form naturally in oceans and in the atmosphere. Brad Marston, professor of physics at Brown University and one of the authors of the paper, says the findings are a key step toward bringing powerful statistical models rooted in basic physics to bear on climate science.

In addition to the Physical Review Letters paper, Marston will report on the work at a meeting of the American Physical Society to be held in Baltimore later this month.

The method of simulation used in climate science now is useful but cumbersome, Marston said. The method, known as direct numerical simulation, amounts to taking a modified weather model and running it through long periods of time. Moment-to-moment weather — rainfall, temperatures, wind speeds at a given moment, and other variables — is averaged over time to arrive at the climate statistics of interest. Because the simulations need to account for every weather event along the way, they are mind-bogglingly complex, take a long time to run, and require the world’s most powerful computers.

Direct statistical simulation, on the other hand, is a new way of looking at climate. “The approach we’re investigating,” Marston said, “is the idea that one can directly find the statistics without having to do these lengthy time integrations.”

It’s a bit like the approach physicists use to describe the behavior of gases.

“Say you wanted to describe the air in a room,” Marston said. “One way to do it would be to run a giant supercomputer simulation of all the positions of all of the molecules bouncing off of each other. But another way would be to develop statistical mechanics and find that the gas actually obeys simple laws you can write down on a piece of paper: PV=nRT, the gas equation. That’s a much more useful description, and that’s the approach we’re trying to take with the climate.”

Conceptually, the technique focuses attention on fundamental forces driving climate, instead of “following every little swirl,” Marston said. A practical advantage would be the ability to model climate conditions from millions of years ago without having to reconstruct the world’s entire weather history in the process.
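
The division of labour between the two approaches can be illustrated with a toy problem that is not from the paper: a stochastic process whose long-run statistics are known in closed form. The "direct numerical simulation" route integrates every fluctuation and then averages over time; the "direct statistical simulation" route writes the statistics down directly. The process, parameters and code below are illustrative assumptions only.

```python
import numpy as np

# Toy analogue (not from the paper): an Ornstein-Uhlenbeck process,
#   dx = -theta * x * dt + sigma * dW,
# whose stationary statistics are known exactly: mean 0, variance sigma^2 / (2 * theta).
theta, sigma = 0.5, 1.0
dt, n_steps = 0.01, 500_000

# "Direct numerical simulation" style: integrate every fluctuation, then average over time.
rng = np.random.default_rng(0)
x = 0.0
samples = np.empty(n_steps)
for i in range(n_steps):
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    samples[i] = x
print("time-averaged variance:", samples.var())

# "Direct statistical simulation" style: write the long-run statistics down directly.
print("closed-form variance  :", sigma**2 / (2 * theta))
```

In the climate problem the statistics of interest are far richer (mean flows, eddy fluxes, jet positions and strengths), but the trade-off is the same: the traditional route pays for every time step along the way, while the statistical route targets the long-run statistics themselves.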

The theoretical basis for direct statistical simulation has been around for nearly 50 years. The problem, however, is that the mathematical and computational tools to apply the idea to climate systems aren’t fully developed. That is what Marston and his collaborators have been working on for the last few years, and the results in this new paper show their techniques have good potential.

The paper, which Marston wrote with University of Leeds mathematician Steve Tobias, investigates whether direct statistical simulation is useful in describing the formation and characteristics of fluid jets, narrow bands of fast-moving fluid that move in one direction. Jets form naturally in all kinds of moving fluids, including atmospheres and oceans. On Earth, atmospheric jet streams are major drivers of storm tracks.

For their study, Marston and Tobias simulated the jets that form as a fluid moves on a hypothetical spinning sphere. They modeled the fluid using both the traditional numerical technique and their statistical technique, and then compared the output of the two models. They found that the models generally arrived at similar values for the number of jets that would form and the strength of the airflow, demonstrating that statistical simulation can indeed be used to model jets.

There were limits, however, to what the statistical model could do. The study found that as the pace at which energy was added to and removed from the fluid system increased, the statistical model started to break down. Marston and Tobias are currently working on an expansion of their technique to deal with that problem.

Despite the limitation, Marston is upbeat about the potential for the technique. “We’re very pleased that it works as well as it did here,” he said.

Since completing the study, Marston has integrated the method into a computer program called “GCM” that he has made easily available via Apple’s Mac App Store for other researchers to download. The program allows users to build their own simulations, comparing numerical and statistical models. Marston expects that researchers who are interested in this field will download it and play with the technique on their own, providing new insights along the way. “I’m hoping that citizen-scientists will also explore climate modeling with it as well, and perhaps make a discovery or two,” he said.

There’s much more work to be done on this, Marston stresses, both in solving the energy problem and in scaling the technique to model more realistic climate systems. At this point, the simulations have only been applied to hypothetical atmospheres with one or two layers. Earth’s atmosphere is a bit more complex than that.

“The research is at a very early stage,” Marston said, “but it’s picking up steam.”

Related app: http://www.brown.edu/Research/Environmental_Physics/Environmental_Physics/Code.html

Journal Reference:

  1. S. M. Tobias, J. B. Marston. Direct Statistical Simulation of Out-of-Equilibrium Jets. Physical Review Letters, 2013. DOI: 10.1103/PhysRevLett.110.104502

Ten Times More Hurricane Surges in Future, New Research Predicts (Science Daily)

Mar. 18, 2013 — By examining the frequency of extreme storm surges in the past, previous research has shown that there was an increasing tendency for hurricane storm surges when the climate was warmer. But how much worse will it get as temperatures rise in the future? How many extreme storm surges like that from Hurricane Katrina, which hit the U.S. coast in 2005, will there be as a result of global warming? New research from the Niels Bohr Institute shows that there will be a tenfold increase in frequency if the climate becomes two degrees Celsius warmer.

The results are published in the scientific journal Proceedings of the National Academy of Sciences (PNAS).

The extreme storm surge from Superstorm Sandy in the autumn of 2012 flooded large sections of New York and other coastal cities in the region. New research shows that such hurricane surges will become more frequent in a warmer climate. (Credit: © Leonard Zhukovsky / Fotolia)

Tropical cyclones arise over warm ocean surfaces with strong evaporation and warming of the air. They typically form in the Atlantic Ocean and move towards the U.S. East Coast and the Gulf of Mexico. To estimate the frequency of tropical cyclones in a future with a warmer global climate, researchers have developed various models. One is based on the regional sea temperatures, while another is based on differences between the regional sea temperatures and the average temperatures in the tropical oceans. There is considerable disagreement among researchers about which is best.

New model for predicting cyclones

“Instead of choosing between the two methods, I have chosen to use temperatures from all around the world and combine them into a single model,” explains climate scientist Aslak Grinsted, Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.

He takes into account the individual statistical models and weights them according to how good they are at explaining past storm surges. In this way he can check that the combined model reflects the known physical relationships – for example, how the El Niño phenomenon affects the formation of cyclones. The research was performed in collaboration with colleagues from China and England.
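
The general idea of weighting several candidate models by how well they explain the historical record can be sketched in a few lines. Everything below – the model names, the hindcast and projection numbers, and the use of inverse mean-squared error as the skill weight – is an assumption for illustration only, not the authors' actual procedure; it shows the shape of the calculation rather than reproducing it.

```python
import numpy as np

# Illustrative sketch only: hypothetical surge counts per decade and three toy models.
past_obs = np.array([3.0, 5.0, 2.0, 6.0, 4.0])
hindcasts = {
    "regional_sst":   np.array([2.5, 4.5, 2.5, 5.5, 4.0]),
    "relative_sst":   np.array([4.0, 6.5, 1.0, 7.5, 3.0]),
    "combined_model": np.array([3.2, 5.1, 2.1, 5.8, 4.1]),
}
projections = {"regional_sst": 8.0, "relative_sst": 12.0, "combined_model": 9.0}

# Weight each model by how well it explains the past record (here: inverse mean squared error).
raw = {name: 1.0 / np.mean((h - past_obs) ** 2) for name, h in hindcasts.items()}
total = sum(raw.values())
weights = {name: w / total for name, w in raw.items()}

# The combined projection is the skill-weighted average of the individual projections.
combined = sum(weights[name] * projections[name] for name in weights)
print("weights:", {k: round(v, 3) for k, v in weights.items()})
print("combined projection:", round(combined, 2))
```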

The statistical models are used to predict the number of hurricane surges 100 years into the future. How much worse will it be per degree of global warming? How many ‘Katrinas’ will there be per decade?

Since 1923, there has been a ‘Katrina’ magnitude storm surge every 20 years.

10 times as many ‘Katrinas’

“We find that 0.4 degrees Celsius warming of the climate corresponds to a doubling of the frequency of extreme storm surges like the one following Hurricane Katrina. With the global warming we have had during the 20th century, we have already crossed the threshold where more than half of all ‘Katrinas’ are due to global warming,” explains Aslak Grinsted.

“If the temperature rises an additional degree, the frequency will increase by 3-4 times and if the global climate becomes two degrees warmer, there will be about 10 times as many extreme storm surges. This means that there will be a ‘Katrina’ magnitude storm surge every other year,” says Aslak Grinsted, who points out that in addition to there being more extreme storm surges, the sea will also rise due to global warming. As a result, the storm surges will become worse and potentially more destructive.
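
Combined with the historical baseline quoted above – one ‘Katrina’-magnitude surge roughly every 20 years since 1923 – the tenfold increase translates directly into the “every other year” figure. As a simple return-period check using only the numbers stated in the article:

$$ T_{\text{new}} = \frac{T_{\text{baseline}}}{\text{frequency multiplier}} = \frac{20\ \text{years}}{10} = 2\ \text{years} $$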

Journal Reference:

  1. Aslak Grinsted, John C. Moore, and Svetlana Jevrejeva. Projected Atlantic hurricane surge threat from rising temperatures. PNAS, March 18, 2013. DOI: 10.1073/pnas.1209980110

Five days to go until a global alert for environmental awareness (WWF/Envolverde)

18/3/2013 – 10h50

by the WWF Brasil editorial staff

Next Saturday, 23 March, at 8:30 p.m. (local time), thousands, perhaps billions of people will turn off the lights in their homes, shops, offices, monuments and other important public places in a symbolic act of warning against climate change.

This is the ninth edition of Earth Hour (Hora do Planeta), a movement that began modestly in Australia and today involves thousands of cities in more than 152 countries. In Brazil, Earth Hour is organised by WWF-Brasil, and the goal is to beat last year’s numbers by bringing all the state capitals and the Federal District into Earth Hour and surpassing the 131 cities that took part in 2012.

Your city can follow the example of the many that have already joined by registering via e-mail to cidades@wwf.org.br and signing the participation agreement (Termo de Adesão). Schools and other institutions should not be left out either.

Your personal participation is essential to the success of Earth Hour. We can no longer consume the equivalent of one and a half planets’ worth of resources to sustain ourselves on Earth. Adopt sustainable practices now, and keep them after 23 March. Recycle. Reduce. Reuse. Simple changes in your lifestyle can have a major global impact. Avoid wasting water and energy. Turn to alternative sources such as solar and wind, if possible. Use the car less and choose public transport, cycling or walking. Eat less red meat. Buy local products, organic whenever possible. Learn more about the issue. Our common future is at risk, and the climate change now under way threatens all life on Earth. Your awareness is the greatest weapon against it.

Follow Earth Hour in Brazil through our channels on Facebook, Twitter and YouTube. Spread our message. And check out the “Eu vou se você for” (“I’ll go if you go”) challenges – people proposing alternatives so that everyone can adopt a more ecologically sound (and, as a bonus, healthier!) lifestyle.

Join us. Earth Hour has already begun, and it cannot end when the lights come back on on Saturday.

* Originally published on the WWF Brasil website.

There is no shortage of warnings: watch out for the climate (Envolverde)

Environment

18/3/2013 – 11h05

by Washington Novaes*

It must be said again and again: Brazil’s large cities – but not only they – urgently need climate policies that equip them to deal effectively with “natural disasters”, which are becoming ever more frequent and intense and are claiming an ever-growing number of dead and other victims; they need to pull out of the back of the drawer the projects that would prevent flooding in urban areas; draw up master plans that incorporate the new information in this field; revise building standards that are now obsolete, conceived in other times for much milder climatic conditions and proving ever more vulnerable to collapse; and bring the universities into this search for scientific and technological solutions.

According to this newspaper (21/2), of the 12 places flooded in a single week last month in the city of São Paulo, 11 had already been suffering floods for 20 years – among them some of the points with the heaviest vehicle and pedestrian traffic, such as the Vale do Anhangabaú, Avenida 23 de Maio and Rua Turiaçu. The São Paulo city government is promising to dust off 79 flood-control works, some of them shelved for 15 years. Unbelievable. The state government says it will work on 14 retention reservoirs (another 30 will fall to public-private partnerships), in addition to spending a further R$ 317 million on dredging the Tietê River, on which R$ 1.7 billion has already been spent (it will have to spend far more until it decides to act on the dozens of tributaries of the river beneath the asphalt, which carry sediment, garbage, sewage and so on). The people of São Paulo will be very grateful – they and the 1 million people who enter and leave the city every day (Estado, 27/2).

Until there is vigorous action on the climate and on revising building standards everywhere, we will go on as we have in recent weeks: an irregular construction project causes a building to collapse in Liberdade and kills a pedestrian (1/3); a 20-storey building collapses in Rio and drags down two others, with 22 dead (25/1); the collapse of slabs in a 13-storey building under construction in São Bernardo do Campo kills two people (6/2); a flood at a factory kills four in Sorocaba; flooding in Rio kills five people (8/3); a man saves three people and dies together with a student, swept away by the torrent during a five-hour storm in Ipiranga, when a third of the month’s forecast rain fell and the Tamanduateí overflowed (11/3); a landslide on the modern Rodovia dos Imigrantes kills one person and halts traffic (22/2), in a rainfall of 183.4 millimetres, several times the region’s average monthly total. Even the Arquivo Nacional, in Rio de Janeiro, lost more than 130 boxes of historical documents in a storm in the city centre (10/3).

There can be no illusions. Brazil already ranks fifth among the countries that have suffered most from climate disasters. Last October the semi-arid Northeast had its driest month in 83 years, according to the National Electricity System Operator (Estado, 31/10); 10 million people were affected in more than 1,300 municipalities. The Intergovernmental Panel on Climate Change, the body linked to the Climate Convention, will release only part of its new report this year, but its chairman, Rajendra Pachauri, is already warning that it is necessary to “spread the concern”, since, with a temperature increase of between 2 and 2.4 degrees Celsius by 2050, sea levels will rise between 0.4 and 1.4 metres – and possibly more, as melting advances in the Arctic (Guardian, 28/2).

It is no accident, then, that the public school system in the United States has this year incorporated climate issues into its student curriculum. Or that the Council of the European Union has decided that 20% of its €960 billion budget should go to policies and actions in this area. Because the information is deeply worrying. Such as the finding by the United Nations Food and Agriculture Organization (9/3) that the area of land affected by drought worldwide has doubled since 1970; or that emissions of carbon dioxide (CO2) from deforestation, agriculture and other sources grew sharply between 1990 and 2010 – with Brazil accounting for 25.8 billion tonnes of CO2 equivalent, followed by Indonesia (13.1 billion tonnes) and Nigeria (3.8 billion).

Climate problems, says the University of Reading (1/3), mean that agricultural productivity will have to rise by 12% from 2016 onwards to compensate for losses and for changes in the environment. Vegetation at the more northerly latitudes of America is changing and beginning to resemble that of areas further south, according to NASA (UPI, 12/3), which analysed the period 1982-2011 and notes that farming will have to adapt. There are also very sharp changes in other regions, such as the Tigris and Euphrates rivers, which in seven years (2003-2010) lost 144 cubic kilometres of water, equivalent to the volume of the Dead Sea (O Globo, 14/2).

Everywhere the news is unsettling. Universities in Florida, for example (Huffpost Miami, 12/3), warn that three large sewage treatment plants in the south of the state will have to be relocated to prevent them from being “confined to islands” within less than 50 years as the sea level rises. Admiral Samuel J. Locklear III, commander of US forces in the Pacific, says this rise in sea level “is the greatest threat to security”, and that China and India need to prepare to rescue and evacuate hundreds of thousands, or millions, of people.

To return to the opening of this article: Brazilian cities cannot put off confronting climate change, above all where flooding and landslides are concerned (Brazil has more than 5 million people living in at-risk areas). According to the magazine New Scientist (20/10/2012), 32,000 people died worldwide between 2004 and 2010 in events of this kind (80,000 in earthquakes). There is no shortage of warnings.

Washington Novaes is a journalist.

* Originally published on the O Estado de S. Paulo website.

Obama Will Use Nixon-Era Law to Fight Climate Change (Bloomberg)

By Mark Drajem – Mar 15, 2013 12:50 PM GMT-0300

President Barack Obama is preparing to tell all federal agencies for the first time that they should consider the impact on global warming before approving major projects, from pipelines to highways.

The result could be significant delays for natural gas-export facilities, ports for coal sales to Asia, and even new forest roads, industry lobbyists warn.

“It’s got us very freaked out,” said Ross Eisenberg, vice president of the National Association of Manufacturers, a Washington-based group that represents 11,000 companies such as Exxon Mobil Corp. (XOM) and Southern Co. (SO). The standards, which constitute guidance for agencies and not new regulations, are set to be issued in the coming weeks, according to lawyers briefed by administration officials.

In taking the step, Obama would be fulfilling a vow to act alone in the face of a Republican-run House of Representatives unwilling to pass measures limiting greenhouse gases. He’d expand the scope of a Nixon-era law that was first intended to force agencies to assess the effect of projects on air, water and soil pollution.

“If Congress won’t act soon to protect future generations, I will,” Obama said last month during his State of the Union address. He pledged executive actions “to reduce pollution, prepare our communities for the consequences of climate change, and speed the transition to more sustainable sources of energy.”

Illinois Speech

The president is scheduled to deliver a speech on energy today at the Argonne National Laboratory in Lemont, Illinois. He is pressing Congress to create a $2 billion clean-energy research fund with fees paid by oil and gas producers.

While some U.S. agencies already take climate change into account when assessing projects, the new guidelines would apply across the board to all federal reviews. Industry lobbyists say they worry that projects could be tied up in lawsuits or administrative delays.

For example, Ambre Energy Ltd. is seeking a permit from the Army Corps of Engineers to build a coal-export facility at the Port of Morrow in Oregon. Under existing rules, officials weighing approval would consider whether ships in the port would foul the water or generate air pollution locally. The Environmental Protection Agency and activist groups say that review should be broadened to account for the greenhouse gases emitted when exported coal is burned in power plants in Asia.

Keystone Pipeline

Similar analyses could be made for the oil sands that would be transported in TransCanada Corp. (TRP)’s Keystone XL pipeline, and leases to drill for oil, gas and coal on federal lands, such as those for Arch Coal Inc. (ACI) and Peabody Energy Corp. (BTU)

If the new White House guidance is structured correctly, it will require just those kinds of lifecycle reviews, said Bill Snape, senior counsel at the Center for Biological Diversity in Washington. The environmental group has sued to press for this approach, and Snape says lawsuits along this line are certain if the administration approves the Keystone pipeline, which would transport oil from Canada’s tar sands to the U.S. Gulf Coast.

“The real danger is the delays,” said Eisenberg of the manufacturers’ group. “I don’t think the answer is ever going to be ‘no,’ but it can confound things.”

Lawyers and lobbyists are now waiting for the White House’s Council on Environmental Quality to issue the long bottled-up standards for how agencies should address climate change under the National Environmental Policy Act, signed into law by President Richard Nixon in 1970.

Environmental Impact

NEPA requires federal agencies to consider and publish the environmental impact of their actions before making decisions. Those reviews don’t mandate a specific course of action. They do provide a chance for citizens and environmentalists to weigh in before regulators decide on an action — and to challenge those reviews in court once a project is cleared.

“Each agency currently differs in how their NEPA reviews consider the climate change impacts of projects, as well as how climate change impacts such as extreme weather will affect projects,” Taryn Tuss, a Council on Environmental Quality spokeswoman, said in an e-mail. “CEQ is working to incorporate the public input we received on the draft guidance, and will release updated guidance when it is completed.”

‘Major Shakeup’

The new standards will be “a major shakeup in how agencies conduct NEPA” reviews, said Brendan Cummings, senior counsel for the Center for Biological Diversity in San Francisco.

The White House is looking at requiring consideration of both the increase in greenhouse gases and a project’s vulnerability to flooding, drought or other extreme weather that might result from global warming, according to an initial proposal it issued in 2010. Those full reports would be required for projects with 25,000 metric tons of carbon dioxide equivalent emissions or more per year, the equivalent of burning about 100 rail cars of coal.
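
The rail-car comparison is easy to sanity-check with rough figures that are assumptions here, not from the article: a typical hopper car carries on the order of 100 metric tons of coal, and burning a ton of coal releases roughly 2.5 tons of CO2, so

$$ 100\ \text{cars} \times 100\ \tfrac{\text{t coal}}{\text{car}} \times 2.5\ \tfrac{\text{t CO}_2}{\text{t coal}} \approx 25{,}000\ \text{t CO}_2\ \text{per year.} $$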

The initial draft exempted federal land and resource decisions from the guidance, although CEQ said it was assessing how to handle those cases. Federal lands could be included in the final standards.

The White House guidance itself won’t force any projects to be stopped outright. Instead, it’s likely to prompt lawsuits against federal projects on these grounds, and increase the probability that courts will step in and order extensive reviews as part of the “adequate analysis” required in the law, said George Mannina, an attorney at Nossaman LLP in Washington.

Next Administration

“The question is: Where does this analysis take us?” he said. “Adequate analysis may be much broader than the agency and applicant might consider.”

While the Obama administration’s guidance could be easily rescinded by the next administration, the court rulings that stem from these cases will live on as precedents, Mannina said.

Lobbying groups such as the U.S. Chamber of Commerce, American Petroleum Institute and the National Mining Association weighed in with the White House against including climate in NEPA, a law initially aimed at chemical leaks or air pollution.

“Not only will this result in additional delay of the NEPA process, but will result in speculative and inaccurate modeling that will have direct impacts on approval of specific projects,” the National Mining Association in Washington wrote in comments to the White House in 2010.

Leases Challenged

The group represents Arch Coal (ACI) and Peabody, both based in St. Louis. Leases that the Department of the Interior issued for those companies to mine for coal in Wyoming are facing lawsuits from environmental groups, arguing that the agency didn’t adequately tally up the effect on global warming from burning that coal.

Given Obama’s pledge to address global warming, “this is a massive contradiction,” said Jeremy Nichols, director of climate at WildEarth Guardians in Denver, which filed lawsuits against the leases.

Arch Coal referred questions to the mining group.

Beth Sutton, a Peabody spokeswoman, said in an e-mail, “We believe the current regulatory approach to surface mine permits is appropriate and protects the environment.”

Since CEQ first announced its proposal, more than three dozen federal approvals were challenged on climate grounds, including a highway project in North Carolina, a methane-venting plan for a coal mine in Colorado, and a research facility in California, according to a chart compiled by the Center for Climate Change Law at Columbia University.

Next Target

The next target is TransCanada (TRP)’s application to build the 1,661-mile (2,673-kilometer) Keystone pipeline. The Sierra Club and 350.org drew 35,000 people to Washington last month to urge Obama to reject the pipeline. Meanwhile, the NEPA review by the State Department included an initial analysis of carbon released when the tar sands are refined into gasoline and used in vehicles.

It stopped short, however, of saying the project would result in an increase in greenhouse gas emissions. With or without the pipeline, the oil sands will be mined and used as fuel, the report said. That finding is likely to be disputed in court if the Obama administration clears the project.

“Keystone is ground zero,” said Snape, of the Center for Biological Diversity. “Clearly this will come into play, and it will be litigated.”

Any actions by the administration now on global warming would pick up on a mixed record over the past four years.

Cap-and-Trade

While Obama failed to get Congress to pass cap-and-trade legislation, the EPA reversed course from the previous administration and ruled that carbon-dioxide emissions endanger public health, opening the way for the agency to regulate it.

Using that finding, the agency raised mileage standards for automobiles and proposed rules for new power plants that would essentially outlaw the construction of new coal-fired power plants that don’t have expensive carbon-capture technology.

Environmentalists such as the Natural Resources Defense Council say the most important action next will be the EPA’s rules for existing power plants, the single biggest source of carbon-dioxide emissions. The NEPA standards are separate from those rules, and will affect how the federal government itself is furthering global warming.

“Agencies do a pretty poor job of looking at climate change impacts,” said Rebecca Judd, a legislative counsel at the environmental legal group Earthjustice in Washington. “A thorough guidance would help alleviate that.”

Social Warfare (Foreign Policy)

Budget hawks’ plans to cut funding for political and social science aren’t just short-sighted and simple-minded — they’ll actually hurt national security.

BY SCOTT ATRAN | MARCH 15, 2013

With the automatic sequestration cuts geared up to slash billions of dollars from domestic programs, military funding, social services, and government-sponsored scientific research — including about a 6 percent reduction for the National Institutes of Health (NIH) and the National Science Foundation (NSF) — policymakers and professionals are scrambling to stave off the worst by resetting priorities. In a major speech last month, House Majority Leader Eric Cantor (R-VA) proposed outright to defund political and social science: “Funds currently spent by the government on social science — including on politics of all things — would be better spent on curing diseases,” he said, echoing a similar proposal he made in 2009. Florida Governor Rick Scott has made a similar push, proposing to divert state funds from disciplines like anthropology and psychology “to degrees where people can get jobs,” especially in technology and medicine. Those are fighting words, but they’re also simple-minded.

Social science may sound like a frivolous expenditure to legislative budget hawks, but far from trimming fat, defunding these programs would fundamentally undercut core national interests. Like it or not, social science research informs everything from national security to technology development to healthcare and economic management. For example, we can’t decide which drugs to take, unless their risks and benefits are properly assessed, and we can’t know how much faith to have in a given science or engineering project, unless we know how much to trust expert judgment. Likewise, we can’t fully prepare to stop our adversaries, unless we understand the limits of our own ability to see why others see the world differently. Despite hundreds of billions of taxpayer dollars poured into the global war on terrorism, radicalization against our country’s core interests continues to spread — and social science offers better ways than war to turn the tide.

In support of Rep. Cantor’s push to defund political and social science, a recent article in the Atlantic notes that “money [that] could have gone toward life-saving cancer research” instead went to NSF-sponsored projects that “lack real-world impact” such as “the $750,000 spent studying the ‘sacred values’ involved in cultural conflict.” Perhaps the use of words like “sacred” or “culture” incites such scorn, but as often occurs in many denunciations of social science, scant attention is actually paid to what the science proposes or produces. In fact, the results of this particular project — which I direct — have figured into numerous briefings to the National Security Staff at the White House, Senate and House committees, the Department of State and Britain’s Parliament, and the Israeli Knesset (including the prime minister and defense minister). In addition, the research offices of the Department of Defense have also supported my team’s work, which figures prominently in recent strategy assessments that focus on al Qaeda and broader problems of radicalization and political violence.

Let me try to explain just exactly what it is that we do. My research team conducts laboratory experiments, including brain imaging studies — supported by field work with political leaders, revolutionaries, terrorists, and others — that show sacred values to be core determinants of personal and social identity (“who I am” and “who we are”). Humans process these identities as moral rules, duties, and obligations that defy the utilitarian and instrumental calculations of realpolitik or the marketplace. Simply put, people defending a sacred value will not trade its incarnation (Israel’s settlements, Iran’s nuclear fuel rods, America’s guns) for any number of iPads, or even for peace.

The sacred values of “devoted actors,” it turns out, generate actions independent of calculated risks, costs, and consequences — a direct contradiction of prevailing “rational actor” models of politics and economics, which focus on material interests. Devoted actors, in contrast, act because they sincerely and deeply believe “it’s the right thing to do,” regardless of risks or rewards. Practically, this means that such actors often harness deep and abiding social and political commitments to confront much stronger foes. Think of the American revolutionaries, who were willing to sacrifice “our lives, our fortunes and our sacred honor” in the fight for liberty against the greatest military power of the age — or modern suicide bombers willing to sacrifice everything for their cause.

Sacred values — as when land becomes “Holy Land” — sustain the commitment of revolutionaries and some terrorist groups to resist, and often overcome, more numerous and better-equipped militaries and police that function with measured rewards like better pay or promotion. Our research with political leaders and general populations also shows that sacred values — not political games or economics — underscore intractable conflicts like those between the Israelis and the Palestinians that defy the rational give-and-take of business-like negotiation. Field experiments in Israel, Palestine, Nigeria, and the United States indicate that commitment to such values can motivate and sustain wars beyond reasonable costs and casualties.

So what are the practical implications of these findings? Perhaps most importantly, our research explains why efforts to broker peace that rely on money or other material incentives are doomed when core values clash. In our studies with colleagues in Afghanistan, India, Indonesia, Iran, the Levant, and North Africa, we found that offers of material incentives to compromise on sacred values often backfire, actually increasing anger and violence toward a deal. For example, a 2010 study of attitudes toward Iran’s nuclear program found that most Iranians do not view the country’s nuclear program as sacred. But for about 13 percent of the population, the program has been made sacred through religious rhetoric. This group, which tends to be close to the regime, now believes a nuclear program is bound up with the national identity and with Islam itself. As a result, offering these people material rewards or punishments to abandon the program only increases their anger and support for it. Predictably, new sanctions, or heightened perception of sanctions, generate even more belligerent statements and actions by the regime to increase the pace, industrial capacity, and level of uranium enrichment. Of course, majority discontent with sanctions may yet force the regime to change course, or to double down on repression.

Understanding how this process plays out over time is a key to helping friends, thwarting enemies, and managing conflict. The ultimate goal of such research is to help save lives, resources, and national treasure. And by generating psychological knowledge about how culturally diverse individuals and groups advance values and interests that are potentially compatible or fundamentally antagonistic to our own, it can help keep the nation’s citizens, soldiers, and potential allies out of harm’s way. Our related research on the spiritual and material aspects of environmental disputes between Native American and majority-culture populations in North America and Central America has also revealed surprising but practical ways to reduce conflict and sustainably manage forest commons and wildlife.

The would-be defunders of social science denounce an ivory tower that seems to exist only in their imagination — willfully ignoring evidence-based reasoning and results in order to advance a political agenda. Only $11 million of the NSF’s $7 billion-plus budget goes to political science research. It is exceedingly doubtful that getting rid of the entire NSF political science budget, which is equal to 0.5 percent of the cost of a single B-2 bomber, would really help to produce life-saving cancer research, given that developing and testing even a single drug can cost more than a B-2. Not that we must choose between the two, mind you.

Social science is in fact moving the “hard” sciences forward. Consider the irony: a close collaborator on the “sacred values” project, Robert Axelrod, former president of the American Political Science Association, recently produced a potentially groundbreaking cancer study based on social science modeling of cancer cells as cooperative agents in competition with communities of healthy cells. Independent work by cancer researchers in the United States and abroad has established that the cooperation among tumor cells that Axelrod and colleagues proposed does in fact take place in cell lines derived from human cancers, which has significant implications for the development of effective treatments.

Research from other fields of social science, including social and cognitive psychology and anthropology, continues to have deep implications for an enormous range of human problems: how to better design and navigate transportation and communication networks; how to manage airline crews and cockpits; how to program robots for industry and defense; how to model computer systems and cybersecurity; how to reconfigure emergency medical care and diagnoses; how to build effective responses to economic uncertainty; and how to enhance industrial competitiveness and innovation. For example, perhaps the greatest long-term menace to the security of U.S. industry and defense is cyberwarfare, where the most insidious and hard-to-manage threat may stem not from hardware or software vulnerabilities but from “wetware,” the inclinations and biases of socially interacting human brains — as in just doing a friend a favor (like “click this link” or “can I borrow your flash drive?”). In recognition of that fact, Axelrod has suggested to the White House and Defense Department an “honor code” encouraging individuals to not only maintain cybersecurity themselves, but also not to lapse into doing favors for friends and to report such lapses in others.

Elected officials have the mandate to set priorities for research funding in the national interest. Ever since Abraham Lincoln established the National Academy of Sciences, however, a clear priority has been to allow scientific inquiry fairly free rein — to doubt, challenge, and ultimately change received wisdom if based on solid logic and evidence. What Rep. Cantor and like-minded colleagues seem to be saying is that this is fine, but only in the fields they consider expedient: in technology, medicine, and business. (Though possibly they mean to make an exception for the lucrative social science of polling, which can help to sell almost anything — even terrible ideas like defunding the rest of social science.)

It’s stunning to think that these influential politicians and the people who support them don’t want evidence-based reasoning and research to inform decisions concerning the nature and needs of our society — despite the fact that the vast majority of federal and state legislation deals with social issues, rather than technology or defense. To be sure, there is significant waste and wrongheadedness in the social sciences, as there is in any science (in fact, in any evolutionary process that progresses by trial and error), including, most recently, billions spent on possibly misleading use of mice in cancer research.

But those who would defund social science seriously underestimate the relationship between the wide-ranging freedom of scientific research and its pointed impact, and between theory and practice: Where disciplined imagination sweeps broadly to discover, say, that devoted actors do not respond to material incentives or disincentives (e.g., sanctions) in the same way that rational actors do, or that communities of people and body cells may share deep underlying organizational principles and responses to threats from outside aggressors, such knowledge can have a profound influence on our lives and wellbeing.

Even before they revolted in 1776, the American colonists may have already enjoyed the world’s highest standard of living. But they wanted something different: a free and progressive society, which money couldn’t buy. “Money has never made man happy, nor will it,” gibed Ben Franklin, but “if a man empties his purse into his head no one can take it away from him; an investment in knowledge always pays the best interest.” He founded America’s first learned society “to improve the common stock of knowledge,” which called for inquiry into many practical matters as well as “all philosophical Experiments that Light into the Nature of Things … and multiply the Conveniences or Pleasures of Life.” George Washington, Thomas Jefferson, John Adams, Alexander Hamilton, Thomas Paine, James Madison, and John Marshall all joined Franklin’s society and took part in the political, social, and economic revolution it helped spawn. Like the Founding Fathers, we want our descendants to be able to envision great futures for our country and a better world for all. For that, our children need the broad understanding of how the world works that the social sciences can provide — not just a technical education for well-paying jobs.

The poor weather forecast (Folha de S.Paulo)

15/03/2013 – 03h01

Michel Laub

In an article published in Folha in 2010 (http://goo.gl/fLVDJ), João Moreira Salles discussed the overvaluation of the humanities in Brazil, to the detriment of disciplines such as mathematics, physics, and engineering. One effect of this distortion, I would add, is how little familiarity the public, intellectuals, and the press have with technical and scientific discourse. And, as a consequence, how meekly fallacies in those areas are accepted.

Examples: government propaganda (numbers to suit every taste), fad diets (studies with every sort of methodology and sponsorship), medical treatments (often of debatable cost-benefit), and even soccer statistics spreadsheets (in which a holding midfielder who only plays short passes will post a higher pass-completion rate than a long-ball distributor).

For my part, I decided to test a scientific discourse that is very much present in everyday life: meteorology. Over 28 days this past January, I noted the hits and misses of "Jornal do Tempo" (http://jornaldotempo.uol.com.br).

A layman's exercise, to be sure, and one made knowing that the service in question is not representative of the field in Brazil or worldwide. The home page of "Jornal do Tempo" presents data that are an average, a summary (as in TV forecasts) of more detailed records, which appear on some of its internal pages.

The point is that averages are the public face of meteorology, the discourse, delivered in a confident and cordial tone, that guides us in choosing our clothes in the morning and in deciding whether or not to take an umbrella. And here, just as a little logic is enough to spot holes in statistical work, you do not need to be an expert to say that there is a great deal of imprecision in the field.

The temperatures in my little notebook were almost always within the ranges forecast the day before (23 of 28 occurrences). Compared with the forecast made a week earlier, the rate drops to 17 of 28. And if we set the range forecast seven days in advance next to the one forecast on the day itself, they differ in 28 of 28 cases.

As for atmospheric conditions, which are harder to verify (from my house in Pinheiros, I have no way of knowing whether I was betrayed by a drizzle while I slept, or something like that), there were 15 errors in 28.
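
Purely to illustrate the kind of tally described above, here is a minimal sketch; the observed values, the forecast ranges, and the hit_rate helper are invented for illustration and are not the columnist's actual notebook data:

```python
# Hypothetical forecast-verification tally in the spirit of the column above.
# observed: one temperature reading per day; each forecast is a (low, high) range
# issued one day or seven days in advance. All numbers are made up.

def hit_rate(observed, ranges):
    """Count how many observations fall inside their forecast range."""
    hits = sum(low <= obs <= high for obs, (low, high) in zip(observed, ranges))
    return hits, len(observed)

observed    = [28.0, 30.5, 27.2, 31.0]                     # daily highs, degrees C
day_before  = [(26, 29), (29, 32), (25, 28), (30, 33)]     # 1-day-ahead ranges
week_before = [(24, 27), (29, 32), (27, 30), (30, 33)]     # 7-day-ahead ranges

for label, ranges in [("1-day forecast", day_before), ("7-day forecast", week_before)]:
    hits, total = hit_rate(observed, ranges)
    print(f"{label}: {hits} of {total} observations within the forecast range")

# Days on which the 7-day range was later revised before the day itself:
revised = sum(a != b for a, b in zip(week_before, day_before))
print(f"ranges revised between 7 days out and the day itself: {revised} of {len(observed)}")
```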

These are seemingly unimportant things: one or two extra degrees, a few hours of sun on a day announced as "overcast and rainy, with a few thunderstorms." But there are objective criticisms to be made, if not of the measurement methods, then at least of the way the results are presented.

Pinning a single temperature on a city like São Paulo, with its hills and lowlands, green paradises and concrete infernos spread over roughly 1,500 square kilometers, is imprecise by its very nature. So is the one-sentence forecast that covers both the quick, harmless downpour and the deluge and chaos, depending on the infrastructure of the neighborhood you happen to be in ("sun, alternating with rain in the form of isolated showers").

The question becomes more complex when it goes beyond the territory of error, which is human and acceptable, and beyond meteorology itself, cited here only as a symptom. The authority that emanates from scientific discourse is not limited to influencing academic debates about chemistry or astronomy.

It is also a phenomenon of the human sciences, and its political, economic, and moral ramifications for society as a whole are not negligible. It was racialist theories that justified slavery. It was a doctrine of encouraging technological competition that created nuclear weapons.

In the case of global warming, today's great scientific banner, there is first of all an imperative of common sense: it is smarter to live in harmony with nature, with less carbon emission, deforestation, and consumerist waste. I also imagine that climatological projections are more accurate than, say, those of the presenter who describes conditions across the entire Southeast in ten seconds on "Jornal Nacional." But it is a fact that Time magazine warned of a "new ice age" in 1974. And, ten years later, it ran a famous cover on the now-contested link between heart attacks and egg yolks.

Both pieces reproduced scientific conjectures that were influential at the time. It is advisable to follow the ones that are influential today; after all, they are the closest thing we have to certainty outside religious or ideological fanaticism. It is simply good, as a matter of healthy doubt about any field of knowledge sold as infallible, to remember the poor weather forecast.

When It Rains These Days, Does It Pour? Has the Weather Become Stormier as the Climate Warms? (Science Daily)

Mar. 17, 2013 — There’s little doubt — among scientists at any rate — that the climate has warmed since people began to release massive amounts of greenhouse gases to the atmosphere during the Industrial Revolution.

But ask a scientist if the weather is getting stormier as the climate warms and you’re likely to get a careful response that won’t make for a good quote.

There’s a reason for that.

“Although many people have speculated that the weather will get stormier as the climate warms, nobody has done the quantitative analysis needed to show this is indeed happening,” says Jonathan Katz, PhD, professor of physics at Washington University in St. Louis.

In the March 17 online version of Nature Climate Change, Katz and Thomas Muschinski, a senior in physics who came to Katz looking for an undergraduate thesis project, describe the results of their analysis of more than 70 years of hourly precipitation data from 13 U.S. sites looking for quantitative evidence of increased storminess.

They found a significant, steady increase in storminess on the Olympic Peninsula in Washington State, which famously suffers from more or less continuous drizzle, a calm climate that lets storm peaks emerge clearly.

“Other sites have always been stormy,” Katz says, “so an increase such as we saw in the Olympic Peninsula data would not have been detectable in their data.”

They may also be getting stormier, he says, but so far they’re doing it under cover.

The difference between wetter and stormier

“We didn’t want to know whether the rainfall had increased or decreased,” Katz says, “but rather whether it was concentrated in violent storm events.”

Studies that look at the largest one-day or few-day precipitation totals recorded in a year, or the number of days in which total precipitation is above a threshold, measure whether locations are getting wetter, not whether they’re getting stormier, says Katz.

To get the statistical power to pick up brief downpours rather than total precipitation, Muschinski and Katz needed to find a large, fine-grained dataset.

“So we poked around,” Katz says, “and we found what we were looking for in the National Oceanic and Atmospheric Administration databases.”

NOAA has hourly precipitation data going back to 1940 or even further for many locations in the United States. Muschinski and Katz chose 13 sites that had long runs of data and represented a broad range of climates, from desert to rain forest.

They then tested the hypothesis that storms are becoming more frequent and intense by taking different measurements of the “shape” formed by the data points for each site.

Measuring these “moments,” as they’re called, is a statistical test commonly used in science, says Katz, but one that hasn’t been applied to this problem before.
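
As a rough sketch of the general idea only (not the paper's actual method, sites, or data), one can compare the normalized higher moments of two made-up hourly rainfall series that share the same yearly total; the precip_moments helper and all numbers below are assumptions:

```python
import numpy as np

def precip_moments(hourly_precip):
    """Mean, skewness, and kurtosis of an hourly precipitation series.

    If the same amount of rain falls in brief, intense bursts rather than
    steadily, the skewness and kurtosis of the hourly distribution rise
    even though the total (and the mean) stay roughly the same.
    """
    x = np.asarray(hourly_precip, dtype=float)
    mean, std = x.mean(), x.std()
    skew = ((x - mean) ** 3).mean() / std ** 3
    kurt = ((x - mean) ** 4).mean() / std ** 4
    return mean, skew, kurt

rng = np.random.default_rng(0)
hours = 24 * 365

steady = rng.exponential(0.1, hours)      # light rain almost every hour
bursty = np.zeros(hours)                  # similar yearly total, rare downpours
bursty[rng.choice(hours, 24, replace=False)] = rng.exponential(36.5, 24)

print("steady:", precip_moments(steady))
print("bursty:", precip_moments(bursty))
```

With a similar annual total, the bursty series shows much larger skewness and kurtosis; a trend in such moments over decades of records is the kind of signal this sort of analysis would look for.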

“We found a significant steady increase in stormy activity on the Olympic Peninsula,” Katz says. “We know that is real.”

“We found no evidence for an increase in storminess at the other 12 sites,” he said, “but because their weather is intrinsically stormier, it would be more difficult to detect a trend like that at the Olympic Peninsula even if it were occurring.”

The next step, Katz says, is to look at a much larger number of sites that might be regionally averaged to reveal trends too slow to be significant for one site.

“There are larger databases,” he says, “but they’re also harder to sift through. Any one site might have half a million hourly measurements over the period we’re looking at, and to get good results, we have to devise an algorithm tuned to the database to filter out spurious or corrupted data.”
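
The screening Katz alludes to is not spelled out here; as a hedged sketch only, such a filter might look something like the following, where the screen_hourly helper, the missing-data code, and the plausibility threshold are assumptions for illustration rather than NOAA's actual quality-control rules:

```python
import numpy as np

def screen_hourly(values, max_plausible_mm=150.0):
    """Drop obviously spurious hourly precipitation records.

    Returns the cleaned values plus a boolean mask so the caller can see
    how much was discarded. The threshold and missing-data handling are
    illustrative only.
    """
    v = np.asarray(values, dtype=float)
    ok = np.isfinite(v)            # drop NaN / non-numeric entries
    ok &= v >= 0.0                 # negative precipitation is impossible
    ok &= v <= max_plausible_mm    # implausibly large one-hour totals
    return v[ok], ok

raw = np.array([0.0, 2.5, -999.0, 0.3, np.nan, 400.0, 1.2])  # -999 used here as a missing-data code
clean, mask = screen_hourly(raw)
print(clean, f"(kept {mask.sum()} of {mask.size} records)")
```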

You could call that a rainy-day project.

Journal Reference:

  1. T. Muschinski, J. I. Katz. Trends in hourly rainfall statistics in the United States under a warming climate. Nature Climate Change, 2013; DOI: 10.1038/nclimate1828

Bombshell: Recent Warming Is ‘Amazing And Atypical’ And Poised To Destroy Stable Climate That Enabled Civilization (Climate Progress)

By Joe Romm on Mar 8, 2013 at 12:44 pm

New Science Study Confirms ‘Hockey Stick’: The Rate Of Warming Since 1900 Is 50 Times Greater Than The Rate Of Cooling In Previous 5000 Years

Temperature change over the past 11,300 years (in blue, via Science, 2013) plus projected warming this century on humanity’s current emissions path (in red, via recent literature).

A stable climate enabled the development of modern civilization, global agriculture, and a world that could sustain a vast population. Now, the most comprehensive “Reconstruction of Regional and Global Temperature for the Past 11,300 Years” ever done reveals just how stable the climate has been — and just how destabilizing manmade carbon pollution has been and will continue to be unless we dramatically reverse emissions trends.

Researchers at Oregon State University (OSU) and Harvard University published their findings today in the journal Science. Their funder, the National Science Foundation, explains in a news release:

With data from 73 ice and sediment core monitoring sites around the world, scientists have reconstructed Earth’s temperature history back to the end of the last Ice Age.

The analysis reveals that the planet today is warmer than it’s been during 70 to 80 percent of the last 11,300 years.

… during the last 5,000 years, the Earth on average cooled about 1.3 degrees Fahrenheit–until the last 100 years, when it warmed about 1.3 degrees F.

In short, thanks primarily to carbon pollution, the temperature is changing 50 times faster than it did during the time modern civilization and agriculture developed, a time when humans figured out where the climate conditions — and rivers and sea levels — were most suited for living and farming. We are headed for 7 to 11°F warming this century on our current emissions path — increasing the rate of change 5-fold yet again.
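
The “50 times faster” figure follows directly from the numbers quoted above; a quick arithmetic check (the rounding and variable names are mine):

```python
# Rate comparison implied by the NSF release quoted above.
cooling_F, cooling_years = 1.3, 5000   # ~1.3 F of cooling over roughly 5,000 years
warming_F, warming_years = 1.3, 100    # ~1.3 F of warming over the last ~100 years

cooling_rate = cooling_F / cooling_years   # degrees F per year
warming_rate = warming_F / warming_years   # degrees F per year

print(f"recent warming is ~{warming_rate / cooling_rate:.0f}x faster than the prior cooling")
# -> recent warming is ~50x faster than the prior cooling
```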

By the second half of this century we will have some 9 billion people, a large fraction of whom will be living in places that simply can’t sustain them —  either because it is too hot and/or dry, the land is no longer arable, their glacially fed rivers have dried up, or the seas have risen too much.

We could keep that warming close to 4°F — and avoid the worst consequences — but only with immediate action.

This research vindicates the work of Michael Mann and others showing that recent warming is unprecedented in magnitude, speed, and cause during the past 2000 years — the so-called Hockey Stick — and in fact extends that back to at least 4000 years ago. I should say “vindicates for the umpteenth time” (see “Yet More Studies Back Hockey Stick“).

Lead author Shaun Marcott of OSU told NPR that the paleoclimate data reveal just how unprecedented our current warming is: “It’s really the rates of change here that’s amazing and atypical.” He noted to the AP, “Even in the ice age the global temperature never changed this quickly.”

And the rate of warming is what matters most, as Mann noted in an email to me:

This is an important paper. The key take home conclusion is that the rate and magnitude of recent global warmth appears unprecedented for *at least* the past 4K and the rate *at least* the past 11K. We know that there were periods in the past that were warmer than today, for example the early Cretaceous period 100 million yr ago. The real issue, from a climate change impacts point of view, is the rate of change—because that’s what challenges our adaptive capacity. And this paper suggests that the current rate has no precedent as far back as we can go w/ any confidence—11 kyr arguably, based on this study.

Katharine Hayhoe, an atmospheric scientist at Texas Tech University, told the AP:

We have, through human emissions of carbon dioxide and other heat-trapping gases, indefinitely delayed the onset of the next ice age and are now heading into an unknown future where humans control the thermostat of the planet.

Unfortunately, we have decided to change the setting on the thermostat from “Very Stable, Don’t Adjust” to “Hell and High Water.” It is the single most self-destructive act humanity has ever undertaken, but there is still time to aggressively slash emissions and aim for a setting of “Dangerous, But Probably Not Fatal.”

How to Predict the Future of Technology (Science Daily)

Jan. 25, 2013 — The bread and butter of investing for Silicon Valley tech companies is stale. Instead, a new method of predicting the evolution of technology could save tech giants millions in research and development or new product development — and help analysts and venture capitalists determine which companies are on the right track.

The high-tech industry has long used Moore’s Law as a method to predict the growth of PC memory. Moore’s Law states that the number of transistors on a chip doubles every 18 months (initially every year). A paper by Gareth James and Gerard Tellis, professors at the USC Marshall School of Business, and their co-authors Ashish Sood at Emory and Ji Zhu at the University of Michigan, concludes that Moore’s Law does not apply for most industries, including the PC industry.

High-tech companies traditionally use Moore’s Law and other similar heuristics to predict the path of evolution of competing technologies and to decide where to funnel millions into research and development or new product development. The paper’s researchers claim that these models are outdated and inaccurate.

The paper offers a new model, Step and Wait (SAW), which more accurately tracks the path of technological evolution in six markets that the authors tested. According to the researchers, Moore’s Law and other models such as Kryder’s Law and Gompertz Law predict a smooth increasing exponential curve for the improvement in performance of various technologies. In contrast, the authors found that the performance of most technologies proceeds in steps (or jumps) of big improvements interspersed with waits (or periods of no growth in performance).
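
To make the contrast concrete, here is a toy comparison of a smooth exponential path with a step-and-wait path; the jump years and step sizes below are invented for illustration and are not the averages the authors report:

```python
import numpy as np

years = np.arange(12)

# Smooth exponential improvement (Moore's-law-style): performance doubles every 2 years.
smooth = 2.0 ** (years / 2.0)

# Step-and-wait (SAW-style): performance is flat during "waits", then jumps by a "step".
saw = np.empty(len(years))
level = 1.0
for i, y in enumerate(years):
    if y in (3, 7, 10):     # hypothetical years in which a performance jump occurs
        level *= 3.0        # hypothetical step size
    saw[i] = level

for y, s, w in zip(years, smooth, saw):
    print(f"year {y:2d}: smooth {s:8.2f}   step-and-wait {w:8.2f}")
```

Over a decade the two paths can end up at similar levels; what differs is when the gains arrive, which is the timing question the authors argue strategists need to answer.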

The sweet spot is in knowing which technology to back based on predicting when a new technology is going to have a jump in performance.

“We looked at the forest rather than the trees and see ‘steps’ and ‘waits’ across a variety of technologies,” Tellis said. While no one law applies to every market, Tellis and his co-authors looked at 26 technologies in six markets from lighting to automobile batteries, and found that the SAW model worked in all six, in contrast to several other competing models.

What Tellis and his colleagues did come up with are average performance improvements for the industry in terms of “steps” and wait times. The challenge for strategists is to invest in various technologies to beat these averages.

Tellis said that tablet and mobile phone manufacturers can leverage this data. “Any manager has first to break down his or her products into components, find components for each technology, and then predict the future path of those technologies. For example, the mobile phone consists of three important technological components: memory, display, and CPU, the first two of which the authors analyzed. Similarly, tablet manufacturers could rely on the figures for display and memory technologies.”

An example of how the SAW model could have saved a company from decline is Sony’s investment in TVs. Sony kept investing in cathode ray tube technology (CRT) even after liquid crystal display technology (LCD) first crossed CRT in performance in 1996. Instead of considering LCD, Sony introduced the FD Trinitron/WEGA series, a flat version of the CRT. CRT out-performed LCD for a few years, but ultimately lost decisively to LCD in 2001. In contrast, by backing LCD, Samsung grew to be the world’s largest manufacturer of the better performing LCD. The former market leader, Sony, had to seek a joint venture with Samsung in 2006 to manufacture LCDs.

Having the SAW model at the ready might have changed their course. “Prediction of the next step size and wait time using SAW could have helped Sony’s managers make a timely investment in LCD technology,” according to the study.

Journal Reference:

  1. Sood, Ashish, James, Gareth, Tellis, Gerard J., and Zhu, Ji. Predicting the Path of Technological Innovation: SAW Versus Moore, Bass, Gompertz, and Kryder. Marketing Science, July 22, 2012 [link]

A Scientist’s Misguided Crusade (N.Y.Times)

OP-ED COLUMNIST

By JOE NOCERA

Published: March 4, 2013 

Last Friday, at 3:40 p.m., the State Department released its “Draft Supplemental Environmental Impact Statement” for the highly contentious Keystone XL pipeline, which Canada hopes to build to move its tar sands oil to refineries in the United States. In effect, the statement said there were no environmental impediments that would prevent President Obama from approving the pipeline.

Two hours and 20 minutes later, I received a blast e-mail containing a statement by James Hansen, the head of the Goddard Institute for Space Studies at NASA — i.e., NASA’s chief climate scientist. “Keystone XL, if the public were to allow our well-oiled government to shepherd it into existence, would be the first step down the wrong road, perpetuating our addiction to dirty fossil fuels, moving to ever dirtier ones,” it began. After claiming that the carbon in the tar sands “exceeds that in all oil burned in human history,” Hansen’s statement concluded: “The public must demand that the government begin serving the public’s interest, not the fossil fuel industry’s interest.”

As a private citizen, Hansen, 71, has the same First Amendment rights as everyone else. He can publicly oppose the Keystone XL pipeline if he so chooses, just as he can be as politically active as he wants to be in the anti-Keystone movement, and even be arrested during protests, something he managed to do recently in front of the White House.

But the blast e-mail didn’t come from James Hansen, private citizen. It specifically identified Hansen as the head of the Goddard Institute, and went on to describe him as someone who “has drawn attention to the danger of passing climate tipping points, producing irreversible climate impacts that would yield a different planet from the one on which civilization developed.” All of which made me wonder whether such apocalyptic pronouncements were the sort of statements a government scientist should be making — and whether they were really helping the cause of reversing climate change.

Let’s acknowledge right here that the morphing of scientists into activists is nothing new. Linus Pauling, the great chemist, was a peace activist who pushed hard for a nuclear test ban treaty. Albert Einstein also became a public opponent of nuclear weapons.

It is also important to acknowledge that Hansen has been a crucial figure in developing modern climate science. In 2009, Eileen Claussen, now the president of the Center for Climate and Energy Solutions, told The New Yorker that Hansen was a “heroic” scientist who “faced all kinds of pressures politically.” Today, his body of work is one of the foundations upon which much climate science is built.

Yet what people hear from Hansen today is not so much his science but his broad, unscientific views on, say, the evils of oil companies. In 2008, he wrote a paper, the thesis of which was that runaway climate change would occur when carbon in the atmosphere reached 350 parts per million — a point it had already exceeded — unless it were quickly reduced. There are many climate change experts who disagree with this judgment — who believe that the 350 number is arbitrary and even meaningless. Yet an entire movement, 350.org, has been built around Hansen’s line in the sand.

Meanwhile, he has a department to run. For a midlevel scientist at the Goddard Institute, what signal is Hansen sending when he takes the day off to get arrested at the White House? Do his colleagues feel unfettered in their own work? There is, in fact, enormous resentment toward Hansen inside NASA, where many officials feel that their solid, analytical work on climate science is being lost in what many of them describe as “the Hansen sideshow.” His activism is not really doing any favors for the science his own subordinates are producing.

Finally, and most important, Hansen has placed all his credibility on one battle: the fight to persuade President Obama to block the Keystone XL pipeline. It is the wrong place for him to make a stand. Even in the unlikely event the pipeline is stopped, the tar sands oil will still be extracted and shipped. It might be harder to do without a pipeline, but it is already happening. And in the grand scheme, as I’ve written before, the tar sands oil is not a game changer. The oil we import from Venezuela today is dirtier than that from the tar sands. Not that the anti-pipeline activists seem to care.

What is particularly depressing is that Hansen has some genuinely important ideas, starting with placing a graduated carbon tax on fossil fuels. Such a tax would undoubtedly do far more to reduce carbon emissions and save the planet than stopping the Keystone XL pipeline.

A carbon tax might be worth getting arrested over. But by allowing himself to be distracted by Keystone, Hansen is hurting the very cause he claims to care so much about.

Big military guy more scared of climate change than enemy guns (Grist)

By Susie Cagle

11 Mar 2013 6:13 PM

Navy Admiral Samuel J. Locklear III, chief of U.S. Pacific Command, doesn’t look like your usual proponent of climate action. Spencer Ackerman writes at Wired that Locklear “is no smelly hippie,” but the guy does believe there will be terrible security threats on a warming planet, which might make him a smelly hippie in the eyes of many American military boosters.

Everyone wants him to be worried about North Korean nukes and Chinese missiles, but in an interview with The Boston Globe, Locklear said that societal upheaval due to climate change “is probably the most likely thing that is going to happen … that will cripple the security environment, probably more likely than the other scenarios we all often talk about.’’

“People are surprised sometimes,” he added, describing the reaction to his assessment. “You have the real potential here in the not-too-distant future of nations displaced by rising sea level. Certainly weather patterns are more severe than they have been in the past. We are on super typhoon 27 or 28 this year in the Western Pacific. The average is about 17.”

Locklear said his Hawaii-based headquarters — which is … responsible for operations from California to India — is working with Asian nations to stockpile supplies in strategic locations and planning a major exercise for May with nearly two dozen countries to practice the “what-ifs.”

Locklear isn’t alone in his climate fears. A recent article by Julia Whitty takes an in-depth look at what the military is doing to deal with climate change. A 2008 report by U.S. intelligence agencies warned about national security challenges posed by global warming, as have later reports from the Department of Defense and the Joint Chiefs of Staff. New Defense Secretary Chuck Hagel understands the threat, too. People may be surprised sometimes, Adm. Locklear, but they really shouldn’t be!

Will not-a-dirty-hippie Locklear’s words help to further mainstream the idea that climate change is a serious security problem? And what all has the good admiral got planned for this emergency sea-rising drill in May?

Susie Cagle writes and draws news for Grist. She also writes and draws tweets for Twitter.

Earth nears its warmest temperatures in 11,000 years; melting in Canada may be irreversible (Folha de São Paulo)

JC e-mail 4680, March 8, 2013.

Salvador Nogueira

The study gathered data from 73 locations around the world to estimate global (and local) temperature during the geological period known as the Holocene

A new study led by researchers at Oregon State University and Harvard University, both in the U.S., reconstructed Earth's average temperature over the last 11,300 years in order to compare it with current levels.

The good news: Earth today is cooler than it was at the warmest point of that period. The bad news: if the climatologists' models are right, we will set a new heat record by the end of the century.

The work, published in the journal Science, gathered data from 73 locations around the world to estimate global (and local) temperature during the geological period known as the Holocene, which began at the end of the last ice age, 11,000 years ago.

After consolidating all the information, most of it from fossil samples in ocean sediments, into a single picture, and using mathematical techniques to fill the "gaps" in the various sources used to estimate past temperatures, the scientists were able to recreate a "short history of Earth's climate variation."

It is called short because the results do not resolve variation occurring over just a few years. It is as if each data point represented the temperature over a 120-year period.

THE HISTORY

The data confirm a long-standing suspicion among scientists: that Earth went through a warming period that began about 11,000 years ago. Over 1,500 years the planet warmed by about 0.6°C and then stabilized, remaining at that level for roughly 5,000 years.

Then, 5,500 years ago, a new cooling process began, ending 200 years ago with what became known as the "Little Ice Age." The planet became about 0.7°C cooler.

Enter rapid industrialization and the 20th century. The planet began to warm again. It has not yet broken the temperature record set early in the Holocene, but it is already warmer than in 75% of the last 11,000 years.

The study thus confirms that Earth's temperature has been rising in recent times, and shows that the rise is much faster than previously thought.

"This research shows that we have already experienced almost the same range of temperature change since the beginning of the Industrial Revolution as was seen over the previous 11,000 years of Earth's history, but that this change happened much faster," says Candace Major, program director in the Division of Ocean Sciences at the U.S. National Science Foundation, which funded the study.

On the other hand, the study's low temporal resolution (effects spanning only a few years cannot be distinguished) makes comparison with the current warming difficult.

For today's climate change to register on the timescale analyzed by the 11,000-year reconstruction, it would need to continue through the next century. According to the models of the UN's IPCC (Intergovernmental Panel on Climate Change), that is exactly what will happen.

Uncertainties remain about the magnitude of the phenomenon, however. Even under the most optimistic estimates, by the time we reach 2100, if nothing is done, we will probably be living through the warmest period of the last 11,000 years.

* * *

JC e-mail 4680, March 8, 2013.

via Reuters

Canada's glaciers, the third-largest store of ice after Antarctica and Greenland, may be undergoing an irreversible melt that is expected to raise sea levels, scientists said.

About 20% of the glaciers in northern Canada could disappear by the end of the 21st century, a melt that could add 3.5 cm to sea level.

According to an article in the journal Geophysical Research Letters, the melting of white glaciers would expose dark tundra, which tends to absorb more heat and accelerate the melting.

The UN estimates a sea level rise of between 18 cm and 59 cm this century, or more if the ice sheets of Antarctica and Greenland begin to melt faster.

The projected loss of 20% of Canada's ice volume was based on a scenario with an average temperature increase of 3°C this century, and of 8°C in the Canadian Arctic, within the range of UN projections.

Chemicals, Risk And The Public (Chicago Tribune)

April 29, 1989|By Earon S. Davis

The public is increasingly uncomfortable with both the processes and the results of government and industry decision-making about chemical hazards.

Decisions that expose people to uncertain and potentially catastrophic risks from chemicals seem to be made without adequate scientific information and without an appreciation of what makes a risk acceptable to the public.

The history of environmental and occupational health provides myriad examples in which entire industries have acted in complete disregard of public health risks and in which government failed to act until well after disasters were apparent.

It is not necessary to name each chemical, each debacle, in which the public was once told the risks were insignificant, but these include DDT, asbestos, Kepone, tobacco smoke, dioxin, PCBs, vinyl chloride, flame retardants in children's sleepwear, Chlordane, Alar and urea formaldehyde foam. These chemicals were banned or severely restricted, and virtually no chemical has been found to be safer than originally claimed by industry and government.

It is no wonder that government and industry efforts to characterize so many uncertain risks as "insignificant" are met with great skepticism. In a pluralistic, democratic society, acceptance of uncertainty is a complex matter that requires far more than statistical models. Depending upon cultural and ethical factors, some risks are simply more acceptable than others.

When it comes to chemical risks to human health, many factors combine to place a relatively higher burden on government and industry to show social benefits. Not the least of these is the unsatisfactory track record of industry and its regulatory agencies.

Equally important are the tremendous gaps in scientific knowledge about chemically induced health effects, as well as the specific characteristics of these risks.

Chemical risks differ from many other kinds because, not only are the victims struck largely at random, but there is usually no way to know which illnesses are eventually caused by a chemical. There are so many poorly understood illnesses and so many chemical exposures which take many years to develop that most chemical victims will not even be identified, let alone properly compensated.

To the public, this difference is significant, but to industry it poses few problems. Rather, it presents the opportunity to create risks and yet remain free of liability for the bulk of the costs imposed on society, except in the rare instance where a chemical produces a disease which does not otherwise appear in humans.

Statutes of limitations, corporate litigiousness, inability or unwillingness of physicians to testify on causation and the sheer passage of time pose major obstacles to chemical victims attempting to receive compensation.

The delayed effects of chemical exposures also make it impossible to fully document the risks until decades after the Pandora's box has been opened. The public is increasingly afraid that regulators are using the lack of immediately identified victims as evidence of chemical safety, which it simply is not.

Chemical risks are different because they strike people who have given no consent, who may be completely unaware of danger and who may not even have been born at the time of the decision that led to their exposure. They are unusual, too, because we don't know enough about the causes of cancer, birth defects and neurological and immunologic disorders to understand the real risks posed by most chemicals.

The National Academy of Sciences has found that most chemicals in commerce have not even been tested for many of these potential health effects. In fact, there are growing concerns of new neurologic and chemical sensitivity disorders of which almost nothing is known.

We are exposed to so many chemicals that there is literally no way of estimating the cumulative risks. Many chemicals also present synergistic effects in which exposure to two or more substances produces risks many times greater than the simple sum of the risks. Society has begun to see that the thousands of acceptable risks could add up to one unacceptable generic chemical danger.

The major justification for chemical risks, given all of the unknowns and uncertainties, is an overriding benefit to society. One might justify taking a one-in-a-million risk for a product that would make the nation more economically competitive or prevent many serious cases of illness. But such a risk may not be acceptable if it is to make plastic seats last a little longer, to make laundry 5 percent brighter or lawns a bit greener, or to allow apples to ripen more uniformly.

These are some of the reasons the public is unwilling to accept many of the risks being forced upon it by government and industry. There is no "mass hysteria" or "chemophobia." There is growing awareness of the preciousness of human life, the banal nature of much of what industry is producing and the gross inadequacy of efforts to protect the public from long-term chemical hazards.

If the public is to regain confidence in the risk management process, industry and government must open up their own decision-making to public inquiry and input. The specific hazards and benefits of any chemical product or byproduct should be explained in plain language. Uncertainties that cannot be quantified must also be explained and given full consideration. And the process must include ethical and moral considerations such as those addressed above. These are issues to be decided by the public, not bureaucrats or corporate interests.

For industry and government to regain public support, they must stop blaming "ignorance" and overzealous public interest groups for the concern of the public and the media.

Rather, they should begin by better appreciating the tremendous responsibility they bear to our current and future generations, and by paying more attention to the real bottom line in our democracy: the honest, rational concerns of the average American taxpayer.

Why Are Environmentalists Taking Anti-Science Positions? (Yale e360)

22 OCT 2012

On issues ranging from genetically modified crops to nuclear power, environmentalists are increasingly refusing to listen to scientific arguments that challenge standard green positions. This approach risks weakening the environmental movement and empowering climate contrarians.

By Fred Pearce

From Rachel Carson’s Silent Spring to James Hansen’s modern-day tales of climate apocalypse, environmentalists have long looked to good science and good scientists and embraced their findings. Often we have had to run hard to keep up with the crescendo of warnings coming out of academia about the perils facing the world. A generation ago, biologist Paul Ehrlich’s The Population Bomb and systems analysts Dennis and Donella Meadows’ The Limits to Growth shocked us with their stark visions of where the world was headed. No wide-eyed greenie had predicted the opening of an ozone hole before the pipe-smoking boffins of the British Antarctic Survey spotted it when looking skyward back in 1985. On issues ranging from ocean acidification and tipping points in the Arctic to the dangers of nanotechnology, the scientists have always gotten there first — and the environmentalists have followed.

And yet, recently, the environment movement seems to have been turning up on the wrong side of the scientific argument. We have been making claims that simply do not stand up. We are accused of being anti-science — and not without reason. A few, even close friends, have begun to compare this casual contempt for science with the tactics of climate contrarians.

That should hurt.

Three current issues suggest that the risks of myopic adherence to ideology over rational debate are real: genetically modified (GM) crops, nuclear power, and shale gas development. The conventional green position is that we should be opposed to all three. Yet the voices of those with genuine environmental credentials, but who take a different view, are being drowned out by sometimes abusive and irrational argument.

In each instance, the issue is not so much which side environmentalists should be on, but rather the mind-set behind those positions and the tactics adopted to make the case. The wider political danger is that by taking anti-scientific positions, environmentalists end up helping the anti-environmental sirens of the new right.

Most major environmental groups — from Friends of the Earth to Greenpeace to the Sierra Club — want a ban or moratorium on GM crops, especially for food. They fear the toxicity of these “Frankenfoods,” are concerned the introduced genes will pollute wild strains of the crops, and worry that GM seeds are a weapon in the takeover of the world’s food supply by agribusiness.

For myself, I am deeply concerned about the power of business over the world’s seeds and food supply. But GM crops are an insignificant part of that control, which is based on money and control of trading networks. Clearly there are issues about gene pollution, though research suggesting there is a problem is still very thin. Let’s do the research, rather than trash the test fields, which has been the default response of groups such as Greenpeace, particularly in my home country of Britain.

As for the Frankenfoods argument, the evidence is just not there. As the British former campaigner against GMs, Mark Lynas, points out: “Hundreds of millions of people have eaten GM-originated food without a single substantiated case of any harm done whatsoever.”

The most recent claim, published in September in the journal Food and Chemical Toxicology, that GM corn can produce tumors in rats, has been attacked as flawed in execution and conclusion by a wide range of experts with no axe to grind. In any event, the controversial study was primarily about the potential impact of Roundup, a herbicide widely used with GM corn, and not the GM technology itself.

Nonetheless, the reaction of some in the environment community to the reasoned critical responses of scientists to the paper has been to claim a global conspiracy among researchers to hide the terrible truth. One scientist was dismissed on the Web site GM Watch for being “a longtime member of the European Food Safety Authority, i.e. the very body that approved the GM corn in question.” That’s like dismissing the findings of a climate scientist because he sits on the Intergovernmental Panel on Climate Change — the “very body” that warned us about climate change. See what I mean about aping the worst and most hysterical tactics of the climate contrarians?

Stewart Brand wrote in his 2009 book Whole Earth Discipline: “I dare say the environmental movement has done more harm with its opposition to genetic engineering than any other thing we’ve been wrong about.” He will see nods of assent from members of a nascent “green genes” movement — among them environmentalist scientists, such as Pamela Ronald of the University of California at Davis — who say GM crops can advance the cause of sustainable agriculture by improving resilience to changing climate and reducing applications of agrochemicals.

Yet such people are routinely condemned as apologists for an industrial conspiracy to poison the world. Thus, Greenpeace in East Asia claims that children eating nutrient-fortified GM “golden rice” are being used as “guinea pigs.” And its UK Web site’s introduction to its global campaigns says, “The introduction of genetically modified food and crops has been a disaster, posing a serious threat to biodiversity and our own health.” Where, ask their critics, is the evidence for such claims?

The problem is the same in the energy debate. Many environmentalists who argue, as I do, that climate change is probably the big overarching issue facing humanity in the 21st century, nonetheless often refuse to recognize that nuclear power could have a role in saving us from the worst.

Nuclear power is the only large-scale source of low-carbon electricity that is fully developed and ready for major expansion.

Yes, we need to expand renewables as fast as we can. Yes, we need to reduce further the already small risks of nuclear accidents and of leakage of fissile material into weapons manufacturing. But as George Monbiot, Britain’s most prominent environment columnist, puts it: “To abandon our primary current source of low carbon energy during a climate change emergency is madness.”

Monbiot attacks the gratuitous misrepresentation of the risks of radiation from nuclear plants. It is widely suggested, on the basis of a thoroughly discredited piece of Russian head-counting, that up to a million people were killed by the Chernobyl nuclear accident in 1986. In fact, it is far from clear that many people at all — beyond the 28 workers who received fatal doses while trying to douse the flames at the stricken reactor — actually died from Chernobyl radiation. Certainly, the death toll was nothing remotely on the scale claimed.

“We have a moral duty,” Monbiot says, “not to spread unnecessary and unfounded fears. If we persuade people that they or their children are likely to suffer from horrible and dangerous health problems, and if these fears are baseless, we cause great distress and anxiety, needlessly damaging the quality of people’s lives.”

Many people have a visceral fear of nuclear power and its invisible radiation. But for environmentalists to fan the flames — especially when it gets in the way of fighting a far more real threat, from climate change — seems reckless, anti-scientific and deeply damaging to the world’s climate future.

One sure result of Germany deciding to abandon nuclear power in the wake of last year’s Fukushima nuclear accident (calamitous, but any death toll will be tiny compared to that from the tsunami that caused it) will be rising carbon emissions from a revived coal industry. By one estimate, the end of nuclear power in Germany will result in an extra 300 million tons of carbon dioxide reaching the atmosphere between now and 2020 — more than the annual emissions of Italy and Spain combined.

Last, let’s look at the latest source of green angst: shale gas and the drilling technique of hydraulic fracturing, or fracking, used to extract it. There are probably good reasons for not developing shale gas in many places. Its extraction can pollute water and cause minor earth tremors, for instance. But at root this is an argument about carbon — a genuinely double-edged issue that needs debating. For there is a good environmental case to be made that shale gas, like nuclear energy, can be part of the solution to climate change. That case should be heard and not shouted down.

Opponents of shale gas rightly say it is a carbon-based fossil fuel. But it is a much less dangerous fossil fuel than coal. Carbon emissions from burning natural gas are roughly half those from burning coal. A switch from coal to shale gas is the main reason why, in 2011, U.S. CO2 emissions fell by almost 2 percent.

We cannot ignore that. With coal’s share of the world’s energy supply rising from 25 to 30 percent in the past half decade, a good argument can be made that a dash to exploit cheap shale gas and undercut this surge in coal would do more to cut carbon emissions than almost anything else. The noted environmental economist Dieter Helm of the University of Oxford argues just this in a new book, The Carbon Crunch, out this month.

But this is an unpopular argument. Carl Pope, executive director of the Sierra Club, was pilloried by activists for making the case that gas could be a “bridge fuel” to a low-carbon future. And when he stepped down, his successor condemned him for taking cash from the gas industry to fund the Sierra Club’s Beyond Coal campaign. Pope was probably wrong to take donations of that type, though some environment groups do such things all the time. But his real crime to those in the green movement seems to have been to side with the gas lobby at all.

Many environmentalists are imbued with a sense of their own exceptionalism and original virtue. But we have been dangerously wrong before. When Rachel Carson’s sound case against the mass application of DDT as an agricultural pesticide morphed into blanket opposition to much smaller indoor applications to fight malaria, it arguably resulted in millions of deaths as the diseases resurged.

And more recently, remember the confusion over biofuels? They were a new green energy source we could all support. I remember, when the biofuels craze began about 2005, I reported on a few voices urging caution. They warned that the huge land take of crops like corn and sugar cane for biofuels might threaten food supplies; that the crops would add to the destruction of rainforests; and that the carbon gains were often small to non-existent. But Friends of the Earth and others trashed them as traitors to the cause of green energy.

Well, today most greens are against most biofuels. Not least Friends of the Earth, which calls them a “big green con.” In fact, we may have swung too far in the other direction, undermining research into second-generation biofuels that could be both land- and carbon-efficient.

We don’t have to be slaves to science. There is plenty of room for raising questions about ethics and priorities that challenge the world view of the average lab grunt. And we should blow the whistle on bad science. But to indulge in hysterical attacks on any new technology that does not excite our prejudices, or to accuse genuine researchers of being part of a global conspiracy, is dishonest and self-defeating.

We environmentalists should learn to be more humble about our policy prescriptions, more willing to hear competing arguments, and less keen to engage in hectoring and bullying.

Forecasting Climate With A Chance Of Backlash (NPR)

by JENNIFER LUDDEN

February 19, 2013 3:14 AM

When it comes to climate change, Americans place great trust in their local TV weathercaster, which has led climate experts to see huge potential for public education.

The only problem? Polls show most weather presenters don’t know much about climate science, and many who do are fearful of talking about something so polarizing.

In fact, if you have heard a weathercaster speak on climate change, it’s likely been to deny it. John Coleman in San Diego and Anthony Watts of Watts Up With That? are among a group of vocal die-hards, cranking out blog posts and videos countering climate science. But even many meteorologists who don’t think it’s all a hoax still profoundly distrust climate models.

“They get reminded each and every day anytime their models don’t prove to be correct,” says Ed Maibach, who directs the Center for Climate Change Communication at George Mason University, and has carried out several surveys of TV weathercasters. “For them, the whole notion of projecting what the climate will be 30, 50, a hundred years from now, they’ve got a fairly high degree of skepticism.”

And yet, Maibach has found that many meteorologists would like to learn more and would like to educate their viewers. A few years back, he hatched a plan and found a willing partner in an unlikely place.

Prepared For Backlash

“I loved it. That’s exactly what I wanted to do,” says Jim Gandy, chief meteorologist at WLTX in Columbia, S.C.

Gandy had actually begun reading up on climate change several years earlier, when — to his surprise — a couple of geology professors at a party asked whether he thought global warming was real. Gandy was disturbed by what he learned and was game to go on air with it, even in what he calls a “dark red” state with a lot of “resistance” to the idea of climate change.

“We talked about it at length,” he says, “and we were prepared for a backlash.”

Researchers at George Mason University, with the help of Climate Central, tracked down information specific to Columbia, something many local meteorologists, with multiple weather reports a day, simply have no time to do. They also created graphics for Gandy to use whenever the local weather gave him a peg. And Gandy’s local TV station gave him something precious: 90 seconds of air time in the evening newscast.

The segment was called “Climate Matters,” and Gandy kicked it off in late July 2010. He dove in deep, packing his limited time with tidbits usually buried in scientific papers. One segment looked at what Columbia’s summer temperatures are projected to be in 40 years. Another explained how scientists can track man-made global warming, since carbon dioxide from fossil fuels has a specific chemical footprint.

Gandy also made the issue personal for viewers, reporting on how climate change will make pollen and poison ivy grow faster and more potent. He says people stopped him on the street about that.

And the backlash? There were a few cranky comments. “To my knowledge,” Gandy says, “there was at least one phone call from someone saying they needed to fire me.” But generally, the series went over well.

Better Informed Viewers

Meanwhile, researcher Ed Maibach polled people before Climate Matters began, then again a year into it. He says compared with viewers of other local stations, those who watched Jim Gandy gained a more scientifically grounded understanding of climate change, from understanding that it’s largely caused by humans, that it’s happening here and now and that it’s harmful.

“All of this is the kind of information that will help people, and help communities, make better decisions about how to adapt to a changing climate,” Maibach says.

Maibach hopes to expand the experiment, eventually making localized climate research and graphs available to meteorologists across the country. And there are other efforts to help weather forecasters become climate educators.

Last March, longtime Minnesota meteorologist Paul Douglas, founder of WeatherNationTV, posted an impassioned letter online urging his fellow Republicans to acknowledge that climate change is real.

“Other meteorologists actually emailed me and said, ‘Thanks for giving voice to something I’ve been thinking but was too afraid to say publicly,’ ” he says.

Douglas is part of a group pushing to tighten certification standards for meteorologists.

“If you’re going to talk about climate science on the air,” he says, you would “need to learn about the real science, and not get it off a talk show radio program or a website.”

After all, both he and Gandy say it’s becoming harder to separate weather from climate. That means TV weathercasters will be busier, and more closely followed, than ever.

Richard A. Muller: The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.
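For readers who want a concrete feel for this kind of comparison, here is a minimal sketch in Python. It uses invented, illustrative numbers rather than Berkeley Earth data, and plain least-squares fits; it only shows the mechanics of fitting several candidate predictors to a temperature series and comparing residual error, not the project's actual statistical methods.

import numpy as np

# Hypothetical decade grid and toy series; NOT Berkeley Earth data.
years = np.arange(1850, 2011, 10)
co2 = np.linspace(285, 390, len(years))                             # toy CO2 curve, ppm
rng = np.random.default_rng(0)
temp = 0.8 * np.log(co2 / 285) + rng.normal(0, 0.03, len(years))    # toy land anomaly, deg C

# Candidate predictors, in the spirit of "try several shapes and see which matches best".
candidates = {
    "log CO2": np.log(co2),
    "linear time": years.astype(float),
    "quadratic time": (years - years[0]).astype(float) ** 2,
}

for name, x in candidates.items():
    A = np.column_stack([x, np.ones_like(x)])                       # fit a*x + b by least squares
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
    rmse = np.sqrt(np.mean((A @ coef - temp) ** 2))
    print(f"{name:15s} residual RMSE = {rmse:.3f} deg C")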

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

Power of Suggestion (The Chronicle of Higher Education)

January 30, 2013

The amazing influence of unconscious cues is among the most fascinating discoveries of our time—that is, if it’s true

By Tom Bartlett

New Haven, Conn.

John Bargh rocked the world of social psychology with experiments that showed the power of unconscious cues over our behavior. (Photo: Mark Abramson for The Chronicle Review)

A framed print of “The Garden of Earthly Delights” hangs above the moss-green, L-shaped sectional in John Bargh’s office on the third floor of Yale University’s Kirtland Hall. Hieronymus Bosch’s famous triptych imagines a natural environment that is like ours (water, flowers) yet not (enormous spiked and translucent orbs). What precisely the 15th-century Dutch master had in mind is still a mystery, though theories abound. On the left is presumably paradise, in the middle is the world, and on the right is hell, complete with knife-faced monster and human-devouring bird devil.

By Bosch’s standard, it’s too much to say the past year has been hellish for Bargh, but it hasn’t been paradise either. Along with personal upheaval, including a lengthy child-custody battle, he has coped with what amounts to an assault on his life’s work, the research that pushed him into prominence, the studies that Malcolm Gladwell called “fascinating” and Daniel Kahneman deemed “classic.” What was once widely praised is now being pilloried in some quarters as emblematic of the shoddiness and shallowness of social psychology. When Bargh responded to one such salvo with a couple of sarcastic blog posts, he was ridiculed as going on a “one-man rampage.” He took the posts down and regrets writing them, but his frustration and sadness at how he’s been treated remain.

Psychology may be simultaneously at the highest and lowest point in its history. Right now its niftiest findings are routinely simplified and repackaged for a mass audience; if you wish to publish a best seller sans bloodsucking or light bondage, you would be well advised to match a few dozen psychological papers with relatable anecdotes and a grabby, one-word title. That isn’t true across the board. Researchers engaged in more technical work on, say, the role of grapheme units in word recognition must comfort themselves with the knowledge that science is, by its nature, incremental. But a social psychologist with a sexy theory has star potential. In the last decade or so, researchers have made astonishing discoveries about the role of consciousness, the reasons for human behavior, the motivations for why we do what we do. This stuff is anything but incremental.

At the same time, psychology has been beset with scandal and doubt. Formerly high-flying researchers like Diederik Stapel, Marc Hauser, and Dirk Smeesters saw their careers implode after allegations that they had cooked their results and managed to slip them past the supposedly watchful eyes of peer reviewers. Psychology isn’t the only field with fakers, but it has its share. Plus there’s the so-called file-drawer problem, that is, the tendency for researchers to publish their singular successes and ignore their multiple failures, making a fluke look like a breakthrough. Fairly or not, social psychologists are perceived to be less rigorous in their methods, generally not replicating their own or one another’s work, instead pressing on toward the next headline-making outcome.

Much of the criticism has been directed at priming. The definitions get dicey here because the term can refer to a range of phenomena, some of which are grounded in decades of solid evidence—like the “anchoring effect,” which happens, for instance, when a store lists a competitor’s inflated price next to its own to make you think you’re getting a bargain. That works. The studies that raise eyebrows are mostly in an area known as behavioral or goal priming, research that demonstrates how subliminal prompts can make you do all manner of crazy things. A warm mug makes you friendlier. The American flag makes you vote Republican. Fast-food logos make you impatient. A small group of skeptical psychologists—let’s call them the Replicators—have been trying to reproduce some of the most popular priming effects in their own labs.

What have they found? Mostly that they can’t get those results. The studies don’t check out. Something is wrong. And because he is undoubtedly the biggest name in the field, the Replicators have paid special attention to John Bargh and the study that started it all.

As in so many other famous psychological experiments, the researcher lies to the subject. After rearranging lists of words into sensible sentences, the subject—a New York University undergraduate—is told that the experiment is about language ability. It is not. In fact, the real test doesn’t begin until the subject exits the room. In the hallway is a graduate student with a stopwatch hidden beneath her coat. She’s pretending to wait for a meeting but really she’s working with the researchers. She times how long it takes the subject to walk from the doorway to a strip of silver tape a little more than 30 feet down the corridor. The experiment hinges on that stopwatch.

The words the subject was asked to rearrange were not random, though they seemed that way (this was confirmed in postexperiment interviews with each subject). They were words like “bingo” and “Florida,” “knits” and “wrinkles,” “bitter” and “alone.” Reading the list, you can almost picture a stooped senior padding around a condo, complaining at the television. A control group unscrambled words that evoked no theme. When the walking times of the two groups were compared, the Florida-knits-alone subjects walked, on average, more slowly than the control group. Words on a page made them act old.
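To make concrete what “comparing the walking times of the two groups” amounts to, here is a minimal sketch with invented numbers (the study’s real data are not reproduced here): a two-sample t-test on hypothetical hallway times for a primed group and a control group.

from scipy import stats

# Hypothetical walking times in seconds; these are made-up illustrative values.
elderly_primed = [8.4, 8.1, 8.7, 8.9, 8.3, 8.6, 8.8, 8.2, 8.5, 8.7]
control = [7.9, 7.6, 8.0, 7.8, 8.1, 7.7, 7.5, 7.9, 8.0, 7.8]

t, p = stats.ttest_ind(elderly_primed, control)
print(f"mean primed  = {sum(elderly_primed) / len(elderly_primed):.2f} s")
print(f"mean control = {sum(control) / len(control):.2f} s")
print(f"t = {t:.2f}, p = {p:.4f}")   # a small p-value is what gets reported as a priming effect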

It’s a cute finding. But the more you think about it, the more serious it starts to seem. What if we are constantly being influenced by subtle, unnoticed cues? If “Florida” makes you sluggish, could “cheetah” make you fleet of foot? Forget walking speeds. Is our environment making us meaner or more creative or stupider without our realizing it? We like to think we’re steering the ship of self, but what if we’re actually getting blown about by ghostly gusts?

John Bargh and his co-authors, Mark Chen and Lara Burrows, performed that experiment in 1990 or 1991. They didn’t publish it until 1996. Why sit on such a fascinating result? For starters, they wanted to do it again, which they did. They also wanted to perform similar experiments with different cues. One of those other experiments tested subjects to see if they were more hostile when primed with an African-American face. They were. (The subjects were not African-American.) In the other experiment, the subjects were primed with rude words to see if that would make them more likely to interrupt a conversation. It did.

The researchers waited to publish until other labs had found the same type of results. They knew their finding would be controversial. They knew many people wouldn’t believe it. They were willing to stick their necks out, but they didn’t want to be the only ones.

Since that study was published in the Journal of Personality and Social Psychology, it has been cited more than 2,000 times. Though other researchers did similar work at around the same time, and even before, it was that paper that sparked the priming era. Its authors knew, even before it was published, that the paper was likely to catch fire. They wrote: “The implications for many social psychological phenomena … would appear to be considerable.” Translation: This is a huge deal.

When he was 9 or 10, Bargh decided to become a psychologist. He was in the kitchen of his family’s house in Champaign, Ill., when this revelation came to him. He didn’t know everything that would entail, of course, or what exactly a psychologist did, but he wanted to understand more about human emotion because it was this “mysterious powerful influence on everything.” His dad was an administrator at the University of Illinois, and so he was familiar with university campuses. He liked them. He still does. When he was in high school, he remembers arguing about B.F. Skinner. Everyone else in the class thought Skinner’s ideas were ridiculous. Bargh took the other side, not so much because he embraced the philosophy of radical behaviorism or enjoyed Skinner’s popular writings. It was more because he reveled in contrarianism. “This guy is thinking something nobody else agrees with,” he says now. “Let’s consider that he might be right.”

I met Bargh on a Thursday morning a couple of weeks before Christmas. He was dressed in cable-knit and worn jeans with hiking boots. At 58 he still has a full head of dark, appropriately mussed-up hair. Bargh was reclining on the previously mentioned moss-green sectional while downing coffee to stay alert as he whittled away at a thick stack of finals papers. He rose to greet me, sat back down, and sighed.

The last year has been tough for Bargh. Professionally, the nadir probably came in January, when a failed replication of the famous elderly-walking study was published in the journal PLoS ONE. It was not the first failed replication, but this one stung. In the experiment, the researchers had tried to mirror Bargh’s methods with an important exception: Rather than stopwatches, they used automatic timing devices with infrared sensors to eliminate any potential bias. The words didn’t make subjects act old. They tried the experiment again with stopwatches and added a twist: They told those operating the stopwatches which subjects were expected to walk slowly. Then it worked. The title of their paper tells the story: “Behavioral Priming: It’s All in the Mind, but Whose Mind?”

The paper annoyed Bargh. He thought the researchers didn’t faithfully follow his methods section, despite their claims that they did. But what really set him off was a blog post that explained the results. The post, on the blog Not Exactly Rocket Science, compared what happened in the experiment to the notorious case of Clever Hans, the horse that could supposedly count. It was thought that Hans was a whiz with figures, stomping a hoof in response to mathematical queries. In reality, the horse was picking up on body language from its handler. Bargh was the deluded horse handler in this scenario. That didn’t sit well with him. If the PLoS ONE paper is correct, the significance of his experiment largely dissipates. What’s more, he looks like a fool, tricked by a fairly obvious flaw in the setup.

Bargh responded in two long, detailed posts on his rarely updated Psychology Today blog. He spelled out the errors he believed were made in the PLoS ONE paper. Most crucially, he wrote, in the original experiment there was no way for the graduate student with the stopwatch to know who was supposed to walk slowly and who wasn’t. The posts were less temperate than most public discourse in science, but they were hardly mouth-foaming rants. He referred to “incompetent or ill-informed researchers,” clearly a shot at the paper’s authors. He mocked the journal where the replication was published as “pay to play” and lacking the oversight of traditional journals. The title of the post, “Nothing in Their Heads,” while perhaps a reference to unconscious behavior, seemed less than collegial.

He also expressed concern for readers who count on “supposedly reputable online media sources for accurate information on psychological science.” This was a dig at the blog post’s author, Ed Yong, who Bargh believes had written an unfair piece. “I was hurt by the things that were said, not just in the article, but in Ed Yong’s coverage of it,” Bargh says now. Yong’s post was more, though, than a credulous summary of the study. He interviewed researchers and provided context. The headline, “Why a classic psychology experiment isn’t what it seemed,” might benefit from softening, but if you’re looking for an example of sloppy journalism, this ain’t it.

While Bargh was dismayed by the paper and the publicity, the authors of the replication were equally taken aback by the severity of Bargh’s reaction. “That really threw us off, that response,” says Axel Cleeremans, a professor of cognitive science at the Université Libre de Bruxelles. “It was obvious that he was so dismissive, it was close to frankly insulting. He described us as amateur experimentalists, which everyone knows we are not.” Nor did they feel that his critique of their methods was valid. Even so, they tried the experiment again, taking into account Bargh’s concerns. It still didn’t work.

Bargh took his blog posts down after they were criticized. Though his views haven’t changed, he feels bad about his tone. In our conversations over the last month or so, Bargh has at times vigorously defended his work, pointing to a review he published recently in Trends in Cognitive Sciences that marshals recent priming studies into a kind of state-of-the-field address. Short version: Science marches on, priming’s doing great.

He complains that he has been a victim of scientific bullying (and some sympathetic toward Bargh use that phrase, too). There are other times, though, when he just seems crushed. “You invest your whole career and life in something, and to have this happen near the end of it—it’s very hard to take,” he says. Priming is what Bargh is known for. When he says “my name is a symbol that stands for these kinds of effects,” he’s not being arrogant. That’s a fact. Before the 1996 paper, he had already published respected and much-cited work on unconscious, automatic mental processes, but priming has defined him. In an interview on the Web site Edge a few years ago, back before the onslaught, he explained his research goals: “We have a trajectory downward, always downward, trying to find simple, basic causes and with big effects. We’re looking for simple things—not anything complicated—simple processes or concepts that then have profound effects.” The article labeled him “the simplifier.”

When I ask if he still believes in these effects, he says yes. They have been replicated in multiple labs. Some of those replications have been exact: stopwatch, the same set of words, and so on. Others have been conceptual. While they explore the same idea, maybe the study is about handwriting rather than walking. Maybe it’s about obesity rather than elderly stereotypes. But the gist is the same. “It’s not just my work that’s under attack here,” Bargh says. “It’s lots of people’s research being attacked and dismissed.” He has moments of doubt. How could he not? It’s deeply unsettling to have someone scrutinizing your old papers, looking for inconsistencies, even if you’re fairly confident about what you’ve accomplished. “Maybe there’s something we were doing that I didn’t realize,” he says, explaining the thoughts that have gone through his head. “You start doing that examination.”

So why not do an actual examination? Set up the same experiments again, with additional safeguards. It wouldn’t be terribly costly. No need for a grant to get undergraduates to unscramble sentences and stroll down a hallway. Bargh says he wouldn’t want to force his graduate students, already worried about their job prospects, to spend time on research that carries a stigma. Also, he is aware that some critics believe he’s been pulling tricks, that he has a “special touch” when it comes to priming, a comment that sounds like a compliment but isn’t. “I don’t think anyone would believe me,” he says.

Harold Pashler wouldn’t. Pashler, a professor of psychology at the University of California at San Diego, is the most prolific of the Replicators. He started trying priming experiments about four years ago because, he says, “I wanted to see these effects for myself.” That’s a diplomatic way of saying he thought they were fishy. He’s tried more than a dozen so far, including the elderly-walking study. He’s never been able to achieve the same results. Not once.

This fall, Daniel Kahneman, the Nobel Prize-winning psychologist, sent an e-mail to a small group of psychologists, including Bargh, warning of a “train wreck looming” in the field because of doubts surrounding priming research. He was blunt: “I believe that you should collectively do something about this mess. To deal effectively with the doubts you should acknowledge their existence and confront them straight on, because a posture of defiant denial is self-defeating,” he wrote.

Strongly worded e-mails from Nobel laureates tend to get noticed, and this one did. He sent it after conversations with Bargh about the relentless attacks on priming research. Kahneman cast himself as a mediator, a sort of senior statesman, endeavoring to bring together believers and skeptics. He does have a dog in the fight, though: Kahneman believes in these effects and has written admiringly of Bargh, including in his best seller Thinking, Fast and Slow.

On the heels of that message from on high, an e-mail dialogue began between the two camps. The vibe was more conciliatory than what you hear when researchers are speaking off the cuff and off the record. There was talk of the type of collaboration that Kahneman had floated, researchers from opposing sides combining their efforts in the name of truth. It was very civil, and it didn’t lead anywhere.

In one of those e-mails, Pashler issued a challenge masquerading as a gentle query: “Would you be able to suggest one or two goal priming effects that you think are especially strong and robust, even if they are not particularly well-known?” In other words, put up or shut up. Point me to the stuff you’re certain of and I’ll try to replicate it. This was intended to counter the charge that he and others were cherry-picking the weakest work and then doing a victory dance after demolishing it. He didn’t get the straightforward answer he wanted. “Some suggestions emerged but none were pointing to a concrete example,” he says.

One possible explanation for why these studies continually and bewilderingly fail to replicate is that they have hidden moderators, sensitive conditions that make them a challenge to pull off. Pashler argues that the studies never suggest that. He wrote in that same e-mail: “So from our reading of the literature, it is not clear why the results should be subtle or fragile.”

Bargh contends that we know more about these effects than we did in the 1990s, that they’re more complicated than researchers had originally assumed. That’s not a problem, it’s progress. And if you aren’t familiar with the literature in social psychology, with the numerous experiments that have modified and sharpened those early conclusions, you’re unlikely to successfully replicate them. Then you will trot out your failure as evidence that the study is bogus when really what you’ve proved is that you’re no good at social psychology.

Pashler can’t quite disguise his disdain for such a defense. “That doesn’t make sense to me,” he says. “You published it. That must mean you think it is a repeatable piece of work. Why can’t we do it just the way you did it?”

That’s how David Shanks sees things. He, too, has been trying to replicate well-known priming studies, and he, too, has been unable to do so. In a forthcoming paper, Shanks, a professor of psychology at University College London, recounts his and his several co-authors’ attempts to replicate one of the most intriguing effects, the so-called professor prime. In the study, one group was told to imagine a professor’s life and then list the traits that brought to mind. Another group was told to do the same except with a soccer hooligan rather than a professor.

The groups were then asked questions selected from the board game Trivial Pursuit, questions like “Who painted ‘Guernica’?” and “What is the capital of Bangladesh?” (Picasso and Dhaka, for those playing at home.) Their scores were then tallied. The subjects who imagined the professor scored above a control group that wasn’t primed. The subjects who imagined soccer hooligans scored below the professor group and below the control. Thinking about a professor makes you smart while thinking about a hooligan makes you dumb. The study has been replicated a number of times, including once on Dutch television.

Shanks can’t get the result. And, boy, has he tried. Not once or twice, but nine times.

The skepticism about priming, says Shanks, isn’t limited to those who have committed themselves to reperforming these experiments. It’s not only the Replicators. “I think more people in academic psychology than you would imagine appreciate the historical implausibility of these findings, and it’s just that those are the opinions that they have over the water fountain,” he says. “They’re not the opinions that get into the journalism.”

Like all the skeptics I spoke with, Shanks believes the worst is yet to come for priming, predicting that “over the next two or three years you’re going to see an avalanche of failed replications published.” The avalanche may come sooner than that. There are failed replications in press at the moment and many more that have been completed (Shanks’s paper on the professor prime is in press at PLoS ONE). A couple of researchers I spoke with didn’t want to talk about their results until they had been peer reviewed, but their preliminary results are not encouraging.

Ap Dijksterhuis is the author of the professor-prime paper. At first, Dijksterhuis, a professor of psychology at Radboud University Nijmegen, in the Netherlands, wasn’t sure he wanted to be interviewed for this article. That study is ancient news—it was published in 1998, and he’s moved away from studying unconscious processes in the last couple of years, in part because he wanted to move on to new research on happiness and in part because of the rancor and suspicion that now accompany such work. He’s tired of it.

The outing of Diederik Stapel made the atmosphere worse. Stapel was a social psychologist at Tilburg University, also in the Netherlands, who was found to have committed scientific misconduct in scores of papers. The scope and the depth of the fraud were jaw-dropping, and it changed the conversation. “It wasn’t about research practices that could have been better. It was about fraud,” Dijksterhuis says of the Stapel scandal. “I think that’s playing in the background. It now almost feels as if people who do find significant data are making mistakes, are doing bad research, and maybe even doing fraudulent things.”

In the e-mail discussion spurred by Kahneman’s call to action, Dijksterhuis laid out a number of possible explanations for why skeptics were coming up empty when they attempted priming studies. Cultural differences, for example. Studying prejudice in the Netherlands is different from studying it in the United States. Certain subjects are not susceptible to certain primes, particularly a subject who is unusually self-aware. In an interview, he offered another, less charitable possibility. “It could be that they are bad experimenters,” he says. “They may turn out failures to replicate that have been shown by 15 or 20 people already. It basically shows that it’s something with them, and it’s something going on in their labs.”

Joseph Cesario is somewhere between a believer and a skeptic, though these days he’s leaning more skeptic. Cesario is a social psychologist at Michigan State University, and he’s successfully replicated Bargh’s elderly-walking study, discovering in the course of the experiment that the attitude of a subject toward the elderly determined whether the effect worked or not. If you hate old people, you won’t slow down. He is sympathetic to the argument that moderators exist that make these studies hard to replicate, lots of little monkey wrenches ready to ruin the works. But that argument only goes so far. “At some point, it becomes excuse-making,” he says. “We have to have some threshold where we say that it doesn’t exist. It can’t be the case that some small group of people keep hitting on the right moderators over and over again.”

Cesario has been trying to replicate a recent finding of Bargh’s. In that study, published last year in the journal Emotion, Bargh and his co-author, Idit Shalev, asked subjects about their personal hygiene habits—how often they showered and bathed, for how long, how warm they liked the water. They also had subjects take a standard test to determine their degree of social isolation, whether they were lonely or not. What they found is that lonely people took longer and warmer baths and showers, perhaps substituting the warmth of the water for the warmth of regular human interaction.

That isn’t priming, exactly, though it is a related unconscious phenomenon often called embodied cognition. As in the elderly-walking study, the subjects didn’t realize what they were doing, didn’t know they were bathing longer because they were lonely. Can warm water alleviate feelings of isolation? This was a result with real-world applications, and reporters jumped on it. “Wash the loneliness away with a long, hot bath,” read an NBC News headline.

Bargh’s study had 92 subjects. So far Cesario has run more than 2,500 through the same experiment. He’s found absolutely no relationship between bathing and loneliness. Zero. “It’s very worrisome if you have people thinking they can take a shower and they can cure their depression,” he says. And he says Bargh’s data are troublesome. “Extremely small samples, extremely large effects—that’s a red flag,” he says. “It’s not a red flag for people publishing those studies, but it should be.”
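The statistical intuition behind that red flag can be illustrated with a short simulation of my own (not from the article): when there is no real effect at all, small samples throw up large spurious correlations far more often than large samples do.

import numpy as np

rng = np.random.default_rng(1)

def spurious_rate(n, trials=2000, big=0.3):
    # Fraction of trials in which |r| exceeds `big` even though x and y are
    # generated independently, i.e. the true effect is exactly zero.
    hits = 0
    for _ in range(trials):
        x = rng.normal(size=n)      # e.g. loneliness scores (unrelated by construction)
        y = rng.normal(size=n)      # e.g. shower duration
        if abs(np.corrcoef(x, y)[0, 1]) > big:
            hits += 1
    return hits / trials

for n in (30, 92, 2500):
    print(f"n = {n:5d}: chance of |r| > 0.3 with no real effect = {spurious_rate(n):.3f}")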

Even though he is, in a sense, taking aim at Bargh, Cesario thinks it’s a shame that the debate over priming has become so personal, as if it’s a referendum on one man. “He has the most eye-catching findings. He always has,” Cesario says. “To the extent that some of his effects don’t replicate, because he’s identified as priming, it casts doubt on the entire body of research. He is priming.”

That has been the narrative. Bargh’s research is crumbling under scrutiny and, along with it, perhaps priming as a whole. Maybe the most exciting aspect of social psychology over the last couple of decades, these almost magical experiments in which people are prompted to be smarter or slower without them even knowing it, will end up as an embarrassing footnote rather than a landmark achievement.

Then along comes Gary Latham.

Latham, an organizational psychologist in the management school at the University of Toronto, thought the research Bargh and others did was crap. That’s the word he used. He told one of his graduate students, Amanda Shantz, that if she tried to apply Bargh’s principles it would be a win-win. If it failed, they could publish a useful takedown. If it succeeded … well, that would be interesting.

They performed a pilot study, which involved showing subjects a photo of a woman winning a race before the subjects took part in a brainstorming task. As Bargh’s research would predict, the photo made them perform better at the brainstorming task. Or seemed to. Latham performed the experiment again in cooperation with another lab. This time the study involved employees in a university fund-raising call center. They were divided into three groups. Each group was given a fact sheet that would be visible while they made phone calls. In the upper left-hand corner of the fact sheet was either a photo of a woman winning a race, a generic photo of employees at a call center, or no photo. Again, consistent with Bargh, the subjects who were primed raised more money. Those with the photo of call-center employees raised the most, while those with the race-winner photo came in second, both outpacing the photo-less control. This was true even though, when questioned afterward, the subjects said they had been too busy to notice the photos.

Latham didn’t want Bargh to be right. “I couldn’t have been more skeptical or more disbelieving when I started the research,” he says. “I nearly fell off my chair when my data” supported Bargh’s findings.

That experiment has changed Latham’s opinion of priming and has him wondering now about the applications for unconscious primes in our daily lives. Are there photos that would make people be safer at work? Are there photos that undermine performance? How should we be fine-tuning the images that surround us? “It’s almost scary in lots of ways that these primes in these environments can affect us without us being aware,” he says. Latham hasn’t stopped there. He’s continued to try experiments using Bargh’s ideas, and those results have only strengthened his confidence in priming. “I’ve got two more that are just mind-blowing,” he says. “And I know John Bargh doesn’t know about them, but he’ll be a happy guy when he sees them.”

Latham doesn’t know why others have had trouble. He only knows what he’s found, and he’s certain about his own data. In the end, Latham thinks Bargh will be vindicated as a pioneer in understanding unconscious motivations. “I’m like a converted Christian,” he says. “I started out as a devout atheist, and now I’m a believer.”

Following his come-to-Jesus transformation, Latham sent an e-mail to Bargh to let him know about the call-center experiment. When I brought this up with Bargh, his face brightened slightly for the first time in our conversation. “You can imagine how that helped me,” he says. He had been feeling isolated, under siege, worried that his legacy was becoming a cautionary tale. “You feel like you’re on an island,” he says.

Though Latham is now a believer, he remains the exception. With more failed replications in the pipeline, Dijksterhuis believes that Kahneman’s looming-train-wreck letter, though well meaning, may become a self-fulfilling prophecy, helping to sink the field rather than save it. Perhaps the perception has already become so negative that further replications, regardless of what they find, won’t matter much. For his part, Bargh is trying to take the long view. “We have to think about 50 or 100 years from now—are people going to believe the same theories?” he says. “Maybe it’s not true. Let’s see if it is or isn’t.”

Tom Bartlett is a senior writer at The Chronicle.

New Research Shows Complexity of Global Warming (Science Daily)

Jan. 30, 2013 — Global warming from greenhouse gases affects rainfall patterns in the world differently than that from solar heating, according to a study by an international team of scientists in the January 31 issue of Nature. Using computer model simulations, the scientists, led by Jian Liu (Chinese Academy of Sciences) and Bin Wang (International Pacific Research Center, University of Hawaii at Manoa), showed that global rainfall has increased less over the present-day warming period than during the Medieval Warm Period, even though temperatures are higher today than they were then.

Clouds over the Pacific Ocean. (Credit: Shang-Ping Xie)

The team examined global precipitation changes over the last millennium and projections to the end of the 21st century, comparing natural changes from solar heating and volcanism with changes from human-made greenhouse gas emissions. Using an atmosphere-ocean coupled climate model that realistically simulates both past and present-day climate conditions, the scientists found that, for every degree rise in global temperature, the global rainfall rate since the Industrial Revolution has increased about 40% less than it did during past warming phases of Earth.

Why does warming from solar heating and from greenhouse gases have such different effects on global precipitation?

“Our climate model simulations show that this difference results from different sea surface temperature patterns. When warming is due to increased greenhouse gases, the gradient of sea surface temperature (SST) across the tropical Pacific weakens, but when it is due to increased solar radiation, the gradient increases. For the same average global surface temperature increase, the weaker SST gradient produces less rainfall, especially over tropical land,” says co-author Bin Wang, professor of meteorology.

But why does warming from greenhouse gases and from solar heating affect the tropical Pacific SST gradient differently?

“Adding long-wave absorbers, that is heat-trapping greenhouse gases, to the atmosphere decreases the usual temperature difference between the surface and the top of the atmosphere, making the atmosphere more stable,” explains lead-author Jian Liu. “The increased atmospheric stability weakens the trade winds, resulting in stronger warming in the eastern than the western Pacific, thus reducing the usual SST gradient — a situation similar to El Niño.”

Solar radiation, on the other hand, heats Earth’s surface, increasing the usual temperature difference between the surface and the top of the atmosphere without weakening the trade winds. The result is that heating warms the western Pacific, while the eastern Pacific remains cool from the usual ocean upwelling.

“While during past global warming from solar heating the steeper tropical east-west SST pattern has won out, we suggest that with future warming from greenhouse gases, the weaker gradient and smaller increase in yearly rainfall rate will win out,” concludes Wang.

Journal Reference:

  1. Jian Liu, Bin Wang, Mark A. Cane, So-Young Yim, June-Yi Lee. Divergent global precipitation changes induced by natural versus anthropogenic forcing. Nature, 2013; 493 (7434): 656. DOI: 10.1038/nature11784

Understanding the Historical Probability of Drought (Science Daily)

Jan. 30, 2013 — Droughts can severely limit crop growth, causing yearly losses of around $8 billion in the United States. But it may be possible to minimize those losses if farmers can synchronize the growth of crops with periods of time when drought is less likely to occur. Researchers from Oklahoma State University are working to create a reliable “calendar” of seasonal drought patterns that could help farmers optimize crop production by avoiding days prone to drought.

Historical probabilities of drought, which can point to days on which crop water stress is likely, are often calculated using atmospheric data such as rainfall and temperatures. However, those measurements do not consider the soil properties of individual fields or sites.

“Atmospheric variables do not take into account soil moisture,” explains Tyson Ochsner, lead author of the study. “And soil moisture can provide an important buffer against short-term precipitation deficits.”

In an attempt to more accurately assess drought probabilities, Ochsner and co-authors, Guilherme Torres and Romulo Lollato, used 15 years of soil moisture measurements from eight locations across Oklahoma to calculate soil water deficits and determine the days on which dry conditions would be likely. Results of the study, which began as a student-led class research project, were published online Jan. 29 in Agronomy Journal. The researchers found that soil water deficits more successfully identified periods during which plants were likely to be water stressed than did traditional atmospheric measurements when used as proposed by previous research.

Soil water deficit is defined in the study as the difference between the capacity of the soil to hold water and the actual water content calculated from long-term soil moisture measurements. Researchers then compared that soil water deficit to a threshold at which plants would experience water stress and, therefore, drought conditions. The threshold was determined for each study site since available water, a factor used to calculate threshold, is affected by specific soil characteristics.
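The deficit logic described above can be sketched in a few lines of Python. The numbers are assumed, illustrative values, not measurements from the Oklahoma sites: a water-holding capacity for the soil profile, a site-specific stress threshold, and soil water readings for one calendar day across several years.

# Minimal sketch, with assumed values, of the deficit-and-threshold idea.
capacity_mm = 150.0      # assumed plant-available water capacity of the soil profile
threshold_mm = 60.0      # assumed site-specific stress threshold

# Hypothetical soil water content (mm) observed on the same calendar day in 15 different years.
water_content_mm = [120, 95, 80, 70, 110, 65, 55, 90, 85, 100, 60, 75, 130, 88, 92]

deficits = [capacity_mm - w for w in water_content_mm]
stressed_years = sum(d > threshold_mm for d in deficits)
print(f"historical drought probability on this day: {stressed_years / len(deficits):.0%}")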

“The soil water contents differ across sites and depths depending on the sand, silt, and clay contents,” says Ochsner. “Readily available water is a site- and depth-specific parameter.”

Upon calculating soil water deficits and stress thresholds for the study sites, the research team compared their assessment of drought probability to assessments made using atmospheric data. They found that a previously developed method using atmospheric data often underestimated drought conditions, while soil water deficit measurements more accurately and consistently assessed drought probabilities. Therefore, the researchers suggest that soil water data be used whenever it is available to create a picture of the days on which drought conditions are likely.

If soil measurements are not available, however, the researchers recommend that the calculations used for atmospheric assessments be reconfigured to be more accurate. The authors made two such changes in their study. First, they decreased the threshold at which plants were deemed stressed, thus allowing a smaller deficit to be considered a drought condition. They also increased the number of days over which atmospheric deficits were summed. Those two changes provided estimates that better agreed with soil water deficit probabilities.

Further research is needed, says Ochsner, to optimize atmospheric calculations and provide accurate estimations for those without soil water data. “We are in a time of rapid increase in the availability of soil moisture data, but many users will still have to rely on the atmospheric water deficit method for locations where soil moisture data are insufficient.”

Regardless of the method used, Ochsner and his team hope that their research will help farmers better plan the cultivation of their crops and avoid costly losses to drought conditions.

Journal Reference:

  1. Guilherme M. Torres, Romulo P. Lollato, Tyson E. Ochsner. Comparison of Drought Probability Assessments Based on Atmospheric Water Deficit and Soil Water Deficit. Agronomy Journal, 2013; DOI: 10.2134/agronj2012.0295

Revolution in the Universities (OESP)

JC e-mail 4656, January 30, 2013.

Article by Thomas Friedman* in The New York Times, published in O Estado de São Paulo

The advance of online higher education at the best schools will make the concept of a degree archaic; and that is a good thing
God knows there is plenty of bad news in today's world to get us down, but something wonderful is happening that makes me hopeful about the future. It is the budding revolution in online higher education.

Nothing has more potential to lift people out of poverty – by offering them an affordable education that will help them get a job or improve the one they have.
Nothing has more potential to unlock a billion more brains to solve the world's biggest problems.

And nothing has more potential to reinvent higher education than the MOOCs (Massive Open Online Courses), platforms developed by specialists at Stanford, by colleagues at MIT (Massachusetts Institute of Technology) and by companies such as Coursera and Udacity.

In May, I wrote an article about Coursera – founded by two Stanford computer scientists, Daphne Koller and Andrew Ng. Two weeks ago, I returned to Palo Alto to check on its progress. When I visited Coursera in 2012, some 300,000 people were taking 38 courses taught by professors from Stanford and other elite universities.

Today there are 2.4 million students taking 214 courses from 33 universities, including eight international ones. Anant Agarwal, former director of MIT's artificial intelligence laboratory, is now president of edX, a nonprofit platform created jointly by MIT and Harvard University. Anant said that, since May, some 155,000 students from around the world have taken edX's first course: an introductory MIT course on circuits.

"That is more than the total number of MIT students in its 150-year history," he said.
Of course, only a small percentage of these students complete the course, but I am convinced that within five years these platforms will reach a much wider audience. Imagine how this could change U.S. foreign aid.

Spending relatively little, the country could lease a space in an Egyptian village, install two dozen computers and high-speed satellite internet access, hire a local teacher as a facilitator, and invite any Egyptian who wishes to take online classes with the best professors in the world, subtitled in Arabic.

You have to hear the stories told by the pioneers of this initiative to understand its revolutionary potential. One of Daphne Koller's favorites is about Daniel, a 17-year-old with autism who communicates through a computer. He took an online course in modern poetry offered by the University of Pennsylvania.

According to Daniel and his parents, the combination of a rigorous academic curriculum, which requires him to concentrate on his work, and an online learning system that does not strain his ability to relate to others allows him to manage his autism better.

Daphne showed me a letter from Daniel in which he wrote: "Please tell Coursera and the University of Pennsylvania my story. I am a young man emerging from autism. I still cannot sit in a classroom, so this was my first real course.

Now I know I can benefit from work that demands a lot of me and take pleasure in tuning in to the world." A member of the Coursera team who took a course on sustainability told me it was far more interesting than a similar course he had taken in college. The online course drew students from all over the world, and so "the discussions that emerged were far more valuable and interesting than the debates among the similar kinds of people at a typical American college." Mitch Duneier, a Princeton sociology professor, wrote an essay about his experience teaching a Coursera course.

"A few months ago, as the Princeton campus fell nearly silent after graduation ceremonies, 40,000 students from 113 countries arrived here via the internet for a free introductory sociology course. My opening lecture, on C. Wright Mills's 1959 classic The Sociological Imagination, focused on a close reading of a key chapter. I asked the students to follow the analysis in their own copies, as I do in the classroom. When I give this lecture at Princeton, a few insightful questions usually come up. In this case, a few hours after I posted the online version, the forums caught fire, with hundreds of comments and questions. A few days later, there were thousands. In the space of three weeks I received more feedback on my ideas in the field of sociology than I had in my entire teaching career, and it has considerably influenced each of my subsequent lectures and seminars."

Anant Agarwal of edX tells of a student in Cairo who was struggling and posted a message saying he intended to drop the online course. In response, other students in Cairo from the same class invited him to meet at a tea house, where they offered to help him. A 15-year-old student from Mongolia who was in the same class, taking it in a blended online and in-person format, is now applying to MIT and the University of California, Berkeley.

As we think about the future of higher education, according to MIT president Rafael Reif, something we now call a "degree" will be a concept tied to "bricks and mortar" – and to traditional on-campus experiences, which will increasingly leverage technology and the internet to enhance classroom and laboratory work.

Alongside that, however, many universities will offer online courses to students anywhere in the world, through which they will earn "credentials" – certificates attesting that they did the work and passed all the exams.

The process of creating trustworthy credentials certifying that a student has adequately mastered the subject – and that an employer can rely on – is still being perfected by all the MOOCs. But once that question is resolved, this phenomenon will truly spread.

I can see the day when you will create your own college degree by taking the best online courses from the most qualified professors around the world – computer science from Stanford, entrepreneurship from Wharton, ethics from Brandeis, literature from the University of Edinburgh – paying only a fee for the certificate of completion. It will change teaching, learning and the path to employment.

"A new world is unfolding," said Reif. "And everyone will have to adapt."

* Thomas Friedman is a columnist for The New York Times. (The text was translated by Terezinha Martinho for O Estado de São Paulo.)