Monthly archive: July 2012

A new boson in sight (FAPESP)

Physicists at CERN have discovered a new particle that appears to be the Higgs boson

MARCOS PIVETTA | Online edition, 7:46 p.m., July 4, 2012

Proton collision in which four high-energy electrons are observed (green lines and red towers). The event shows characteristics expected from the decay of a Higgs boson, but it is also consistent with standard model background processes

From Lindau, Germany*

The largest laboratory in the world may have found the particle that gives mass to all other particles, the long-sought Higgs boson. It was the missing piece of a scientific puzzle called the standard model, the theoretical framework built over recent decades to explain the particles and forces present in the visible matter of the Universe. After analyzing trillions of proton collisions produced in 2011 and in part of this year at the Large Hadron Collider (LHC), physicists from the two largest experiments run independently at the European Organization for Nuclear Research (CERN) announced on Wednesday (July 4), near Geneva, Switzerland, the discovery of a new particle that has nearly all the characteristics of the Higgs boson, although they cannot yet say with certainty whether it is specifically that boson or some other kind.

“We observe in our data clear signs of a new particle in the mass region around 126 GeV (giga-electron volts),” said physicist Fabiola Gianotti, spokesperson for the ATLAS experiment. “But we need a little more time to prepare the results for publication.” The information coming from the other CERN experiment, CMS, is practically identical. “The results are preliminary, but the signals we see around the 125 GeV mass region are dramatic. It really is a new particle. We know it must be a boson, and it is the heaviest boson we have found,” said the CMS spokesperson, physicist Joe Incandela. If it does have a mass of 125 or 126 GeV, the new particle is about as heavy as an atom of the chemical element iodine.

In both experiments, the statistical confidence of the analyses reached the level scientists call 5 sigma, at which the chance that the signal is a mere statistical fluctuation is roughly one in 3.5 million. In other words, with that level of certainty it is possible to say that a discovery has been made; what is not yet known in detail is the nature of the particle found. “It is incredible that this discovery has happened in my lifetime,” commented Peter Higgs, the British theoretical physicist who, 50 years ago, alongside other scientists, predicted the existence of this type of boson. Later this month, a paper with the LHC data is expected to be submitted to a scientific journal. By the end of the year, when the accelerator will be shut down for at least a year and a half of maintenance, the two experiments should produce more data.
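For reference, a minimal sketch (not from the article) of how the quoted 5-sigma threshold maps onto a probability, using the standard one-sided Gaussian convention:

```python
import math

# One-sided tail probability of a standard normal at 5 sigma: the chance
# that background alone fluctuates up this far.
sigma = 5.0
p_value = 0.5 * math.erfc(sigma / math.sqrt(2))
print(f"p-value at {sigma} sigma: {p_value:.2e}")   # ~2.9e-07
print(f"roughly 1 in {1 / p_value:,.0f}")           # ~1 in 3.5 million
```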

“I’ve been laughing all day”
In Lindau, a small town in southern Germany on the shore of Lake Constance, at the border with Austria and Switzerland, where the 62nd Meeting of Nobel Laureates is taking place this week, researchers celebrated the news coming from the CERN experiments. Since the theme of this year’s meeting was physics, there was no shortage of laureates of science’s highest honor to comment on the feat. “We don’t know if it is the (Higgs) boson, but it is a boson,” said theoretical physicist David J. Gross of the University of California, winner of the 2004 Nobel Prize for the discovery of asymptotic freedom. “I’ve been laughing all day.” Experimental physicist Carlo Rubbia, former director-general of CERN and winner of the 1984 Nobel Prize for work that led to the identification of the W and Z bosons, took the same line. “We are facing a milestone,” he said.

Perhaps with somewhat less enthusiasm, but still acknowledging the enormous importance of the finding at CERN, two other Nobel laureates gave their opinion on the day’s news. “It is something we have been waiting for for years,” said Dutch theoretical physicist Martinus Veltman, who received the prize in 1999. “The standard model has gained another degree of validity.” For American cosmologist George Smoot, winner of the 2006 Nobel Prize for the discovery of the cosmic background radiation (a relic of the Big Bang, the primordial explosion that created the Universe), it should still take two or three years for scientists to really know what kind of new particle has been discovered. If the new particle turns out not to be the Higgs boson, Smoot said it would be “wonderful if it were something related to dark matter,” a mysterious component that, together with visible matter and the even less understood dark energy, is thought to be one of the pillars of the Universe.

Particles with the properties of the Higgs boson cannot be measured directly, but their existence, however fleeting, would leave traces that can be detected in an accelerator as powerful as the LHC. Unstable and short-lived, Higgs bosons survive for a tiny fraction of a second before decaying into lighter particles, which in turn decay and give rise to particles that are lighter still. The standard model predicts that, depending on its mass, the Higgs boson should decay through different channels, that is, into distinct combinations of lighter particles, such as two photons or four leptons. In the CERN experiments, which involved about 6,000 physicists, nearly unequivocal evidence was found of the decay modes that would be the typical signature of the Higgs boson.
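The article does not spell out how such decays are reconstructed; purely as an illustration, a two-photon candidate is usually characterized by the invariant mass of the photon pair, which piles up near the boson’s mass (the numbers below are made up):

```python
import math

def diphoton_invariant_mass(e1_gev, e2_gev, angle_rad):
    """Invariant mass (GeV) of two photons with energies e1, e2 and
    opening angle `angle_rad` between them (photons taken as massless)."""
    return math.sqrt(2.0 * e1_gev * e2_gev * (1.0 - math.cos(angle_rad)))

# Illustrative values only: two ~70 GeV photons nearly back-to-back
# reconstruct to a mass in the region being reported.
print(diphoton_invariant_mass(70.0, 70.0, math.radians(127)))  # ~125 GeV
```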

*Journalist Marcos Pivetta traveled to Lindau at the invitation of the DAAD (German Academic Exchange Service)

Elinor Ostrom, defender of the commons, died on June 12th, aged 78 (The Economist)

Jun 30th 2012 | from the print edition

IT SEEMED to Elinor Ostrom that the world contained a large body of common sense. People, left to themselves, would sort out rational ways of surviving and getting along. Although the world’s arable land, forests, fresh water and fisheries were all finite, it was possible to share them without depleting them and to care for them without fighting. While others wrote gloomily of the tragedy of the commons, seeing only overfishing and overfarming in a free-for-all of greed, Mrs Ostrom, with her loud laugh and louder tops, cut a cheery and contrarian figure.

Years of fieldwork, by herself and others, had shown her that humans were not trapped and helpless amid diminishing supplies. She had looked at forests in Nepal, irrigation systems in Spain, mountain villages in Switzerland and Japan, fisheries in Maine and Indonesia. She had even, as part of her PhD at the University of California, Los Angeles, studied the water wars and pumping races going on in the 1950s in her own dry backyard.

All these cases had taught her that, over time, human beings tended to draw up sensible rules for the use of common-pool resources. Neighbours set boundaries and assigned shares, with each individual taking it in turn to use water, or to graze cows on a certain meadow. Common tasks, such as clearing canals or cutting timber, were done together at a certain time. Monitors watched out for rule-breakers, fining or eventually excluding them. The schemes were mutual and reciprocal, and many had worked well for centuries.

Best of all, they were not imposed from above. Mrs Ostrom put no faith in governments, nor in large conservation schemes paid for with aid money and crawling with concrete-bearing engineers. “Polycentrism” was her ideal. Caring for the commons had to be a multiple task, organised from the ground up and shaped to cultural norms. It had to be discussed face to face, and based on trust. Mrs Ostrom, besides poring over satellite data and quizzing lobstermen herself, enjoyed employing game theory to try to predict the behaviour of people faced with limited resources. In her Workshop in Political Theory and Policy Analysis at Indiana University—set up with her husband Vincent, a political scientist, in 1973—her students were given shares in a notional commons. When they simply discussed what they should do before they did it, their rate of return from their “investments” more than doubled.

“Small is beautiful” sometimes seemed to be her creed. Her workshop looked somewhat like a large, cluttered cottage, reflecting her and Vincent’s idea that science was a form of artisanship. When the vogue in America was all for consolidation of public services, she ran against it. For some years she compared police forces in the town of Speedway and the city of Indianapolis, finding that forces of 25-50 officers performed better by almost every measure than 100-strong metropolitan teams. But smaller institutions, she cautioned, might not work better in every case. As she travelled the world, giving out good and sharp advice, “No panaceas!” was her cry.

Scarves for the troops

Rather than littleness, collaboration was her watchword. Neighbours thrived if they worked together. The best-laid communal schemes would fall apart once people began to act only as individuals, or formed elites. Born poor herself, to a jobless film-set-maker in Los Angeles who soon left her mother alone, she despaired of people who wanted only a grand house or a fancy car. Her childhood world was coloured by digging a wartime “victory” vegetable garden, knitting scarves for the troops, buying her clothes in a charity store: mutual efforts to a mutual end.

The same approach was valuable in academia, too. Her own field, institutional economics (or “the study of social dilemmas”, as she thought of it), straddled political science, ecology, psychology and anthropology. She liked to learn from all of them, marching boldly across the demarcation lines to hammer out good policy, and she welcomed workshop-partners from any discipline, singing folk songs with them, too, if anyone had a guitar. They were family. Pure economists looked askance at this perky, untidy figure, especially when she became, in 2009, the first woman to share the Nobel prize for economics. She was not put out; it was the workshop’s prize, anyway, she said, and the money would go for scholarships.

Yet the incident shed a keen light on one particular sort of collaboration: that between men and women. Lin (as everyone called her) and Vincent, both much-honoured professors, were joint stars of their university in old age. But she had been dissuaded from studying economics at UCLA because, being a girl, she had been steered away from maths at high school; and she was dissuaded from doing political science because, being a girl, she could not hope for a good university post. As a graduate, she had been offered only secretarial jobs; and her first post at Indiana involved teaching a 7.30am class in government that no one else would take.

There was, she believed, a great common fund of sense and wisdom in the world. But it had been an uphill struggle to show that it reposed in both women and men; and that humanity would do best if it could exploit it to the full.

Preventing environmental catastrophes (FAPERJ)

Vilma Homero

July 5, 2012

Nelson Fernandes / UFRJ
New methods may be able to predict where and when landslides will occur in the mountain region

When several areas of Nova Friburgo, Petrópolis and Teresópolis were hit by landslides in January 2011, burying more than a thousand people under tons of mud and debris, the question left hanging was whether the disaster could have been mitigated. If the Institute of Geosciences of the Federal University of Rio de Janeiro (UFRJ) has its way, the consequences of environmental cataclysms like these will become ever smaller. To that end, researchers are developing a series of multidisciplinary projects to make risk-analysis systems feasible. One of them is Prever, which, supported by computer programs, combines advances in remote sensing, geoprocessing, geomorphology and geotechnics with mathematical weather-forecasting models for the areas most susceptible to landslides, such as the mountain region. “Although conditions vary considerably among the municipalities of that region, what they have in common is a lack of methodologies aimed at predicting this type of risk. The essential task now is to develop methods capable of predicting the spatial and temporal location of these processes, that is, of knowing ‘where’ and ‘when’ these landslides may occur,” explains geologist Nelson Ferreira Fernandes, professor in the Department of Geography at UFRJ and a FAPERJ Scientist of Our State.

To build real-time risk-prediction methods that include mass movements triggered by rainfall, the researchers are producing a mapping based on successive satellite images, which are cross-referenced with geological and geotechnical maps. “Prever combines climate-simulation models and forecasts of extreme rainfall events, developed in meteorology, with mathematical prediction models, plus the information produced by geomorphology and geotechnics, which indicates the areas most susceptible to landslides. This allows us to draw up real-time risk forecasts, classifying the results according to the severity of the risk, which varies continuously in space and time,” explains Nelson.
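As a purely illustrative sketch of the kind of combination Prever performs (not the project’s actual code; the grids, thresholds and names are hypothetical), a landslide-susceptibility map can be crossed with a rainfall forecast to yield a simple risk classification:

```python
import numpy as np

# Hypothetical inputs: a static susceptibility map (0-1, from geomorphology
# and geotechnics) and forecast accumulated rainfall in mm on the same grid.
susceptibility = np.array([[0.2, 0.8],
                           [0.6, 0.9]])
rain_forecast_mm = np.array([[30.0, 120.0],
                             [80.0, 150.0]])

def classify_risk(susc, rain_mm, rain_ref=100.0):
    """Combine susceptibility with rainfall (normalized by a reference
    accumulation) and bin the result into qualitative risk classes."""
    score = susc * np.clip(rain_mm / rain_ref, 0.0, 2.0)
    classes = np.full(score.shape, "low", dtype=object)
    classes[score >= 0.5] = "medium"
    classes[score >= 1.0] = "high"
    return classes

print(classify_risk(susceptibility, rain_forecast_mm))
```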

To that end, the Departments of Geography, Geology and Meteorology of UFRJ’s Institute of Geosciences have joined forces with the Faculty of Geology of the Rio de Janeiro State University (Uerj) and the Department of Civil Engineering of the Pontifical Catholic University (PUC-Rio). By overlaying the information, the resulting images can pinpoint the areas most prone to landslides. “By adding this academic knowledge to data from state agencies, such as the Disaster Analysis Unit (Nade) of the Department of Mineral Resources (DRM-RJ), which provides technical support to the Civil Defense, we will not only be constantly updating the maps used today by state government agencies and the Civil Defense, but also enabling more precise planning for decision-making.”

Divulgação / UFRJ
A simulation image shows the possibility of a mass landslide in the Jacarepaguá region

The new mapping also means better quality, greater precision and more detailed images. “Obviously, with better tools in hand, meaning more detailed and accurate maps, public managers will also be able to plan and act more precisely and in real time,” says Nelson. According to the researcher, these maps need constant updating to keep up with the way human occupation interferes with the topography of the various regions. “This happens through the cutting of slopes, the occupation of landfilled areas, or changes resulting from river drainage. All of this alters the topography and, in the case of heavier and more prolonged rainfall, can make certain soils more prone to landslides or to flooding,” Nelson notes.

But disaster and environmental-risk analysis systems also include other lines of research. Prever works along two distinct lines of action. “One of them is climate, in which we detect the areas where rainfall will increase over the long term and provide information to decision-making and planning bodies. The other is very short-term forecasting, known as nowcasting.” For long-term forecasting, Professor Ana Maria Bueno Nunes, of the university’s Department of Meteorology, has been working on the project “Implementation of a Regional Modeling System: Weather and Climate Studies,” which she coordinates, with a proposal to reconstruct the hydroclimate of South America as an extension of that project.

“By combining satellite-derived precipitation data with information from atmospheric stations, it is possible, through computational modeling, to produce precipitation estimates. That way we can not only know when heavier or more prolonged rainfall is coming, but also look back at past maps to see which combination of factors produced a disaster situation. The reconstruction is a way of studying the past to understand present-day scenarios that look similar. And with that we help improve the forecasting models,” says Ana. This information, which will initially serve academic and scientific use, will provide increasingly detailed data on how major rainstorms form, the kind capable of flooding certain areas. “This will allow us not only to better understand the conditions under which certain calamities occur, but also to predict when those conditions may recur. With the project we are also training even more specialized people in this field,” says the researcher, whose work is supported by a Research Grant (APQ 1).
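A minimal sketch of the general idea of merging satellite precipitation estimates with gauge observations follows (the weighting scheme and all numbers are hypothetical, not the group’s method):

```python
import numpy as np

# Hypothetical gridded satellite precipitation estimate (mm) and a few
# rain-gauge observations at known grid cells.
satellite_mm = np.array([[10.0, 12.0, 9.0],
                         [14.0, 20.0, 16.0]])
gauge_obs = {(0, 1): 15.0, (1, 2): 18.0}   # (row, col) -> observed mm

def merge(satellite, gauges, gauge_weight=0.7):
    """Nudge the satellite field toward gauge values where gauges exist;
    elsewhere keep the satellite estimate unchanged."""
    merged = satellite.copy()
    for (i, j), obs in gauges.items():
        merged[i, j] = gauge_weight * obs + (1 - gauge_weight) * satellite[i, j]
    return merged

print(merge(satellite_mm, gauge_obs))
```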

Also a member of the project, Professor Gutemberg Borges França, of UFRJ, explains that there are three types of weather forecast: synoptic forecasting, which covers from about 6 hours up to seven days over a few thousand kilometers, such as the South American continent; mesoscale forecasting, covering from about 6 hours up to two days over a few hundred kilometers, such as the state of Rio de Janeiro; and short-term forecasting, or nowcasting, which ranges from a few minutes up to 3 to 6 hours over a specific area of a few kilometers, such as the Rio de Janeiro metropolitan region.

If long-term forecasts are important, so are short-term ones, or nowcasting. According to Gutemberg, current numerical forecasting models are still deficient for short-term prediction, which ends up being done largely on the basis of the meteorologist’s experience, through the interpretation of the various available data sources, such as satellite images; surface and upper-air meteorological stations; radar and sodar (Sonic Detection and Ranging); and numerical models. “Even today, however, the meteorologist lacks objective tools to help integrate this diverse information and produce a more accurate short-term forecast,” Gutemberg argues.

Rio de Janeiro already has satellite receiving stations, an upper-air (radiosonde) station that generates atmospheric profiles, surface meteorological stations and radar. The Applied Meteorology Laboratory of UFRJ’s Department of Meteorology has been developing short-term forecasting tools since 2005, using computational intelligence, with the aim of improving forecasts of extreme weather events for Rio de Janeiro. “With computational intelligence, we get this information faster and more accurately,” he sums up.
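The article does not describe the laboratory’s algorithms; purely as an illustration of what integrating several data sources with “computational intelligence” might look like, radar, satellite and surface-station features could be fed to a simple statistical classifier (all feature names and values below are invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training features per grid cell: [radar reflectivity (dBZ),
# satellite cloud-top temperature (K), surface humidity (%)], and whether
# heavy rain occurred in the following hour (1) or not (0).
X = np.array([[45.0, 210.0, 90.0],
              [20.0, 260.0, 60.0],
              [50.0, 205.0, 95.0],
              [15.0, 270.0, 55.0]])
y = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Nowcast for a new observation: probability of heavy rain within the hour.
new_obs = np.array([[48.0, 208.0, 92.0]])
print(model.predict_proba(new_obs)[0, 1])
```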

© FAPERJ – All articles may be reproduced provided the source is cited.

This summer is ‘what global warming looks like’ (AP) + related & reactions

Jul 3, 1:10 PM EDT

By SETH BORENSTEIN
AP Science Writer

AP Photo/Matthew Barakat

WASHINGTON (AP) — Is it just freakish weather or something more? Climate scientists suggest that if you want a glimpse of some of the worst of global warming, take a look at U.S. weather in recent weeks.

Horrendous wildfires. Oppressive heat waves. Devastating droughts. Flooding from giant deluges. And a powerful freak wind storm called a derecho.

These are the kinds of extremes experts have predicted will come with climate change, although it’s far too early to say that is the cause. Nor will they say global warming is the reason 3,215 daily high temperature records were set in the month of June.

Scientifically linking individual weather events to climate change takes intensive study, complicated mathematics, computer models and lots of time. Sometimes it isn’t caused by global warming. Weather is always variable; freak things happen.

And this weather has been local. Europe, Asia and Africa aren’t having similar disasters now, although they’ve had their own extreme events in recent years.

But since at least 1988, climate scientists have warned that climate change would bring, in general, increased heat waves, more droughts, more sudden downpours, more widespread wildfires and worsening storms. In the United States, those extremes are happening here and now.

So far this year, more than 2.1 million acres have burned in wildfires, more than 113 million people in the U.S. were in areas under extreme heat advisories last Friday, two-thirds of the country is experiencing drought, and earlier in June, deluges flooded Minnesota and Florida.

“This is what global warming looks like at the regional or personal level,” said Jonathan Overpeck, professor of geosciences and atmospheric sciences at the University of Arizona. “The extra heat increases the odds of worse heat waves, droughts, storms and wildfire. This is certainly what I and many other climate scientists have been warning about.”

Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in fire-charred Colorado, said these are the very record-breaking conditions he has said would happen, but many people wouldn’t listen. So it’s I-told-you-so time, he said.

As recently as March, a special report on extreme events and disasters by the Nobel Prize-winning Intergovernmental Panel on Climate Change warned of “unprecedented extreme weather and climate events.” Its lead author, Chris Field of the Carnegie Institution and Stanford University, said Monday, “It’s really dramatic how many of the patterns that we’ve talked about as the expression of the extremes are hitting the U.S. right now.”

“What we’re seeing really is a window into what global warming really looks like,” said Princeton University geosciences and international affairs professor Michael Oppenheimer. “It looks like heat. It looks like fires. It looks like this kind of environmental disasters.”

Oppenheimer said that on Thursday. That was before the East Coast was hit with triple-digit temperatures and before a derecho – a large, powerful and long-lasting straight-line wind storm – blew from Chicago to Washington. The storm and its aftermath killed more than 20 people and left millions without electricity. Experts say it had energy readings five times that of normal thunderstorms.

Fueled by the record high heat, this was among the strongest of this type of storm in the region in recent history, said research meteorologist Harold Brooks of the National Severe Storms Laboratory in Norman, Okla. Scientists expect “non-tornadic wind events” like this one and other thunderstorms to increase with climate change because of the heat and instability, he said.

Such patterns haven’t happened only in the past week or two. The spring and winter in the U.S. were the warmest on record and among the least snowy, setting the stage for the weather extremes to come, scientists say.

Since Jan. 1, the United States has set more than 40,000 hot temperature records, but fewer than 6,000 cold temperature records, according to the National Oceanic and Atmospheric Administration. Through most of last century, the U.S. used to set cold and hot records evenly, but in the first decade of this century America set two hot records for every cold one, said Jerry Meehl, a climate extreme expert at the National Center for Atmospheric Research. This year the ratio is about 7 hot to 1 cold. Some computer models say that ratio will hit 20-to-1 by midcentury, Meehl said.
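The ratio quoted above can be checked directly from the reported figures (a quick sketch):

```python
# Ratio of hot to cold temperature records reported by NOAA for 2012 so far.
hot_records = 40_000    # "more than 40,000", as reported
cold_records = 6_000    # "fewer than 6,000", as reported

ratio = hot_records / cold_records
print(f"hot-to-cold ratio: about {ratio:.1f} to 1")   # ~6.7, i.e. "about 7 to 1"
```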

“In the future you would expect larger, longer more intense heat waves and we’ve seen that in the last few summers,” NOAA Climate Monitoring chief Derek Arndt said.

The 100-degree heat, drought, early snowpack melt and beetles waking from hibernation early to strip trees all combined to set the stage for the current unusual spread of wildfires in the West, said University of Montana ecosystems professor Steven Running, an expert on wildfires.

While at least 15 climate scientists told The Associated Press that this long hot U.S. summer is consistent with what is to be expected in global warming, history is full of such extremes, said John Christy at the University of Alabama in Huntsville. He’s a global warming skeptic who says, “The guilty party in my view is Mother Nature.”

But the vast majority of mainstream climate scientists, such as Meehl, disagree: “This is what global warming is like, and we’ll see more of this as we go into the future.”

Intergovernmental Panel on Climate Change report on extreme weather: http://ipcc-wg2.gov/SREX/

U.S. weather records:

http://www.ncdc.noaa.gov/extremes/records/

Seth Borenstein can be followed at http://twitter.com/borenbears

© 2012 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

*   *   *

July 3, 2012

To Predict Environmental Doom, Ignore the Past

http://www.realclearscience.com

By Todd Myers

The information presented here cannot be used directly to calculate Earth’s long-term carrying capacity for human beings because, among other things, carrying capacity depends on both the affluence of the population being supported and the technologies supporting it. – Paul Ehrlich, 1986

One would expect scientists to pause when they realize their argument about resource collapse makes the king of environmental catastrophe, Paul Ehrlich, look moderate by comparison. Ehrlich is best known for a 40-year series of wildly inaccurate predictions of looming environmental disaster. Yet he looks positively reasonable compared to a paper recently published in the scientific journal Nature titled “Approaching a state shift in Earth’s biosphere.”

The paper predicts we are rapidly approaching a moment of “planetary-scale critical transition,” due to overuse of resources, climate change and other human-caused environmental damage. As a result, the authors conclude, this will “require reducing world population growth and per-capita resource use; rapidly increasing the proportion of the world’s energy budget that is supplied by sources other than fossil fuels,” and a range of other drastic policies. If these sound much like the ideas proposed in the 1970s by Ehrlich and others, like The Club of Rome, it is not a coincidence. The Nature paper is built on Ehrlich’s assumptions and cites his work more than once.

The Nature article, however, suffers from numerous simple statistical errors and assumptions rather than evidence. Its authors do nothing to deal with the fundamental mistakes that led Ehrlich and others like him down the wrong path so many times. Instead, the paper simply argues that with improved data, this time their predictions of doom are correct.

Ultimately, the piece is a good example of the great philosopher of science Thomas Kuhn’s hypothesis, written 50 years ago, that scientists often attempt to fit the data to conform to their particular scientific paradigm, even when that paradigm is obviously flawed. When confronted with failure to explain real-world phenomena, the authors of the Nature piece have, as Kuhn described in The Structure of Scientific Revolutions, devised “numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” Like scientists blindly devoted to a failed paradigm, the Nature piece simply tries to force new data to fit a flawed concept.

“Assuming this does not change”

During the last half-century, the world has witnessed a dramatic increase in food production. According to the U.N.’s Food and Agriculture Organization, yields per acre of rice have more than doubled, corn yields are more than one-and-a-half times larger than 50 years ago, and wheat yields have almost tripled. As a result, even as human population has increased, worldwide hunger has declined.

Despite these well-known statistics, the authors of the Nature study assume not only no future technological improvements, but that none have occurred over the last 200 years. The authors simply choose one data point and then project it both into the past and into the future. The authors explain the assumption that underlies their thesis in the caption to a graphic showing the Earth approaching environmental saturation. They write:

“The percentages of such transformed lands… when divided by 7,000,000,000 (the present global human population) yield a value of approximately 2.27 acres (0.92 ha) of transformed land for each person. That value was used to estimate the amount of transformed land that probably existed in the years 1800, 1900 and 1950, and which would exist in 2025 and 2045 assuming conservative population growth and that resource use does not become any more efficient.” (emphasis added)

In other words, the basis for their argument ignores the easily accessible data from the last half century. They take a snapshot in time and mistake it for a historical trend. In contrast to their claim of no change in the efficient use of resources, it would be difficult to find a time period in the last millennium when resource use did not become more efficient.
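To make the flat-line assumption concrete, here is a minimal sketch of the projection the quoted caption describes (the population figures are rough illustrative values, not taken from the paper):

```python
# The paper's assumption, as quoted: transformed land scales as a constant
# 2.27 acres per person, projected backward and forward with no change in
# efficiency. Population figures below are rough illustrative values.
ACRES_PER_PERSON = 2.27
population_by_year = {1800: 1.0e9, 1900: 1.6e9, 1950: 2.5e9,
                      2012: 7.0e9, 2025: 8.0e9, 2045: 9.0e9}

for year, pop in population_by_year.items():
    transformed_acres = ACRES_PER_PERSON * pop
    print(f"{year}: {transformed_acres / 1e9:.1f} billion acres of transformed land")
```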

Ironically, this is the very error Ehrlich warns against in his 1986 paper – a paper the authors themselves cite several times. Despite Ehrlich’s admonition that projections of future carrying capacity are dependent upon technological change, the authors of the Nature article ignore history to come to their desired conclusion.

A Paradigm of Catastrophe

What would lead scientists to make such simplistic assumptions and flat-line projections? Indeed, what would lead Nature editors to print an article whose statistical underpinnings are so flawed? The simple belief in the paradigm of inevitable environmental catastrophe: humans are doing irreparable damage to the Earth and every bit of resource use moves us closer to that catastrophe. The catastrophe paradigm argues a simple model that eventually we will run out of space and resources, and determining the date of ultimate doom is a simple matter of doing the math.

Believing in this paradigm also justifies exaggeration in order to stave off the serious consequences of collapse. Thus, they describe the United Nations’ likely population estimate for 2050 as “the most conservative,” without explaining why. They claim “rapid climate change shows no signs of slowing” without providing a source citation for the claim, and despite an actual slowing of climate change over the last decade.

The need to avoid perceived global catastrophe also encourages the authors to blow past warning signs that their analysis is not built on solid foundations – as if the poor history of such projections were not already warning enough. Even as they admit the interactions “between overlapping complex systems, however, are proving difficult to characterize mathematically,” they base their conclusions on the simplest linear mathematical estimate that assumes nothing will change except population over the next 40 years. They then draw a straight line, literally, from today to the environmental tipping point.

Why is such an unscientific approach allowed to pass for science in a respected international journal? Because whatever the argument does not supply, the paradigm conveniently fills in. Even if the math isn’t reliable and there are obvious counterarguments, “everyone” understands and believes in the underlying truth – we are nearing the limits of the planet’s ability to support life. In this way the conclusion is not proven but assumed, making the supporting argument an impenetrable tautology.

Such a circumstance creates the conditions of scientific revolutions, where the old paradigm fails to explain real-world phenomena and is replaced by an alternative. Given the record of failure of the paradigm of resource catastrophe, dating back to the 1970s, one would hope we are moving toward such a change. Unfortunately, Nature and the authors of the piece are clinging to the old resource-depletion model, simply trying to re-work the numbers.

Let us hope policymakers recognize the failure of that paradigm before they make costly and dangerous policy mistakes that impoverish billions in the name of false scientific assumptions.

Todd Myers is the Environmental Director of the Washington Policy Center and author of the book Eco-Fads.

*   *   *

Washington Policy Center exposed: Todd Myers

The Washington Policy Center labels itself as a non-partisan think tank. It’s a mischaracterization to say the least, but that is their bread and butter. Based in Seattle, with a director in Spokane, the WPC’s mission is to “promote free-market solutions through research and education.” It makes sense they have an environmental director in the form of Todd Myers, who has a new book called “Eco-Fads: How The Rise Of Trendy Environmentalism Is Harming The Environment.” You know, since polar bears love to swim.


From the WPC’s newsletter:

Wherever we turn, politicians, businesses and activists are promoting the latest fashionable “green” policy or product. Green buildings, biofuels, electric cars, compact fluorescent lightbulbs and a variety of other technologies are touted as the next key step in protecting the environment and promoting a sustainable future. Increasingly, however, scientific and economic information regarding environmental problems takes a back seat to the social and personal value of being seen and perceived as “green.”

As environmental consciousness has become socially popular, eco-fads supplant objective data. Politicians pick the latest environmental agenda in the same way we choose the fall fashions – looking for what will yield the largest benefit with our public and social circles.

Eco-Fads exposes the pressures that cause politicians, businesses, the media and even scientists to fall for trendy environmental fads. It examines why we fall for such fads, even when we should know better. The desire to “be green” can cloud our judgment, causing us to place things that make us appear green ahead of actions that may be socially invisible yet environmentally responsible.

By recognizing the range of forces that have taken us in the wrong direction, Eco-Fads shows how we can begin to get back on track, creating a prosperous and sustainable legacy for our planet’s future. Order Eco-Fads today for $26.95 (tax and shipping included).

This is what the newsletter doesn’t tell you about Todd Myers.

Myers has spoken at the Heartland Institute’s International Conference on Climate Change. In case you didn’t know, the Heartland Institute has received significant funding from ExxonMobil, Philip Morris and numerous other corporations and conservative foundations with a vested interest in the so-called debate around climate change. That conference was co-sponsored by numerous prominent climate change denier groups, think tanks and lobby groups, almost all of which have received money from the oil industry.

Why not just call it the Washington Fallacy Center? For a little more background, including ties back to the Koch Brothers, go HERE. In fact, Jack Kemp calls it “The Heritage Foundation of the Northwest.”

*   *   *

 

Did climate change ’cause’ the Colorado wildfires?

By David Roberts

29 Jun 2012 1:50 PM

http://grist.org

Photo by USAF.

The wildfires raging through Colorado and the West are unbelievable. As of yesterday there were 242 fires burning, according to the National Interagency Fire Center. Almost 350 homes have been destroyed in Colorado Springs, where 36,000 people have been evacuated from their homes. President Obama is visiting today to assess the devastation for himself.

Obviously the priority is containing the fires and protecting people. But inevitably the question is going to come up: Did climate change “cause” the fires? Regular readers know that this question drives me a little nuts. Pardon the long post, but I want to try to tackle this causation question once and for all.

What caused the Colorado Springs fire? Well, it was probably a careless toss of a cigarette butt, or someone burning leaves in their backyard, or a campfire that wasn’t properly doused. [UPDATE: Turns out it was lightning.] That spark, wherever it came from, is what triggered the cascading series of events we call “a fire.” It was what philosophers call the proximate cause, the most immediate, the closest.

All the other factors being discussed — the intense drought covering the state, the dead trees left behind by bark beetles, the high winds — are distal causes. Distal causes are less tightly connected to their effects. The dead trees didn’t make any particular fire inevitable; there can be no fire without a spark. What they did is make it more likely that a fire would occur. Distal causes are like that: probabilistic. Nonetheless, our intuitions tell us that distal causes are in many ways more satisfactory explanations. They tell us something about the meaning of events, not just the mechanisms, which is why they’re also called “ultimate” causes. It’s meaning we usually want.

When we say, “the fires in Colorado were caused by unusually dry conditions, high winds, and diseased trees,” no one accuses us of error or imprecision because it was “really” the matches or campfires that caused them. We are not expected to say, “no individual fire can be definitively attributed to hot, windy conditions, but these are the kinds of fires we would expect to see in those conditions.” Why waste the words? We are understood to be talking about distal causes.

When we talk about, not fires themselves, but the economic and social impacts of fires, the range of distal causes grows even broader. For a given level of damages, it’s not enough to have dry conditions and dead trees, not even enough to have fire — you also have to take into account the density of development, the responsiveness of emergency services, and the preparedness of communities for prevention or evacuation.

So if we say, “the limited human toll of the Colorado fires is the result of the bravery and skill of Western firefighters,” no one accuses us of error or imprecision because good firefighting was only one of many contributors to the final level of damages. Everything from evacuation plans to the quality of the roads to the vagaries of the weather contributed in some way to that state of affairs. But we are understood to be identifying a distal cause, not giving a comprehensive account of causation.

What I’m trying to say is, we are perfectly comfortable discussing distal causes in ordinary language. We don’t require scientistic literalism in our everyday talk.

The reason I’m going through all this, you won’t be surprised, is to tie it back to climate change. We know, of course, that climate change was not the proximate cause of the fires. It was a distal cause; it made the fires more likely. That much we know with a high degree of confidence, as this excellent review of the latest science by Climate Communication makes clear.

One can distinguish between distal causes by their proximity to effects. Say the drought made the fires 50 percent more likely than average June conditions in Colorado. (I’m just pulling these numbers out of my ass to illustrate a point.) Climate change maybe only made the fires 1 percent more likely. As a cause, it is more distal than the drought. And there are probably causes even more distal than climate change. Maybe the exact tilt of the earth’s axis this June made the fires 0.0001 percent more likely. Maybe the location of a particular proton during the Big Bang made them 0.000000000000000001 percent more likely. You get the point.

With this in mind, it’s clear that the question as it’s frequently asked — “did climate change cause the fires?” — is not going to get us the answer we want. If it’s yes or no, the answer is “yes.” But that doesn’t tell us much. What people really want to know when they ask that question is, “how proximate a cause is climate change?”

When we ask the question like that, we start to see why climate is such a wicked problem. Human beings, by virtue of their evolution, physiology, and socialization, are designed to heed causes within a particular range between proximate and distal. If I find my kid next to an overturned glass and a puddle of milk and ask him why the milk is spilled, I don’t care about the neurons firing and the muscles contracting. That’s too proximate. I don’t care about humans evolving with poor peripheral vision. That’s too distal. I care about my kid reaching for it and knocking it over. That’s not the only level of causal explanation that is correct, but it’s the level of causal explanation that is most meaningful to me.

For a given effect — a fire, a flood, a dead forest — climate change is almost always too distal a cause to make a visceral impression on us. We’re just not built to pay heed to those 1 percent margins. It’s too abstract. The problem is, wildfires being 1 percent more likely averaged over the whole globe actually means a lot more fires, a lot more damage, loss, and human suffering. Part of managing the Anthropocene is finding ways of making distal causes visceral, giving them a bigger role in our thinking and institutions.

That’s what the “did climate change cause XYZ?” questions are always really about: how proximate a cause climate change is, how immediate its effects are in our lives, how close it is.

There is, of course, a constant temptation among climate hawks to exaggerate how proximate it is, since, all things being equal, proximity = salience. But I don’t think that simply saying “climate change caused the fires” is necessarily false or exaggerated, any more than saying “drought caused the fires” is. The fact that the former strikes many people as suspect while the latter is immediately understood mostly just means that we’re not used to thinking of climate change as a distal cause among others.

That’s why we reach for awkward language like, “fires like this are consonant with what we would expect from climate change.” Not because that’s the way we discuss all distal causes — it’s clearly not — but simply because we’re unaccustomed to counting climate change among those causes. It’s an unfamiliar habit. As it grows more familiar, I suspect we’ll quit having so many of these tedious semantic disputes.

And I’m afraid that, in coming years, it will become all-too familiar.

*   *   *

 

Perspective On The Hot and Dry Continental USA For 2012 Based On The Research Of Judy Curry and Of McCabe Et Al 2004

http://pielkeclimatesci.wordpress.com

Photo from June 26, 2012, showing the start of the Flagstaff fire near Boulder, Colorado

I was alerted to an excellent presentation by Judy Curry [h/t to Don Bishop] which provides an informative explanation of the current hot and dry weather in the USA. The presentation is titled

Climate Dimensions of the Water Cycle by Judy Curry

First, there is an insightful statement by Judy where she writes in slide 5

CMIP century scale simulations are designed for assessing sensitivity to greenhouse gases using emissions scenarios. They are not fit for the purpose of inferring decadal scale or regional climate variability, or assessing variations associated with natural forcing and internal variability. Downscaling does not help.

We need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc). Permit creatively constructed scenarios as long as they can’t be falsified as incompatible with background knowledge.

With respect to the current hot and dry weather, the paper referenced by Judy in her Powerpoint talk

Gregory J. McCabe, Michael A. Palecki, and Julio L. Betancourt, 2004: Pacific and Atlantic Ocean influences on multidecadal drought frequency in the United States. PNAS 2004 101 (12) 4136-4141; published ahead of print March 11, 2004, doi:10.1073/pnas.0306738101

has the abstract [highlight added]

More than half (52%) of the spatial and temporal variance in multidecadal drought frequency over the conterminous United States is attributable to the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). An additional 22% of the variance in drought frequency is related to a complex spatial pattern of positive and negative trends in drought occurrence possibly related to increasing Northern Hemisphere temperatures or some other unidirectional climate trend. Recent droughts with broad impacts over the conterminous U.S. (1996, 1999–2002) were associated with North Atlantic warming (positive AMO) and northeastern and tropical Pacific cooling (negative PDO). Much of the long-term predictability of drought frequency may reside in the multidecadal behavior of the North Atlantic Ocean. Should the current positive AMO (warm North Atlantic) conditions persist into the upcoming decade, we suggest two possible drought scenarios that resemble the continental-scale patterns of the 1930s (positive PDO) and 1950s (negative PDO) drought.

They also present the figure below with the title “Impact of AMO, PDO on 20-yr drought frequency (1900-1999)”.   The figures correspond to A: Warm PDO, cool AMO; B: Cool PDO, cool AMO; C: Warm PDO, warm AMO and D:  Cool PDO, warm AMO

The current Drought Monitor analysis shows a remarkable agreement with D, as shown below

As Judy shows in her talk (slide 8) since 1995 we have been in a warm phase of the AMO and have entered a cool phase of the PDO. This corresponds to D in the above figure.  Thus the current drought and heat is not an unprecedented event but part of the variations in atmospheric-ocean circulation features that we have seen in the past.  This reinforces what Judy wrote that

[w]e need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses)

in our assessment of risks to key resources due to climate. Insightful discussions of the importance of these circulation features are also presented, as just a few excellent examples, by Joe D’Aleo and Joe Bastardi on ICECAP, by Bob Tisdale at Bob Tisdale – Climate Observations, and in posts on Anthony Watts’s weblog Watts Up With That.

 

*   *   *

Hotter summers could be a part of Washington’s future

http://www.washingtonpost.com

Published: July 5

As relentless heat continues to pulverize Washington, the conversation has evolved from when will it end to what if it never does?

Are unbroken weeks of sweltering weather becoming the norm rather than the exception?

The answer to the first question is simple: Yes, it will end. Probably by Monday.

The answer to the second, however, is a little more complicated.

Call it a qualified yes.

“Trying to wrap an analysis around it in real time is like trying to diagnose a car wreck as the cars are still spinning,” said Deke Arndt, chief of climate monitoring at the National Climatic Data Center in Asheville, N.C. “But we had record heat for the summer season on the Eastern Seaboard in 2010. We had not just record heat, but all-time record heat, in the summer season in 2011. And then you throw that on top of this [mild] winter and spring and the year to date so far, it’s very consistent with what we’d expect in a warming world.”

Nothing dreadfully dramatic is taking place — the seasons are not about to give way to an endless summer.

Heat-trapping greenhouse gases pumped into the atmosphere may be contributing to unusually hot and long heat waves — the kind of events climate scientists have long warned will become more common. Many anticipate a steady trend of ever-hotter average temperatures as human activity generates more and more carbon pollution.

To some, the numbers recorded this month and in recent years fit together to suggest a balmy future.

“We had a warm winter, a cold spring and now a real hot summer,” said Jessica Miller, 21, a visitor from Ohio, as she sat on a bench beneath the trees in Lafayette Square. “I think the overall weather patterns are changing.”

Another visitor, who sat nearby just across from the White House, shared a similar view.

“I think it’s a natural changing of the Earth’s average temperatures,” said Joe Kaufman, a Pennsylvanian who had just walked over from Georgetown.

Arndt said he expects data for the first half of this year will show that it was the warmest six months on record. Experts predict that average temperatures will rise by 3 to 5 degrees by mid-century and by 6 to 10 degrees by the end of the century.

If that worst prediction comes true, 98 degrees will become the new normal at this time of year in Washington 88 years from now.

Will every passing year till then break records?

“Not so much record-breaking every year,” Arndt said. “But we’ll break records on the warm end more often than on the cold end, that’s for sure. As we continue to warm, we will be flirting with warm records much more than with cold records, and that’s what’s played out over much of the last few years.”

If the present is our future, it may be sizzling. The current heat wave has had eight consecutive days of 95-degree weather. The temperature may reach 106 on Saturday, and the first break will come Monday, when a few days of more seasonable highs in the upper 80s are expected.

The hot streak began June 28 and peaked the next day with a 104-degree record-breaker, the hottest temperature ever recorded here in June. That broke a record of 102 set in 1874 and matched in June 2011.