Category archive: forecasting

>The delicate wine grape has become our best early-warning system for the effects of global warming (Slate)

Climate Desk – In Vino Veritas

The delicate wine grape has become our best early-warning system for the effects of global warming.

By Mark Hertsgaard
Posted Monday, April 26, 2010, at 11:01 AM ET

John Williams has been making wine in California’s Napa Valley for nearly 30 years, and he farms so ecologically that his peers call him Mr. Green. But if you ask him how climate change will affect Napa’s world-famous wines, he gets irritated, almost insulted. “You know, I’ve been getting that question a lot recently, and I feel we need to keep this issue in perspective,” he told me. “When I hear about global warming in the news, I hear that it’s going to melt the Arctic, inundate coastal cities, displace millions and millions of people, spread tropical diseases and bring lots of other horrible effects. Then I get calls from wine writers and all they want to know is, ‘How is the character of cabernet sauvignon going to change under global warming?’ I worry about global warming, but I worry about it at the humanity scale, not the vineyard scale.”

Williams is the founder of Frog’s Leap, one of the most ecologically minded wineries in Napa and, for that matter, the world. Electricity for the operation comes from 1,000 solar panels erected along the merlot vines; the heating and cooling are supplied by a geothermal system that taps into the Earth’s heat. The vineyards are 100 percent organic and—most radical of all, considering Napa’s dry summers—there is no irrigation.

Yet despite his environmental fervor, Williams dismisses questions about preparing Frog’s Leap for the impacts of climate change. “We have no idea what effects global warming will have on the conditions that affect Napa Valley wines, so to prepare for those changes seems to me to be whistling past the cemetery,” he says, a note of irritation in his voice. “All I know is, there are things I can do to stop, or at least slow down, global warming, and those are things I should do.”

Williams has a point about keeping things in perspective. At a time when climate change is already making it harder for people in Bangladesh to find enough drinking water, it seems callous to fret about what might happen to premium wines. But there is much more to the question of wine and climate change than the character of pinot noir. Because wine grapes are extraordinarily sensitive to temperature, the industry amounts to an early-warning system for problems that all food crops—and all industries—will confront as global warming intensifies. In vino veritas, the Romans said: In wine there is truth. The truth now is that the Earth’s climate is changing much faster than the wine business, and virtually every other business on Earth, is preparing for.

All crops need favorable climates, but few are as vulnerable to temperature and other extremes as wine grapes. “There is a fifteenfold difference in the price of cabernet sauvignon grapes that are grown in Napa Valley and cabernet sauvignon grapes grown in Fresno,” in California’s hot Central Valley, says Kim Cahill, a consultant to the Napa Valley Vintners’ Association. “Cab grapes grown in Napa sold [in 2006] for $4,100 a ton. In Fresno the price was $260 a ton. The difference in average temperature between Napa and Fresno was 5 degrees Fahrenheit.”

Numbers like that help explain why climate change is poised to clobber the global wine industry, a multibillion-dollar business whose decline would also damage the much larger industries of food, restaurants, and tourism. Every business on Earth will feel the effects of global warming, but only the ski industry—which appears doomed in its current form—is more visibly targeted by the hot, erratic weather that lies in store over the next 50 years. In France, the rise in temperatures may render the Champagne region too hot to produce fine champagne. The same is true for the legendary reds of Châteauneuf-du-Pape, where the stony white soil’s ability to retain heat, once considered a virtue, may now become a curse. The world’s other major wine-producing regions—California, Italy, Spain, Australia—are also at risk.

If current trends continue, the “premium wine grape production area [in the United States] … could decline by up to 81 percent by the late 21st century,” a team of scientists wrote in a study published in the Proceedings of the National Academy of Sciences in 2006. The culprit was not so much the rise in average temperatures but an increased frequency of extremely hot days, defined as above 35 degrees Celsius (95 degrees Fahrenheit). If no adaptation measures were taken, these increased heat spikes would “eliminate wine grape production in many areas of the United States,” the scientists wrote.

In theory, winemakers can defuse the threat by simply shifting production to more congenial locations. Indeed, Champagne grapes have already been planted in England and some respectable vintages harvested. But there are limits to this strategy. After all, temperature is not the sole determinant of a wine’s taste. What the French call terroir—a term that refers to the soil of a given region but also includes the cultural knowledge of the people who grow and process grapes—is crucial. “Wine is tied to place more than any other form of agriculture, in the sense that the names of the place are on the bottle,” says David Graves, the co-founder of the Saintsbury wine company in the Napa Valley. “If traditional sugar-beet growing regions in eastern Colorado had to move north, nobody would care. But if wine grapes can’t grow in the Napa Valley anymore—which is an extreme statement, but let’s say so for the sake of argument—suddenly you have a global warming poster child right up there with the polar bears.”

A handful of climate-savvy winemakers such as Graves are trying to rouse their colleagues to action before it is too late, but to little avail. Indeed, some winemakers are actually rejoicing in the higher temperatures of recent years. “Some of the most expensive wines in Spain come from the Rioja Alta and Rioja Alavesa regions,” Pancho Campo, the founder and president of the Wine Academy of Spain, says. “They are getting almost perfect ripeness every year now for Tempranillo. This makes the winemakers say, ‘Who cares about climate change? We are getting perfect vintages.’ The same thing has happened in Bordeaux. It is very difficult to tell someone, ‘This is only going to be the case for another few years.’”

The irony is, the wine business is better situated than most to adapt to global warming. Many of the people in the industry followed in their parents’ footsteps and hope to pass the business on to their kids and grandkids someday. This should lead them to think further ahead than the average corporation, with its obsessive focus on this quarter’s financial results. But I found little evidence this is happening.

The exception: Alois Lageder’s family has made wine in Alto Adige, the northernmost province in Italy, since 1855. The setting, at the foot of the Alps, is majestic. Looming over the vines are massive outcroppings of black and gray granite interspersed with flower-strewn meadows and wooded hills that inevitably call to mind The Sound of Music. Locals admire Lageder for having led Alto Adige’s evolution from producing jug wine to boasting some of the best whites in Italy. In October 2005, Lageder hosted the world’s first conference on the future of wine under climate change. “We must recognize that climate change is not a problem of the future,” Lageder told his colleagues. “It is here today and we must adapt now.”

As it happens, Alto Adige is the location of one of the most dramatic expressions of modern global warming: the discovery of the so-called Iceman—the frozen remains of a herder who lived in the region 5,300 years ago. The corpse was found in 1991 in a mountain gully, almost perfectly preserved—even the skin was intact—because it had lain beneath mounds of snow and ice since shortly after his death (a murder, forensic investigators later concluded from studying the trajectory of an arrowhead lodged in his left shoulder). He would not have been found were it not for global warming, says Hans Glauber, the director of the Alto Adige Ecological Institute: “Temperatures have been rising in the Alps about twice as fast as in the rest of the world,” he notes.

Lageder heard about global warming in the early 1990s and felt compelled to take action. It wasn’t easy—”I had incredible fights with my architect about wanting good insulation,” he says—but by 1996 he had installed the first completely privately financed solar-energy system in Italy. He added a geothermal energy system as well. Care was taken to integrate these cutting-edge technologies into the existing site; during a tour, I emerged from a dark fermentation cellar with its own wind turbine into the bright sunlight of a gorgeous courtyard dating to the 15th century. Going green did make the renovation cost 30 percent more, Lageder says, “but that just means there is a slightly longer amortization period. In fact, we made up the cost difference through increased revenue, because when people heard about what we were doing, they came to see it and they ended up buying our wines.”

The record summer heat that struck Italy and the rest of Europe in 2003, killing tens of thousands, made Lageder even more alarmed. “When I was a kid, the harvest was always after Nov. 1, which was a cardinal date,” he told me. “Nowadays, we start between the 5th and 10th of September and finish in October.” Excess heat raises the sugar level of grapes to potentially ruinous levels. Too much sugar can result in wine that is unbalanced and too alcoholic—wine known as “cooked” or “jammy.” Higher temperatures may also increase the risk of pests and parasites, because fewer will die off during the winter. White wines, made from grapes whose skins are less tolerant of heat, face particular difficulties as global warming intensifies. “In 2003, we ended up with wines that had between 14 and 16 percent alcohol,” Lageder recalled, “whereas normally they are between 12 and 14 percent. The character of our wine was changing.”

A 2 percent increase in alcohol may sound like a tiny difference, but the effect on a wine’s character and potency is considerable. “In California, your style of wine is bigger, with alcohol levels of 14 and 15, even 16 percent,” Lageder continued. “I like some of those wines a lot. But the alcohol level is so high that you have one glass and then”—he slashed his hand across his throat—”you’re done; any more and you will be drunk. In Europe, we prefer to drink wine throughout the evening, so we favor wines with less alcohol. Very hot weather makes that harder to achieve.”

There are tricks grape growers and winemakers can use to lower alcohol levels. The leaves surrounding the grapes can be allowed to grow bushier, providing more shade. Vines can be replaced with different clones or rootstocks. Growing grapes at higher altitudes, where the air is cooler, is another option. So is changing the type of grapes being grown.

But laws and cultural traditions currently stand in the way of such adaptations. So-called AOC laws (Appellation d’Origine Contrôlée) govern wine-grape production throughout France, and in parts of Italy and Spain, as well. As temperatures rise further, these AOC laws and kindred regulations are certain to face increased challenge. “I was just in Burgundy,” Pancho Campo told me in March 2008, “and producers there are very concerned, because they know that chardonnay and pinot noir are cool-weather wines, and climate change is bringing totally the contrary. Some of the producers were even considering starting to study Syrah and other varieties. At the moment, they are not allowed to plant other grapes, but these are questions people are asking.”

The greatest resistance, however, may come from the industry itself. “Some of my colleagues may admire my views on this subject, but few have done much,” says Lageder. “People are trying to push the problem away, saying, ‘Let’s do our job today and wait and see in the future if climate change becomes a real problem.’ But by then it will be too late to save ourselves.”

If the wine industry does not adapt to climate change, life will go on—with less conviviality and pleasure, perhaps, but it will go on. Fine wine will still be produced, most likely by early adapters such as Lageder, but there will be less of it. By the law of supply and demand, that suggests the best wines of tomorrow will cost even more than the ridiculous amounts they fetch today. White wine may well disappear from some regions. Climate-sensitive reds such as pinot noir are also in trouble. It’s not too late for winemakers to save themselves through adaptation. But it’s disconcerting to see so much dawdling in an industry with so much incentive to act. If winemakers aren’t motivated to adapt to climate change, what businesses will be?

The answer seems to be very few. Even in Britain, where the government is vigorously championing adaptation, the private sector lags in understanding the adaptation imperative, much less implementing it. “I bet if I rang up 100 small businesses in the U.K. and mentioned adaptation, 90 of them wouldn’t know what I was talking about,” says Gareth Williams, who works with the organization Business in the Community, helping firms in northeast England prepare for the storms and other extreme weather events that scientists project for the region. “When I started this job, I gave a presentation to heads of businesses,” said Williams, who spent most of his career in the private sector. “I presented the case for adaptation, and in the question-and-answer period, one executive said, ‘We’re doing quite a lot on adaptation already.’ I said, ‘Oh, what’s that?’ He said, ‘We’re recycling, and we’re looking at improving our energy efficiency.’ I thought to myself, ‘Oh, my, he really didn’t get it at all. This is going to be a struggle.’”

“Most of us are not very good at recognizing our risks until we are hit by them,” explains Chris West, the director of the U.K. government’s Climate Impacts Programme (UKCIP). “People who run companies are no different.” Before joining UKCIP in 1999, West had spent most of his career working to protect endangered species. Now, the species he is trying to save is his own, and the insights of a zoologist turn out to be quite useful. Adapting to changing circumstances is, after all, the essence of evolution—and of success in the modern economic marketplace. West is fond of quoting Darwin: “It is not the strongest of the species that survives … nor the most intelligent that survives. It is the one that is the most adaptable to change.”

This story comes from the Climate Desk collaboration.

Article URL: http://www.slate.com/id/2251870/

>Geoengineering

Specials
Paths for the climate

March 31, 2010

By Fábio de Castro

Agência FAPESP – Brazilian and British scientists met by videoconference on Tuesday (March 30) to discuss possibilities for cooperation between institutions of the two countries on joint studies and research programmes in geoengineering, a field that comprises various methods of large-scale intervention in the planet’s climate system aimed at moderating global warming.

The “Café Scientifique: Brazilian-British Meeting on Geoengineering”, organised by the British Council, the Royal Society and FAPESP, was held at the British Council offices in São Paulo and in London, England.

The starting point for the discussion was the report Geoengineering the Climate: Science, Governance and Uncertainty, presented by Professor John Shepherd of the Royal Society. Luiz Gylvan Meira Filho, a researcher at the Institute of Advanced Studies of the University of São Paulo (USP), then gave a brief overview of geoengineering in Brazil.

FAPESP was represented by Carlos Afonso Nobre, executive coordinator of the FAPESP Research Program on Global Climate Change and a researcher at the Center for Weather Forecasting and Climate Studies (CPTEC) of the National Institute for Space Research (Inpe).

According to Nobre, the meeting served as a first contact between the two countries’ scientists. “The meeting was exploratory in character, since the very concept of geoengineering has not yet been precisely defined. The main objective was to gauge both sides’ interest in starting joint research in this area and to set out the potential contributions each could make,” Nobre told Agência FAPESP.

According to Nobre, geoengineering is a set of possible interventions divided into two quite distinct approaches: solar radiation management and carbon dioxide removal. During the meeting, the Brazilians made it clear that they are interested only in the second.

Solar radiation management, according to the British report, includes techniques for reflecting sunlight in order to reduce global warming, such as placing mirrors in space, using stratospheric aerosols (injecting sulphates, for example), enhancing cloud albedo, and increasing the albedo of the Earth’s surface by fitting buildings with white roofs.

Carbon dioxide removal, on the other hand, includes methods for capturing carbon from the atmosphere (“artificial trees”), producing carbon by pyrolysis of biomass, carbon sequestration through bioenergy, ocean fertilisation, and carbon storage in soils or the oceans.

The main difference between the two approaches is speed: solar radiation management methods work within a year or two, whereas carbon dioxide removal methods take several decades to have an effect.

No plan B

The report assessed all the techniques in terms of effectiveness, timescale, safety and cost. The social, political and ethical impacts would still need to be studied, according to the British scientists.

Nobre notes that Brazil would be interested in contributing studies on the carbon dioxide removal side, which would be consistent with the advanced state of research already carried out in the country in areas such as bioenergy and carbon capture methods.

“I am very sceptical about solar radiation management. These techniques can be deployed quickly, but when the devices are switched off (which will inevitably happen, since it is not sustainable to maintain them for several millennia), the climate will rapidly return to its previous state. It would still be necessary to rapidly reduce the cause of climate change, which is greenhouse gas emissions,” Nobre said.

According to him, solar radiation management techniques are generally seen as a “plan B”, to be used if a climate disaster with major consequences becomes imminent. In other words, they would be deployed as an emergency measure once climate systems were approaching saturation points that would trigger irreversible change: the so-called tipping points.

“But the problem is that several tipping points have already been reached, and there is no longer a plan B. Arctic ice melt, for example, has reached its saturation point according to 80% of glaciologists. In a few decades there will be no more ice there in summer. We cannot create the illusion that a plan B can be activated. There are no governance systems capable of deciding when to launch these alternatives,” he said.

Carbon dioxide removal, on the other hand, should be studied extensively, according to Nobre. “This approach follows the logic of restoring atmospheric quality. The principle is to bring gas concentrations back to the state of equilibrium in which the planet remained for at least 1 or 2 million years.”

Even so, these climate engineering solutions should be treated with caution. “Nature is very complex and engineering solutions are not easy, especially on a global scale. I think it is worth studying the various carbon dioxide removal techniques and determining which of them have potential, but always remembering that these are slow processes that will take decades or centuries. Nothing removes the need to reduce emissions,” Nobre said.

>A benchmark in climate forecasting

Agência FAPESP – March 26, 2010

The Center for Weather Forecasting and Climate Studies (CPTEC) of the National Institute for Space Research (Inpe) is to join a select group of world centres for seasonal climate prediction.

The centre has been recommended by the World Meteorological Organization’s (WMO) Commission for Basic Systems as a Global Producing Center (GPC) of long-range forecasts.

Once the WMO confirms the recommendation, expected in June this year, CPTEC will receive a seal of quality for its seasonal climate forecasts. In the assessment of WMO experts, CPTEC meets several of the criteria, notably in its methodology, the dissemination of forecast products on the internet, and the existence of a fixed operational cycle of seasonal climate prediction.

In return, the centre will take part in the WMO’s international activities, contributing to its Long-Range Forecast Verification Centre. Since 2006 the WMO, through its Global Data-Processing and Forecasting Systems programme, has certified the quality of research and climate prediction centres that meet certain requirements, designating them GPCs for long-range forecasts.

With the recommendation, CPTEC will also join a group of world seasonal climate prediction centres that includes the National Centers for Environmental Prediction (United States), the European Centre for Medium-Range Weather Forecasts (European Union), the UK Met Office (United Kingdom), Météo-France, the Meteorological Service of Canada, Australia’s Bureau of Meteorology and the Japan Meteorological Agency, among others.

>The beginning of modern tornado forecasting

Fawbush and Miller

By Bill Murray
Whatever-weather.com, March 20, 2010

The Californian had little experience with forecasting Midwestern weather. He had been at Tinker Air Force Base in Oklahoma less than three weeks. When World War II broke out, he left his classes at Occidental College, where he was enrolled, and enlisted in the Army Air Corps. He ended up in the weather forecaster school in Grand Rapids, Michigan. The demand was so great that new forecasters were put to work after a hurried nine-month course in meteorology. His tours of duty were mainly in the South Pacific, forecasting weather for the forces that were battling the Axis. He was eventually promoted to the rank of Captain.

At the end of the war, he was assigned to Fort Benning in Georgia, where he honed his ability to map out the details of weather at different altitudes and to visualize them. This ability to understand a three-dimensional picture of the atmosphere is critical to weather forecasting. On the afternoon of March 20, 1948, he plotted the weather at the surface and aloft across the vast and flat terrain of the American Plains. There was nothing on the charts out of the ordinary. It looked like a dry and boring forecast with just some gusty winds through the evening hours.

He settled in to get acquainted with the backup forecaster, who was also from California. About 9 p.m., much to the forecasters’ surprise, they began to see surface reports of lightning from stations just to the southwest and west of Oklahoma City. Echoes appeared on their weather radar, less than twenty minutes away. The backup forecaster, a Staff Sergeant, sat down to type up a warning that thunderstorms were approaching. To their horror, at 9:52 p.m., a report streamed across the teletype from Will Rogers Airport, just seven miles to their southwest, that a tornado was on the ground, visible from the airport. Sure enough, illuminated by lightning, a huge funnel was visible almost immediately. It roared across the base, doing $10 million worth of damage and injuring several personnel.

There were recriminations immediately. A panel of investigators from Washington flew in the next morning. The nervous weather officer and his superior answered questions. The board of inquiry listened to the facts and quickly rendered a decision. The event was not forecastable given the state of the art in meteorology. But later that day, Captain Robert Miller and Major Ernest Fawbush were summoned to the Commanding General’s office. He directed Fawbush and Miller to investigate the possibility of forecasting tornadoes.

They reviewed the weather charts from the day before, as well as those from other tornadic events. They identified that tornadoes seemed to occur in warm, moist airmasses, with strong winds aloft. But the difficulty lay in delineating the areas which were most likely to experience the destructive storms. Issuing a tornado forecast for a single point seemed improbable.

But less than a week later, on the 25th, Fawbush and Miller looked at their weather charts and then at each other. The weather pattern looked nearly identical to that of the 20th. What were the odds that another tornado would strike the base? Infinitesimal. They went to the General and told him what they saw. The General ordered the base secured. They reconvened with the General in the early afternoon. It was then that they issued the first tornado forecast. Sure enough, a tornado moved across the well-prepared base that evening.

It was the beginning of modern tornado forecasting.

[This is one more historical example of the fact that unpredictability raises the specter of blame and accountability to forecasters. RT]

>The clouds of unknowing (The Economist)

The science of climate change

There are lots of uncertainties in climate science. But that does not mean it is fundamentally wrong

Mar 18th 2010 | From The Economist print edition

FOR anyone who thinks that climate science must be unimpeachable to be useful, the past few months have been a depressing time. A large stash of e-mails from and to investigators at the Climatic Research Unit of the University of East Anglia provided more than enough evidence for concern about the way some climate science is done. That the picture they painted, when seen in the round—or as much of the round as the incomplete selection available allows—was not as alarming as the most damning quotes taken out of context is little comfort. They offered plenty of grounds for both shame and blame.

At about the same time, glaciologists pointed out that a statement concerning Himalayan glaciers in the most recent report of the Intergovernmental Panel on Climate Change (IPCC) was wrong. This led to the discovery of other poorly worded or poorly sourced claims made by the IPCC, which seeks to create a scientific consensus for the world’s politicians, and to more general worries about the panel’s partiality, transparency and leadership. Taken together, and buttressed by previous criticisms, these two revelations have raised levels of scepticism about the consensus on climate change to new heights.

Increased antsiness about action on climate change can also be traced to the recession, the unedifying spectacle of last December’s climate-change summit in Copenhagen, the political realities of the American Senate and an abnormally cold winter in much of the northern hemisphere. The new doubts about the science, though, are clearly also a part of that story. Should they be?

In any complex scientific picture of the world there will be gaps, misperceptions and mistakes. Whether your impression is dominated by the whole or the holes will depend on your attitude to the project at hand. You might say that some see a jigsaw where others see a house of cards. Jigsaw types have in mind an overall picture and are open to bits being taken out, moved around or abandoned should they not fit. Those who see houses of cards think that if any piece is removed, the whole lot falls down. When it comes to climate, academic scientists are jigsaw types, dissenters from their view house-of-cards-ists.

The defenders of the consensus tend to stress the general consilience of their efforts—the way that data, theory and modelling back each other up. Doubters see this as a thoroughgoing version of “confirmation bias”, the tendency people have to select the evidence that agrees with their original outlook. But although there is undoubtedly some degree of that (the errors in the IPCC, such as they are, all make the problem look worse, not better) there is still genuine power to the way different arguments and datasets in climate science tend to reinforce each other.

The doubters tend to focus on specific bits of empirical evidence, not on the whole picture. This is worthwhile—facts do need to be well grounded—but it can make the doubts seem more fundamental than they are. People often assume that data are simple, graspable and trustworthy, whereas theory is complex, recondite and slippery, and so give the former priority. In the case of climate change, as in much of science, the reverse is at least as fair a picture. Data are vexatious; theory is quite straightforward. Constructing a set of data that tells you about the temperature of the Earth over time is much harder than putting together the basic theoretical story of how the temperature should be changing, given what else is known about the universe in general.

Absorb and reflect

The most relevant part of that universal what-else is the requirement laid down by thermodynamics that, for a planet at a constant temperature, the amount of energy absorbed as sunlight and the amount emitted back to space in the longer wavelengths of the infra-red must be the same. In the case of the Earth, the amount of sunlight absorbed is 239 watts per square metre. According to the laws of thermodynamics, a simple body emitting energy at that rate should have a temperature of about –18ºC. You do not need a comprehensive set of surface-temperature data to notice that this is not the average temperature at which humanity goes about its business. The discrepancy is due to greenhouse gases in the atmosphere, which absorb and re-emit infra-red radiation, and thus keep the lower atmosphere, and the surface, warm (see the diagram below). The radiation that gets out to the cosmos comes mostly from above the bulk of the greenhouse gases, where the air temperature is indeed around –18ºC.
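The arithmetic behind that –18ºC figure is worth making explicit. Below is a minimal sketch in Python, assuming only the standard Stefan-Boltzmann law and the 239 watts per square metre quoted above; nothing else in it comes from the article.

```python
# Effective radiating temperature of a body from the Stefan-Boltzmann law:
# emitted flux = sigma * T^4, so T = (flux / sigma)^(1/4).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temperature_k(absorbed_flux: float) -> float:
    """Temperature (K) at which a blackbody emits the given flux (W/m^2)."""
    return (absorbed_flux / SIGMA) ** 0.25

t = effective_temperature_k(239.0)  # the article's figure for Earth
print(f"{t:.1f} K, i.e. {t - 273.15:.1f} degrees C")  # ~254.8 K, about -18 C
```

The gap of roughly 33 degrees between that figure and the observed surface average is the greenhouse effect the article goes on to describe.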

Adding to those greenhouse gases in the atmosphere makes it harder still for the energy to get out. As a result, the surface and the lower atmosphere warm up. This changes the average temperature, the way energy moves from the planet’s surface to the atmosphere above it and the way that energy flows from equator to poles, thus changing the patterns of the weather.

No one doubts that carbon dioxide is a greenhouse gas, good at absorbing infra-red radiation. It is also well established that human activity is putting more of it into the atmosphere than natural processes can currently remove. Measurements made since the 1950s show the level of carbon dioxide rising year on year, from 316 parts per million (ppm) in 1959 to 387ppm in 2009. Less direct records show that the rise began about 1750, and that the level was stable at around 280ppm for about 10,000 years before that. This fits with human history: in the middle of the 18th century people started to burn fossil fuels in order to power industrial machinery. Analysis of carbon isotopes, among other things, shows that the carbon dioxide from industry accounts for most of the build-up in the atmosphere.

The serious disagreements start when discussion turns to the level of warming associated with that rise in carbon dioxide. For various reasons, scientists would not expect temperatures simply to rise in step with the carbon dioxide (and other greenhouse gases). The climate is a noisy thing, with ups and downs of its own that can make trends hard to detect. What’s more, the oceans can absorb a great deal of heat—and there is evidence that they have done so—and in storing heat away, they add inertia to the system. This means that the atmosphere will warm more slowly than a given level of greenhouse gas would lead you to expect.

There are three records of land-surface temperature put together from thermometer readings in common use by climatologists, one of which is compiled at the Climatic Research Unit of e-mail infamy. They all show warming, and, within academia, their reliability is widely accepted. Various industrious bloggers are not so convinced. They think that adjustments made to the raw data introduce a warming bias. They also think the effects of urbanisation have confused the data because towns, which are sources of heat, have grown up near weather stations. Anthony Watts, a retired weather forecaster who blogs on climate, has set up a site, surfacestations.org, where volunteers can help record the actual sites of weather instruments used to provide climate data, showing whether they are situated close to asphalt or affected by sources of bias.

Those who compile the data are aware of this urban heat-island effect, and try in various ways to compensate for it. Their efforts may be insufficient, but various lines of evidence suggest that any errors it is inserting are not too bad. The heat-island effect is likely to be strongest on still nights, for example, yet trends from data recorded on still nights are not that different from those from windy ones. And the temperature of waters at the surface of the seas shows similar trends to that on land over the past century, as does the record of air temperature over the oceans as measured at night (see chart 1).

A recent analysis by Matthew Menne and his colleagues at America’s National Oceanic and Atmospheric Administration, published in the Journal of Geophysical Research, argued that trends calculated from climate stations that surfacestations.org found to be poorly sited and from those it found well sited were more or less indistinguishable. Mr Watts has problems with that analysis, and promises a thorough study of the project’s findings later.

There is undoubtedly room for improvement in the surface-temperature record—not least because, at the moment, it provides only monthly mean temperatures, and there are other things people would like to know about. (When worrying about future heatwaves, for example, hot days and nights, not hot months, are the figures of most interest.) In February Britain’s Met (ie, meteorological) Office called for the creation of a new set of temperature databases compiled in rigorously transparent ways and open to analysis and interpretation by all and sundry. Such an initiative would serve science well, help restore the credibility of land-surface records, and demonstrate an openness on the part of climate science which has not always been evident in the past.

Simplify and amplify

For many, the facts that an increase in carbon dioxide should produce warming, and that warming is observed in a number of different indicators and measurements, add up to a prima facie case for accepting that greenhouse gases are warming the Earth and that the higher levels of greenhouse gases that business as usual would bring over the course of this century would warm it a lot further.

The warming caused by a given increase in carbon dioxide can be calculated on the basis of laboratory measurements which show how much infra-red radiation at which specific wavelengths carbon dioxide molecules absorb. This sort of work shows that if you double the carbon dioxide level you get about 1ºC of warming. So the shift from the pre-industrial 280ppm to 560ppm, a level which on current trends might be reached around 2070, makes the world a degree warmer. If the level were to double again, to 1,100ppm, which seems unlikely, you would get another degree.
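The quoted numbers imply a logarithmic response: each doubling of concentration adds the same increment of warming. Here is a small sketch under that assumption; the 280ppm baseline and the one-degree-per-doubling figure are the article’s, while the log form is the standard textbook simplification rather than something the article spells out.

```python
import math

def co2_only_warming(c_ppm: float, c0_ppm: float = 280.0,
                     per_doubling: float = 1.0) -> float:
    """Direct (no-feedback) warming in degrees C for a rise from c0 to c."""
    return per_doubling * math.log2(c_ppm / c0_ppm)

print(co2_only_warming(560.0))   # one doubling: 1.0 degree
print(co2_only_warming(1100.0))  # just under two doublings: ~1.97 degrees
```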

The amount of warming expected for a doubling of carbon dioxide has become known as the “climate sensitivity”—and a climate sensitivity of one degree would be small enough to end most climate-related worries. But carbon dioxide’s direct effect is not the only thing to worry about. Several types of feedback can amplify its effect. The most important involve water vapour, which is now quite well understood, and clouds, which are not. It is on these areas that academic doubters tend to focus.

As carbon dioxide warms the air it also moistens it, and because water vapour is a powerful greenhouse gas, that will provide further warming. Other things people do—such as clearing land for farms, and irrigating them—also change water vapour levels, and these can be significant on a regional level. But the effects are not as large.

Climate doubters raise various questions about water vapour, some trivial, some serious. A trivial one is to argue that because water vapour is such a powerful greenhouse gas, carbon dioxide is unimportant. But this ignores the fact that the level of water vapour depends on temperature. A higher level of carbon dioxide, by contrast, governs temperature, and can endure for centuries.

A more serious doubting point has to do with the manner of the moistening. In the 1990s Richard Lindzen, a professor of meteorology at the Massachusetts Institute of Technology, pointed out that there were ways in which moistening might not greatly enhance warming. The subsequent two decades have seen much observational and theoretical work aimed at this problem. New satellites can now track water vapour in the atmosphere far better than before (see chart 2). As a result preliminary estimates based on simplifications have been shown to be reasonably robust, with water-vapour feedbacks increasing the warming to be expected from a doubling of carbon dioxide from 1ºC without water vapour to about 1.7ºC. Dr Lindzen agrees that for parts of the atmosphere without clouds this is probably about right.
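The step from 1ºC to about 1.7ºC follows the usual feedback algebra: if a fraction f of any warming comes back as further warming, the no-feedback response is amplified by 1/(1 - f). A brief sketch; the feedback fraction below is inferred from the article’s two figures, not taken from any study.

```python
def with_feedback(dt_no_feedback: float, f: float) -> float:
    """Equilibrium warming once a net feedback fraction f (f < 1) acts."""
    return dt_no_feedback / (1.0 - f)

# Fraction implied by the article's figures (1.0 C -> ~1.7 C):
f_water_vapour = 1.0 - 1.0 / 1.7           # ~0.41
print(with_feedback(1.0, f_water_vapour))  # recovers ~1.7 degrees
```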

This moistening offers a helpful way to see what sort of climate change is going on. When water vapour condenses into cloud droplets it gives up energy and warms the surrounding air. This means that in a world where greenhouse warming is wetting the atmosphere, the lower parts of the atmosphere should warm at a greater rate than the surface, most notably in the tropics. At the same time, in an effect that does not depend on water vapour, an increase in carbon dioxide will cause the upper stratosphere to cool. This pattern of warming down below and cooling up on top is expected from greenhouse warming, but would not be expected if something other than the greenhouse effect was warming the world: a hotter sun would heat the stratosphere more, not less.

During the 1990s this was a point on which doubters laid considerable weight, because satellite measurements did not show the warming in the lower atmosphere that theory would predict. Over the past ten years, though, this picture has changed. To begin with, only one team was turning data from the relevant instruments that have flown on weather satellites since the 1970s into a temperature record resolved by altitude. Now others have joined them, and identified errors in the way that the calculations (which are complex and depend on a number of finicky details) were carried out. Though different teams still get different amounts and rates of warming in the lower atmosphere, there is no longer any denying that warming is seen. Stratospheric cooling is complicated by the effects of ozone depletion, but those do not seem large enough to account for the degree of cooling that has been seen there, further strengthening the case for warming by the greenhouse effect and not some other form of climate perturbation.

On top of the effect of water vapour, though, the clouds that form from it provide a further and greater source of uncertainty. On the one hand, the droplets of water of which these are made also have a strong greenhouse effect. On the other, water vapour is transparent, whereas clouds reflect light. In particular, they reflect sunlight back into space, stopping it from being absorbed by the Earth. Clouds can thus have a marked cooling effect and also a marked warming effect. Which will grow more in a greenhouse world?

Model maze

It is at this point that detailed computer models of the climate need to be called into play. These models slice the atmosphere and oceans into stacks of three-dimensional cells. The state of the air (temperature, pressure, etc) within each cell is continuously updated on the basis of what its state used to be, what is going on in adjacent cells and the greenhousing and other properties of its contents.
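As a toy illustration of that update scheme (emphatically a sketch, not any real model’s numerics), one can march a single temperature field forward on a coarse grid, each cell relaxing toward the mean of its neighbours. The grid size, mixing rate and periodic wrap-around below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
temp = rng.uniform(250.0, 300.0, size=(8, 8, 4))  # (lat, lon, level) cells, K
MIX = 0.1  # fraction of each cell's state exchanged with neighbours per step

def step(t: np.ndarray) -> np.ndarray:
    # Mean of the six face neighbours (wrapping around the grid edges).
    neighbour_mean = sum(np.roll(t, s, axis=a)
                         for a in range(t.ndim)
                         for s in (-1, 1)) / (2 * t.ndim)
    return (1.0 - MIX) * t + MIX * neighbour_mean

for _ in range(100):  # real models also step radiation, winds, moisture...
    temp = step(temp)
print(temp.mean(), temp.std())  # the field smooths out; the mean is conserved
```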

These models are phenomenally complex. They are also gross oversimplifications. The size of the cells stops them from explicitly capturing processes that take place at scales smaller than a hundred kilometres or so, which includes the processes that create clouds.

Despite their limitations, climate models do capture various aspects of the real world’s climate: seasons, trade winds, monsoons and the like. They also put clouds in the places where they are seen. When used to explore the effect of an increase in atmospheric greenhouse gases on the climate these models, which have been developed by different teams, all predict more warming than greenhouse gases and water-vapour feedback can supply unaided. The models assessed for the IPCC’s fourth report had sensitivities ranging from 2.1ºC to 4.4ºC. The IPCC estimated that if clouds were not included, the range would be more like 1.7ºC to 2.1ºC. So in all the models clouds amplify warming, and in some the amplification is large.

However, there are so far no compelling data on how clouds are affecting warming in fact, as opposed to in models. Ray Pierrehumbert, a climate scientist at the University of Chicago who generally has a strong way with sceptics, is happy to agree that there might be processes by which clouds rein in, rather than exaggerate, greenhouse-warming effects, but adds that, so far, few have been suggested in any way that makes sense.

Dr Lindzen and a colleague suggested a plausible mechanism in 2001. They proposed that tropical clouds in an atmosphere with more greenhouse gas might dry out neighbouring parts of the sky, making them more transparent to outgoing infra-red. The evidence Dr Lindzen brought to bear in support of this was criticised in ways convincing enough to discourage other scientists from taking the idea further. A subsequent paper by Dr Lindzen on observations that would be compatible with his ideas about low sensitivity has also suffered significant criticisms, and he accepts many of them. But having taken them on board has not, he thinks, invalidated his line of research.

Arguments based on past climates also suggest that sensitivity is unlikely to be low. Much of the cooling during the ice ages was maintained by the presence of a large northern hemisphere ice cap reflecting away a lot of sunlight, but carbon dioxide levels were lower, too. To account for all of the cooling, especially in the southern hemisphere, is most easily done with a sensitivity of temperature to carbon dioxide higher than Dr Lindzen would have it.

Before the ice age, the Earth had a little more carbon dioxide and was a good bit warmer than today—which suggests a fairly high sensitivity. More recently, the dip in global temperatures after the eruption of Mt Pinatubo in the Philippines in 1991, which inserted a layer of sunlight-diffusing sulphur particles into the stratosphere, also bolsters the case for a sensitivity near the centre of the model range—although sensitivity to a transient event and the warming that follows a slow doubling of carbon dioxide are not exactly the same sort of thing.

Logs and blogs

Moving into data from the past, though, brings the argument to one of the areas that blog-based doubters have chosen as a preferred battleground: the temperature record of the past millennium, as construed from natural records that are both sensitive to temperature and capable of precise dating. Tree rings are the obvious, and most controversial, example. Their best known use has been in a reconstruction of temperatures over the past millennium published in Nature in 1998 and widely known as the hockey stick, because it was mostly flat but had a blade sticking up at the 20th-century end. Stephen McIntyre, a retired Canadian mining consultant, was struck by the very clear message of this graph and delved into the science behind it, a process that left him and followers of his blog, Climate Audit, intensely sceptical about its value.

In 2006 a review by America’s National Research Council endorsed points Mr McIntyre and his colleagues made on some methods used to make the hockey stick, and on doubts over a specific set of tree rings. Despite this it sided with the hockey stick’s overall conclusion, which did little to stem the criticism. The fact that tree-ring records do not capture recent warming adds to the scepticism about the value of such records.

For many of Mr McIntyre’s fans (though it is not, he says, his central concern) the important thing about this work is that the hockey stick seemed to abolish the “medieval warm period”. This is a time when temperatures are held to have been as high as or higher than today’s—a warmth associated with the Norse settlement of Greenland and vineyards in England. Many climate scientists suspect this phenomenon was given undue prominence by climatologists of earlier generations with an unduly Eurocentric view of the world. There is evidence for cooling at the time in parts of the Pacific.

Doubters for the most part are big fans of the medieval warm period, and see in the climate scientists’ arguments an attempt to rewrite history so as to maximise the drama of today’s warming and minimise the possibility that natural variation might explain the 20th-century record. The possibility of more climatic variability, though, does not, in itself, mean that greenhouse warming is not happening too. And if the medieval warmth were due to some external factor, such as a slightly brighter sun, that would suggest that the climate was indeed quite sensitive.

Looking at the more recent record, logged as it has been by thermometers, you might hope it could shed light on which of the climate models is closest to being right, and thus what the sensitivity actually is. Unfortunately, other confounding factors make this difficult. Greenhouse gases are not the only climatically active ingredients that industry, farming and land clearance add to the atmosphere. There are also aerosols—particles of pollution floating in the wind. Some aerosols cool the atmosphere. Other, sootier, ones warm it. The aggregate effect, globally, is thought to be a cooling, possibly a quite strong one. But the overall history of aerosols, which are mostly short-lived, is nothing like as well known as that of greenhouse gases, and it is unlikely that any of the models are properly capturing their chemistry or their effects on clouds.

Taking aerosols into account, climate models do a pretty good job of emulating the climate trends of the 20th century. This seems odd, since the models have different sensitivities. In practice, it appears that the way the aerosols are dealt with in the models and the sensitivity of those models tend to go hand in hand; sensitive models also have strong cooling aerosol effects.

Reto Knutti of ETH Zurich, an expert on climate sensitivity, sees this as evidence that, consciously or unconsciously, aerosols are used as counterweights to sensitivity to ensure that the trends look right. This is not evidence of dishonesty, and it is not necessarily a bad thing. Since the models need to be able to capture the 20th century, putting them together in such a way that they end up doing so makes sense. But it does mean that looking at how well various models match the 20th century does not give a good indication of the climate’s actual sensitivity to greenhouse gas.

Adding the uncertainties about sensitivity to uncertainties about how much greenhouse gas will be emitted, the IPCC expects the temperature to have increased by 1.1ºC to 6.4ºC over the course of the 21st century. That low figure would sit fairly well with the sort of picture that doubters think science is ignoring or covering up. In this account, the climate has natural fluctuations larger in scale and longer in duration (such as that of the medieval warm period) than climate science normally allows, and the Earth’s recent warming is caused mostly by such a fluctuation, the effects of which have been exaggerated by a contaminated surface-temperature record. Greenhouse warming has been comparatively minor, this argument would continue, because the Earth’s sensitivity to increased levels of carbon dioxide is lower than that seen in models, which have an inbuilt bias towards high sensitivities. As a result subsequent warming, even if emissions continue full bore, will be muted too.

It seems unlikely that the errors, misprisions and sloppiness in a number of different types of climate science might all favour such a minimised effect. That said, the doubters tend to assume that climate scientists are not acting in good faith, and so are happy to believe exactly that. Climategate and the IPCC’s problems have reinforced this position.

Using the IPCC’s assessment of probabilities, the sensitivity to a doubling of carbon dioxide of less than 1.5ºC in such a scenario has perhaps one chance in ten of being correct. But if the IPCC were underestimating things by a factor of five or so, that would still leave only a 50:50 chance of such a desirable outcome. The fact that the uncertainties allow you to construct a relatively benign future does not allow you to ignore futures in which climate change is large, and in some of which it is very dangerous indeed. The doubters are right that uncertainties are rife in climate science. They are wrong when they present that as a reason for inaction.


>U.S. Scientists Urge Action on Climate Change

On March 11, 2010, U.S. scientists and economists signed on to a statement imploring the Senate to move swiftly and comprehensively on the issue of climate change. The signatories are all experts in fields of study relevant to climate change. The statement marks the first time leading U.S. scientists and economists have come together to issue a joint message of concern on climate change. The list of signatories included eight Nobel laureates, 32 members of the National Academy of Sciences, 10 members of the National Academy of Engineering, and more than 100 members of the Intergovernmental Panel on Climate Change, who shared a 2007 Nobel Peace Prize.

“If anything, the climate problem is actually worse than reported earlier,” wrote Leon Lederman, Director Emeritus of the Fermi National Accelerator Laboratory in Batavia, Illinois, and a Nobel Prize-winning physicist, in an individual statement in the letter to the Senate. “Physicists tend to be super critical of strong conclusions, but the data on global warming now indicate the conclusions are not nearly strong enough.”


>Marcelo Gleiser: Imperfect creation (Folha Mais!)

The notion that nature can be deciphered through reductionism needs to be abandoned.

March 14, 2010

Since time immemorial, when faced with the immense complexity of nature, man has looked in it for repeating patterns, some kind of order. This makes a great deal of sense. After all, when we look at the sky we see organised patterns, periodic motions that repeat themselves, defining natural cycles to which we are deeply bound: sunrise and sunset, the phases of the Moon, the seasons of the year, the planetary orbits.

With Pythagoras, 2,500 years ago, the search for a natural order of things was transformed into a search for a mathematical order: the patterns we see in nature reflect the mathematics of creation. It falls to the philosopher to unveil these patterns and thereby reveal the secrets of the world.

Moreover, since the world is the work of a universal architect (not exactly the Judeo-Christian God, but a creator deity nonetheless), unveiling the secrets of the world amounts to unveiling the “mind of God”. I recently wrote about how this metaphor remains alive today and is used by physicists such as Stephen Hawking and many others.

This search for a mathematical order in nature has borne, and continues to bear, much fruit. Nothing could be more natural than looking for a hidden order that explains the complexity of the world. This approach is the core of reductionism, a method of study based on the idea that an understanding of the whole can be reached through the study of its various parts.

The results of this order are expressed as laws, which we call the laws of nature. Laws are the highest expression of natural order. In reality, things are not so simple. Despite its obvious usefulness, reductionism has its limitations. There are certain questions, or rather certain systems, that cannot be understood from their parts. The climate is one of them; the workings of the human mind are another.

The biochemical processes that define living beings cannot be understood from simple laws, or from the fact that molecules are made of atoms. In essence, in complex systems the whole cannot be reduced to its parts.

Unpredictable behaviours emerge from the countless interactions among the elements of a system. For example, the function of molecules with many atoms, such as proteins, depends on how they “fold”, that is, on their spatial configuration. The workings of the brain cannot be deduced from the workings of its 100 billion neurons.

Complex systems require different laws, laws that describe behaviours arising from the cooperation of many parts. The notion that nature is perfect and can be deciphered by the systematic application of the reductionist method needs to be abandoned. Far more in line with the discoveries of modern science is to adopt a plural approach, using other methods alongside reductionism to deal with more complex systems. All of this, of course, still within the parameters of the natural sciences, but accepting that nature is imperfect and that the order we so eagerly pursue is, in truth, an expression of the order we seek within ourselves.

It is worth remembering that science creates models that describe reality; these models are not reality, only our representations of it. The “truths” we so admire are approximations of what actually happens.

Symmetries are never exact. What is surprising about nature is not its perfection, but the fact that matter, after billions of years, has evolved to the point of creating entities capable of questioning their own existence.

MARCELO GLEISER is a professor of theoretical physics at Dartmouth College, in Hanover (USA), and the author of the book “A Criação Imperfeita”

>Climate scepticism ‘on the rise’, BBC poll shows

The number of British people who are sceptical about climate change is rising, a poll for BBC News suggests.

BBC News, Sunday, 7 February 2010

The Populus poll of 1,001 adults found 25% did not think global warming was happening, an increase of 10 percentage points since a similar poll was conducted in November.

The percentage of respondents who said climate change was a reality had fallen from 83% in November to 75% this month.

And only 26% of those asked believed climate change was happening and “now established as largely man-made”.

The findings are based on interviews carried out on 3-4 February.

In November 2009, a similar poll by Populus – commissioned by the Times newspaper – showed that 41% agreed that climate change was happening and it was largely the result of human activities.

“It is very unusual indeed to see such a dramatic shift in opinion in such a short period,” Populus managing director Michael Simmonds told BBC News.

“The British public are sceptical about man’s contribution to climate change – and becoming more so,” he added.

“More people are now doubters than firm believers.”

The Department for Environment, Food and Rural Affairs’ (Defra) chief scientific adviser, Professor Bob Watson, called the findings “very disappointing”.

“The fact that there has been a very significant drop in the number of people that believe that we humans are changing the Earth’s climate is serious,” he told BBC News.

“Action is urgently needed,” Professor Watson warned.

“We need the public to understand that climate change is serious so they will change their habits and help us move towards a low carbon economy.”

‘Exaggerated risks’

Of the 75% of respondents who agreed that climate change was happening, one-in-three people felt that the potential consequences of living in a warming world had been exaggerated, up from one-in-five people in November.

The number of people who felt the risks of climate change had been understated dropped from 38% in November to 25% in the latest poll.

In the period between the two polls, a series of high-profile climate-related stories appeared, some of which made grim reading for climate scientists and policymakers.

In November, the contents of emails stolen from a leading climate science unit led to accusations that a number of researchers had manipulated data.

And in January, the Intergovernmental Panel on Climate Change (IPCC) admitted that it had made a mistake in asserting that Himalayan glaciers could disappear by 2035.

All of this happened against the backdrop of many parts of the northern hemisphere being gripped by a prolonged period of sub-zero temperatures.

However, 73% of the people who said that they were aware of the “science flaws” stories stated that the media coverage had not changed their views about the risks of climate change.

“People tend to make judgements over time based on a whole range of different sources,” Mr Simmonds explained.

He added that it was very unusual for single events to have a dramatic impact on public opinion.

“Normally, people make their minds up over a longer period and are influenced by all the voices they hear, what they read and what people they know are talking about.”

>Climate change scientists losing ‘PR war’ (BBC)

>
A Nobel peace prize-winning Welsh physicist says climate change scientists are losing “a PR war” against sceptics with vested interests.

BBC News, Thursday, 11 February 2010

Sir John Houghton said there were millions of internet references to a comment he never made which appears to show him “hyping up” global warming.

A poll for BBC News suggests the number of British people who are sceptical about climate change is rising.

Sir John believes recent news stories may have contributed to scepticism.

He told BBC Wales’ Dragon’s Eye programme: “If you Google my name on the web and look for a quote, the quote you will find is one that goes like this.

“It says ‘unless we announce disasters, no-one will listen’.

“I have never said that. The origin of the quote according to some of the people who write about it… [they] say it comes from the first edition of my global warming book, published in 1994.

“It does not appear in that book in any shape or form.”

Sir John, who co-chaired the UN Intergovernmental Panel on Climate Change’s (IPCC) scientific assessment group for 14 years, received the Nobel peace prize in 2007 as part of an IPCC delegation.

He said most scientists were not very good at public relations and just wanted to get on with their work.

Asked if he believed climate change scientists were now in a “PR war” with sceptics, he said: “We are in a way and we’re losing that war because we’re not good at PR.

“Your average scientist is not a good PR person because he wants to get on with his science.

Stolen e-mails

“So we need to look, I suppose, for some good PR people to help us get our messages across in an honest and open and sensible way, without causing the sort of furore, the sort of polarisation that has occurred because of the people who are trying to deny it, and trying to deny it so vehemently that the media is taking so much notice of them.”

The number of British people who are sceptical about climate change is rising, according to a new poll.

The Populus poll of 1,001 adults found 25% did not think global warming was happening, a rise of 10 percentage points since a similar poll in November.

Stolen e-mails from the University of East Anglia led to accusations, since denied, that climate change data was being manipulated.

Last month, the IPCC had to admit it had been mistaken in claiming Himalayan glaciers could disappear by 2035.

Sir John said some reporting of these stories had given mistakes undue significance and deliberately misrepresented other information.

‘Vested interests’

He believes some sceptics are influenced by concerns other than scientific truth, comparing them to now discredited lobbyists who argued smoking did not cause cancer.

He said: “A lot of it comes from the United States, from vested interests, coal and oil interests in the United States which are very strong and which employ thousands of lobbyists in Washington to try and influence members of Congress that climate change is not happening.

“So it’s a major problem in the United States and it does spill over to this country too.”

>Signs of Damage to Public Trust in Climate Findings (N. Y. Times/Dot Earth blog)

>
By ANDREW C. REVKIN
February 5, 2010, 4:27 pm

CBS News has run a report summarizing fallout from the illegal distribution of climate scientists’ email messages and files and problems with the 2007 report from the Intergovernmental Panel on Climate Change. The conclusion is that missteps and mistakes are creating broader credibility problems for climate science.

Senator James M. Inhofe was quick to add the report to the YouTube channel of the minority on the Environment and Public Works committee.

Ralph J. Cicerone, the president of the National Academy of Sciences, has an editorial in this week’s edition of the journal Science (subscription only) noting the same issue. Over all, he wrote, “My reading of the vast scientific literature on climate change is that our understanding is undiminished by this incident; but it has raised concern about the standards of science and has damaged public trust in what scientists do.”

Dr. Cicerone, an atmospheric scientist, added that polls and input he has received from various sources indicate that “public opinion has moved toward the view that scientists often try to suppress alternative hypotheses and ideas and that scientists will withhold data and try to manipulate some aspects of peer review to prevent dissent. This view reflects the fragile nature of trust between science and society, demonstrating that the perceived misbehavior of even a few scientists can diminish the credibility of science as a whole.” (A BBC report on its latest survey on climate views supports Dr. Cicerone’s impression.)

What should scientists do? Dr. Cicerone acknowledged both the importance of improving transparency and the challenges in doing so:

“It is essential that the scientific community work urgently to make standards for analyzing, reporting, providing access to, and stewardship of research data operational, while also establishing when requests for data amount to harassment or are otherwise unreasonable. A major challenge is that acceptable and optimal standards will vary among scientific disciplines because of proprietary, privacy, national security and cost limitations. Failure to make research data and related information accessible not only impedes science, it also breeds conflicts.”

As recently as last week, senior members of the intergovernmental climate panel had told me that some colleagues did not see the need for changes in practices and were convinced that the recent flareup over errors in the 2007 report was a fleeting inconvenience. I wonder if they still feel that way.

UPDATE: Here’s some additional reading on the I.P.C.C.’s travails and possible next steps for the climate panel:

IPCC Flooded by Criticism, by Quirin Schiermeier in Nature News.

Anatomy of I.P.C.C.’s Mistake on Himalayan Glaciers and Year 2035, by Bidisha Banerjee and George Collins in the Yale Forum on Climate Change and the Media.

* * *

After Emergence of Climate Files, an Uncertain Forecast

By ANDREW C. REVKIN
December 1, 2009, 10:56 am

Roger A. Pielke Jr. is a political scientist at the University of Colorado who has long focused on climate and disasters and the interface of climate science and policy. He has been among those seeking some clarity on temperature data compiled by the Climatic Research Unit of the University of East Anglia, which is now at the center of a storm over thousands of e-mail messages and documents either liberated or stolen from its servers (depending on who is describing the episode). [UPDATED 11:45 a.m. with a couple more useful voices “below the fold.”]

On Monday, I asked him, in essence, if the shape of the 20th-century temperature curve were to shift much as a result of some of the issues that have come up in the disclosed e-mail messages and files, would that erode confidence in the keystone climate question (the high confidence expressed by the Intergovernmental Panel on Climate Change in 2007 that most warming since 1950 is driven by human activities)?

This is Dr. Pielke’s answer. (I added boldface to the take-home points.):

Here is my take, in a logical ordering, from the perspective of an informed observer:

The circumstances:

1. There are many adjustments made to the raw data to account for biases and other factors.
2. Some part of the overall warming trend is as a result of these adjustments.
3. There are legitimately different ways to do the adjusting. Consider that in the e-mails, [Phil] Jones writes that he thinks [James] Hansen’s approach to urban effects is no good. There are also debates over how to handle ocean temperatures from buckets versus intake valves on ships and so on. And some of the procedures for adjusting are currently contested in the scientific literature.
4. Presumably once the data is readily available how these legitimate scientific choices are made about the adjusting would be open to scrutiny and debate.
5. People will then be much more able to cherry pick adjustment procedures to maximize or minimize the historical trends, but also to clearly see how others make decisions about adjustments.
6. Mostly this matters for pre-1979, as the R.S.S. and U.A.H. satellite records provide some degree of independent checking.

Now the implications:

A. If it turns out that the choices made by CRU, GISS, NOAA fall on the “maximize historical trends” end of the scale, that will not help their perceived credibility for obvious reasons. On the other hand, if their choices lead to the middle of the range or even low end, then this will enhance their credibility.
B. The surface temps matter because they are a key basis for estimates of climate sensitivity in the models used to make projections. So people will fight over small differences, even if everyone accepts a significant warming trend.
C. When there are legitimate debates over procedures in science (i.e., competing certainties from different scientists), then this will help the rest of us to understand that there are irreducible uncertainties across climate science.
D. In the end, I would hypothesize that the result of the freeing of data and code will necessarily lead to a more robust understanding of scientific uncertainties, which may have the perverse effect of making the future less clear, i.e., because it will result in larger error bars around observed temperature trends which will carry through into the projections.
E. This would have the greatest implications for those who have staked a position on knowing the climate future with certainty — so on both sides, those arguing doom and those arguing, “Don’t worry be happy.”

So, in the end, Dr. Pielke appears to say, closer scrutiny of the surface-temperature data could undermine definitive statements of all kinds — that human-driven warming is an unfolding catastrophe or something concocted. More uncertainty wouldn’t produce a climate comfort zone, given that poorly understood phenomena can sometimes cause big problems. But it would surely make humanity’s energy and climate choices that much tougher.
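To see in miniature why adjustment choices matter (Dr. Pielke's points 3 to 5), consider the following sketch. It is purely illustrative: the temperature series is synthetic and the 0.1-degree "station move" correction is invented, not any agency's actual homogenization procedure. Two defensible-sounding ways of adjusting the same raw series yield different linear trends.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "raw" annual temperature anomalies, 1900-2008 (invented data).
    years = np.arange(1900, 2009)
    raw = 0.006 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

    # Two hypothetical corrections for a fictional 1950 station move.
    # Scheme A lowers pre-1950 values (steepens the trend);
    # Scheme B raises them (flattens the trend).
    scheme_a = np.where(years < 1950, raw - 0.10, raw)
    scheme_b = np.where(years < 1950, raw + 0.10, raw)

    def trend_per_century(series):
        """Least-squares linear trend, in degrees per century."""
        return 100.0 * np.polyfit(years, series, 1)[0]

    for name, series in [("raw", raw), ("scheme A", scheme_a), ("scheme B", scheme_b)]:
        print(f"{name:9s}: {trend_per_century(series):+.2f} deg/century")

Both schemes are internally consistent; only open data and code would let outsiders see which choice was made and why, which is exactly the scrutiny Dr. Pielke anticipates.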

[UPDATE, 11:45 a.m.] Andrew Freedman at the Capital Weather Gang blog has interviewed Gerald North, the climate scientist who headed the National Academies panel that examined the tree-ring data and “hockey stick” graphs. Some excerpts:

On whether the emails and files undermine Dr. North’s confidence in human-driven climate change:

This hypothesis (Anthropogenic GW) fits in the climate science paradigm that 1) Data can be collected and assembled in ways that are sensible. 2) These data can be used to test and or recalibrate climate simulation models. 3) These same models can be used to predict future and past climates. It is understood that this is a complicated goal to reach with any precision. The models are not yet perfect, but there is no reason to think the approach is wrong.

On Stephen McIntyre of Climateaudit.org:

I do think he has had an overall positive effect. He has made us re-examine the basis for our assertions. In my opinion this sorts itself out in the due course of the scientific process, but perhaps he has made a community of science not used to scrutiny take a second look from time to time. But I am not sure he has ever uncovered anything that has turned out to be significant.

Also, please note below that Michael Schlesinger at the University of Illinois sent in a response to sharp criticisms of his Dot Earth contribution from Roger Pielke, Sr., at the University of Colorado, Boulder. (Apologies for Colorado State affiliation earlier; he’s moved.)

>Rain prophets predict a heavy rainy season

>
The three folk prophets with the best record in rain forecasting foretell a strong rainy season

Diário do Nordeste, Regional section – 08/01/2010

Quixadá. Folk scientists from all over the state gather in this Sertão Central municipality tomorrow to announce their prognoses for this year’s rainy season. The XIV Encontro dos Profetas da Chuva (14th Meeting of the Rain Prophets) takes place this Saturday at the old Clube dos Agrônomos, beside the Açude do Cedro reservoir. According to João Soares, president of the Instituto de Pesquisa de Violas e Poesia Cultural Popular do Sertão Central, at least 30 prophets and dozens of guests are expected. Yesterday, some of them shared their forecasts exclusively with Diário do Nordeste.

Holders of the best forecasting record of the past decade, Chico Leiteiro, Paulo Costa and Antônio Lima have practically the same prediction for this year’s rainy season: it should rain a great deal. In a survey carried out for this report, the trio’s overall hit rate for Ceará exceeds 80%. On the second Saturday of January 2009, Francisco Quintino dos Santos, known as Chico Leiteiro, predicted floods in the state before the audience, a forecast later confirmed in many municipalities of the interior.

Leiteiro watches the “loadings” of the weather, the plants and the bees to build his prognosis. He says that when the inchuís, the nests of paper wasps (also known as capuxus), are brand new at the start of the year, it is a sign of a good rainy season. Combining these observations with the movement of other insects, of the wind, the clouds and the stars, he draws his conclusions for the rainy period ahead. He began to familiarize himself with these signs at the age of 11 and has been sharpening his sensitivity to nature’s signals ever since.

Birdsong

With the termite as his main partner in predicting the coming rainy season, Antônio Tavares da Silva (the name Lima comes from his grandparents) also finds cause for cheer in the song of the sabiá, the cuã and the mãe-de-lua. For him, the trees also speak, their branches loaded with hope for the farmer. He states flatly: “If men are struck by floods it is because they do not respect nature. The birds know how to protect themselves; they were given wings to fly.”

Looking a little higher up is Paulo Costa, the only member of the group of folk prophets with an academic degree and a participant in the meetings since the first edition, in 1997. Combining the movement of the stars with numerology, and complementing his studies with a mystical ritual inherited from his grandfather that he calls the “Arca de Noé” (Noah’s Ark), he is second only to Chico Leiteiro in rainy-season diagnoses, with a hit rate just 3 percentage points lower.

A retiree and a folk poet for the love of verse, the prophet Erasmo Barreira takes the coming rains as certain, judging by the heavily laden juazeiro tree and by the house of the João-de-barro. Curiously, nature gave this bird the gift of building its oven-shaped nest with the door turned to the west when rain is on the way from the east. Last year he displayed a João-de-barro nest to the public; this time he intends to bring a piece of an anthill, further evidence of the downpours to come. He learned everything from his father, José Pergentino Barreira, who turns 102 today. In the early days the poet would bring his father’s forecasts to the Meeting, and even received the title of “Prophet by Proxy”. For him, the ceremony that gathers the folk sages is one of the most important moments he has ever experienced.

Special tributes

At the presentation ceremony of the “masters of the rain”, one of them, João Ferreira de Lima, will receive a special posthumous tribute. He died at the beginning of the year and would have turned 81 next April. In his practice, this folk prophet, known for his seriousness, relied chiefly on the “barra de Natal”, the look of the horizon at Christmas. João Ferreira was born in Choró, back when that municipality still belonged to Quixadá.

Tributes will also be paid to the president of the State Legislative Assembly, Domingos Filho; the regional director of Sebrae, Alci Porto; the State Secretary of Culture, Auto Filho; Quixadá’s Secretary of Economic Development, Nascimento Marques; the president of the Fundação Cultural Rachel de Queiroz, Sandra Venâncio; the radio host Jonas Sousa; and the singer Raimundo Fagner.

Fagner will not be able to attend the Meeting, as he is performing in Rio; the award will be handed to a representative. João Soares and Helder Cortez, creators and organizers of the event, explain: he has a singular identification with sertanejo simplicity and looks to nature for signs of hope.

MORE INFORMATION
XIV Encontro de Profetas da Chuva
Tomorrow, 9 a.m. – Clube do Agrônomo / Açude do Cedro / Quixadá. V Encanta Quixadá: today, 7 p.m., (88) 9635.0828

WHAT THEY THINK

Signs from nature guide prognoses in Ceará

We will certainly have a heavy rainy season again this year. I did not want to predict such a thing, but it is going to be somewhat dangerous. If the quarter of the full moon does not deceive me, that is exactly what will happen. Even the sea rises when it appears. Those scholars talk about this “Ninho” business (El Niño; “ninho” is Portuguese for nest), but by the look of it even the birds will have to find a safer place to stay, especially those that built their houses inside a riverbed. The rains should be heavy from the end of March onwards.
Chico Leiteiro
Milkman

Our people can expect a lot of rain this year. By my observations, the rainy season takes shape from March onwards, with heavy rains, and it tends to be even bigger than last year’s. The cycles complete themselves from February to March and from April to July, when downpours are expected. The numbers of the months prove it: the weekdays of February are the same as those of March, and those of April are the same as those of July. The rains should continue until October.
Paulo Costa
Dentist

The rainy season sets in firmly between the end of February and the beginning of March and holds steady until May. The downpours should be on the same scale as last year’s, but in isolated areas of the state’s interior. Besides the signs from the animals and the insects, the Aracati winds announce it. A lot of rain is on the way, with strong winds and plenty of thunder. An ant is no satellite, but it has antennae, and theirs are all astir, a sign that a lot of water is coming to the state.
Antônio Lima
Farmer

V ENCANTA QUIXADÁ

Viola singers open the meeting

A cultural evening to the sound of violas and improvised verse marks the opening, today, of the rain prophets’ meeting

Quixadá. The viola singers, too, pay tribute to the Northeastern tradition and to the observers of the weather. Six duos perform on the eve of the Meeting, at the cultural evening of the V Encanta Quixadá, tonight at the Centro Cultural Rachel de Queiroz. Geraldo Amâncio leads the repente duel. Besides the repentistas, the programme includes the play “Profetas da Chuva”, in which Chico Mariano and Paroara tell the audience about their experiences. The curious thing about the staging is the cast: the two Quixadá prophets are played by two actresses, Clara Colin and Paula Cavalcanti, who have been performing the piece in the South of the country since 2007. Simple and narrative, the prologue, interspersed with sertanejo folk songs, draws the audience into the protagonists’ peculiar language and their observations on the rainy season in the sertão. “There is plenty of talk while the sky gets ready to turn beautiful with rain.”

Both attractions, the Encontro dos Profetas da Chuva and the Encanta Quixadá, are open to the public, with free admission. Partnerships with Sebrae, Banco do Brasil and Caixa Econômica, plus more than a dozen sponsors, guarantee free access and the staging of the two events.

When the shop clerk João Soares de Freitas and the agronomist Hélder dos Santos Cortez decided to bring together a group of observers of nature, to seek in it the signs of a likely rainy season in the interior of Ceará, they had no idea how important a scientific and cultural role they would play from 1997 onwards. In the small auditorium of Quixadá’s Câmara de Dirigentes Lojistas (CDL), they gathered Antão Mendes, Antônio Lima, Antônio Alexandre dos Santos, Antônio Anastácio da Silva (Paroara), Expedito Epifânio da Silva, Francisco Mariano, Joaquim Ferreira Santiago (known as Joaquim Muqueca), Paulo Costa, Raimundo Mota Silva and Ribamar Lima.

João Soares remembers the first Meeting. The rainy season that year was considered good. According to the records kept by him and his fellow founder, Hélder Cortez, then regional manager of Cagece, the rain prophets’ prognoses diverged.

Five of them expected a weak rainy season; another five rated it as fair. Only the observations of Antônio Alexandre dos Santos matched the outcome. He was short and to the point: a good rainy season. This year will be no different: whoever reads nature best scores the most hits.

ORIGIN

1997 was the year the Encontro de Profetas das Chuvas was first held, an event conceived by the shop clerk João Soares de Freitas and the agronomist Hélder dos Santos Cortez

ALEX PIMENTEL
Contributor

>From folklore to environmental citizenship

>
Essay

From folklore to citizenship

At a time of heated debate over the effects of global warming, researcher Renzo Taddei argues that the populations traditionally victimized by droughts have a decisive contribution to make to the consolidation of what he calls environmental citizenship

Renzo Taddei
special to O POVO
Fortaleza – 24 Oct 2009 – 12:48

The idea that variations in the climate will decisively affect people’s futures is a novelty that the inhabitants of many of the planet’s urban centres, Brazil’s included, now have to get used to. For the population of the semi-arid Northeast there is nothing new in it; it has always been that way. In that sense, at least, one might say the world’s urban population is going through a certain “Ceará-ization”.

Jokes aside, the fact is that there are lessons to be learned from societies that have long lived with climate threats. If the predictions about the expansion of the Brazilian semi-arid region and the desertification of parts of the cerrado as a result of global warming are confirmed, Ceará and its neighbouring states will certainly come to export knowledge about living with semi-arid conditions.

In Ceará, the efforts to live with the semi-arid climate have advanced on two fronts. One, older, rests on the knowledge of popular culture, which very early in the region’s history, even before the arrival of the Europeans, was already trying to decipher the riddles posed by the climate, a fundamental factor for survival. The other, more recent, is the use of science and technology to create the material conditions for reducing the population’s vulnerability to drought.

In the first group we find countless traditional methods of weather and climate prediction, part of the common, shared knowledge of the sertão’s population, even if made most visible in the figures of the rain prophets. Beyond predicting the rains, these popular strategies also combine specific ways of organizing local production, such as the choice of seeds and the raising of animals resistant to scarce rainfall. And all of this is framed within a religious frame of reference in which all things, the climate included, belong to a cosmic order ruled by the Creator, from whom the sertanejos ask protection and shelter.

On the second front, a prominent role belongs to the engineers sent to the state by Pedro II, back in the imperial period, which began a long process of institutionalizing the efforts to combat the effects of the droughts. The most decisive moment of this process was, certainly, the creation of what would later come to be known by the acronym Dnocs.

Dnocs is now a century old. It is interesting, and inspiring, to see how, after so much time on separate, parallel paths pursuing the same goals, technical knowledge and popular culture finally have the chance to meet. Two events held regularly in Ceará point in that direction. The first is the annual meeting of rain prophets in the municipality of Quixadá. At that meeting, which takes place on the second Saturday of January, prophets from the region publicly announce their prognoses for the rainy season.

What has caught my attention, ever since I began observing these meetings, is that many of the prophets present themselves as “researchers”, since, they say, their predictions rest on systematic observation of nature, not on metaphysical inspiration. For that reason, many of them are uncomfortable with the label of prophet. Moreover, their predictions commonly carry some degree of uncertainty about the climate, just as meteorological forecasts do.

The difference, obviously, lies in the language: among the prophets, uncertainty is frequently communicated through the message “only God knows everything”. In other words, it is clear from the rain prophets’ words and predictions that there is no longer a rejection of technical knowledge and of “urban” ways of thinking of the kind we find, for example, in Patativa do Assaré’s poems Cante lá que eu canto cá and Ao dotô do avião.

The second event, which occurs more often, is the meetings of the river basin committees, held all over Ceará and, thanks to the work of Dnocs, in several other regions of the Northeast. In the committees, technical staff and local community leaders discuss how the water of the state’s reservoirs is to be used and managed. Since there is no water without rain, there is no knowledge of water without knowledge of the weather. It is in the committees, then, that popular knowledge and technical knowledge interact, creating the possibility of genuine democratic participation.

This panorama sits within a global context in which the expression environmental citizenship is gaining new and important meanings. All the inhabitants of this planet now understand themselves as somehow tied to the global environmental question, whether as consumers or as affected parties. And they demand information about the environment, and a voice in how society and environment interact.

Here lies a real challenge, for the world and for Ceará. Unfortunately, it must be said, in Ceará and in the other states of the Northeast, even though popular participation is fundamental in the basin committees, there is a certain resistance to popular knowledge. The technical view of nature predominates, and other forms of knowledge often have neither voice nor turn. In the democratization of water in Brazil one step is missing: the inclusion of cultural diversity. For a true environmental democracy to be built, here and in other countries, every voice must have its turn.

RENZO TADDEI is an anthropologist and professor at the Universidade Federal do Rio de Janeiro (UFRJ)

>I know I know nothing; but at least I know that

>
Willem Buiter – FT’s Maverecon Blog

September 10, 2009 3:00pm

Science with very few (if any) data

Doing statistical analysis on a sample of size 1 is either a very frustrating or a very liberating exercise. The academic discipline known as history falls into that category. Most applied social science, including applied economics, does too. Applied economists use means fair and foul to try to escape from the reality that economics is not a discipline where controlled experiments are possible. The situation that an economically relevant problem can be studied by means of a control group and a treatment group that are identical as regards all but one external or exogenous driver, whose influence can as a result be isolated, identified and measured, does not arise in practice.

Time series analysis (which establishes associations or correlations between measurable variables over time), cross section analysis (which studies associations or correlations between variables observed during a common time period but differing by some other criterion (say location, individual or family identity or whatnot)), and panel data analysis, which combines time-series and cross-section data and methodology, all struggle (mostly in vain) with the problem that the economic analyst cannot control for all relevant influences on the behaviour of the phenomenon he is investigating, GDP growth or unemployment, say. The reason for this is first and foremost that the investigating economist doesn’t have a clue as to what should be on an exhaustive list of possible relevant drivers of GDP growth or unemployment. Second, many of the key variables he is aware of may not be measurable, or may be measurable only with serious errors. Expectations are an example. Finally, the long list of possible explanatory variables that are omitted from consideration is extremely unlikely to be statistically independent of the errors or residuals from a statistical or econometric analysis/estimation that uses just a truncated set of explanatory variables. The result: biased and inconsistent estimates of parameters and other features of the relationship between the explanandum and the explanans.
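The omitted-variables problem is easy to demonstrate by simulation. A minimal sketch (all variable names and coefficients invented for illustration): when an unmeasured driver is correlated with the included regressor, the ordinary-least-squares estimate is biased, and more data does not help.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Unmeasurable driver, e.g. "expectations" (hypothetical).
    z = rng.normal(size=n)

    # Observed regressor correlated with z; the outcome depends on both.
    x = 0.8 * z + rng.normal(size=n)
    y = 1.0 * x + 2.0 * z + rng.normal(size=n)

    # OLS of y on x alone. The slope converges to
    # 1 + 2 * cov(x, z) / var(x) = 1.98, not to the true effect of 1.
    X = np.column_stack([np.ones(n), x])
    slope = np.linalg.lstsq(X, y, rcond=None)[0][1]
    print(f"true effect of x: 1.00, OLS estimate omitting z: {slope:.2f}")

No diagnostic computed from x and y alone reveals the bias, which is the point: the residuals look innocent precisely because the omitted driver has been absorbed into them.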

Economists have made a growth industry of seeking out or concocting quasi-natural experiments that might look like controlled experiments in the natural sciences. Freakonomics: A Rogue Economist Explores the Hidden Side of Everything by Steven Levitt and Stephen J. Dubner contains a sampling of this kind of work.

I have not read a single one of these quasi-natural experiment studies where one could not easily come up with a long list of potentially relevant omitted explanatory variables. Such ‘unobserved heterogeneity’ means that other things were definitely not equal, and the attribution of cause and effect is bound to be flawed. In addition, the determined chase for yet another quasi-controlled or natural experiment has led many economists to look under the lamppost for their missing knowledge, not because that is where they lost it, but because that’s where the light is. A flood of cute but irrelevant studies of issues of no conceivable economic significance has been undertaken simply because a cute but irrelevant natural experiment had been conducted.

Experimental economics was the last refuge of the empirically desperate. It mostly involves paying a bunch of undergraduates (or, if you have a very small budget, graduates) to play a variety of simple games with monetary and occasionally non-monetary pay-offs for the participants. The actions of the players in these highly artificial settings – artificial if only because the players are aware they are the guinea pigs in some academic experiment – are meant to cast light on the behaviour of people in real-world situations subject to similar constraints and facing similar incentives. Halfway between proper experimental or laboratory economics and natural experiments are ‘constructed experiments’ (aka randomised experiments) in which an economist (or team of economists) conducts an experiment in a real-world setting and uses randomised evaluation methods to make inferences and test hypotheses. Typically, the guinea pigs in such randomised experiments are a selection of Indian villagers or other poor population groups – a little research grant goes a long way in a very poor environment. Again, the intentions are good, but the ceteris paribus assumption – that all other material influences on behaviour have either been controlled for or don’t distort the results of the study because they are independent of the unexplained component of behaviour in the constructed experiment – is (a) untestable and (b) generally implausible.
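What randomisation does, and does not, buy can also be sketched in a few lines (the village, the unobserved trait and every number below are invented): random assignment removes the bias from unobserved heterogeneity on average, but any single small experiment can still land far from the true effect.

    import numpy as np

    rng = np.random.default_rng(7)

    def one_trial(n=200):
        """One hypothetical randomised evaluation in a village of n people."""
        grit = rng.normal(size=n)        # unobserved heterogeneity
        treated = rng.random(n) < 0.5    # random assignment
        income = 1.0 * treated + 1.5 * grit + rng.normal(size=n)
        return income[treated].mean() - income[~treated].mean()

    estimates = np.array([one_trial() for _ in range(2000)])
    print(f"average over 2000 trials: {estimates.mean():.2f} (true effect: 1.00)")
    print(f"middle 90% of single trials: {np.percentile(estimates, 5):.2f}"
          f" to {np.percentile(estimates, 95):.2f}")

And note that the simulation can only confirm unbiasedness because the data-generating process is known, a luxury no field experiment enjoys.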

Of course, economics and the other social sciences are not alone in being bereft of meaningful controlled experiments. Two of the jewels in the ‘hard’ or natural science crown, cosmology and the theory of evolution, provide generous companionship for the social science waifs. Scientists at CERN may be able (wittingly or unwittingly) to create little bangs and mini black holes a few miles below ground in Switzerland and France, but this does not amount to a test of the big bang theory. They have no more than one dodgy observation on that. Evolutionary biologists may be able to observe evolution at work in real time ‘in the small’, that is, in microorganisms, butterflies etc. but they don’t have replications of the 4.5 billion year history of the earth, through a collection of parallel universes each of which differs from the others in one observable and measurable aspect only. This does not mean that anything goes. Finding fossils that can confidently be dated to be around 150 million years old makes rather a hash of strict young earth creationist accounts that date the creation of the universe to somewhere between 5,700 and 10,000 years ago. Other propositions, like intelligent design, cannot be proven or disproved and are therefore not scientific in nature.

So what is the poor economist to do when confronted with the need to make predictions or counterfactual analyses? Fundamentally, you pray and you punt.

How do we evaluate the reliability or quality of such forecasts and analyses? Ex-post, by matching them up against outcomes, in the case of forecasts. This is not straightforward – very few economists make completely unconditional forecasts, but it is in principle feasible. In the case of a counterfactual analysis where the counterfactual policy action or event did not take place – who knows? A necessary test of the acceptability of a counterfactual argument is its logical coherence – its internal consistency. That probably gets rid of about 90 percent of what is released into the public domain, but still leaves us with a lot of counterfactual propositions (and forecasts that cannot yet be tested against outcomes).
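For the forecasts that do eventually meet their outcomes, the ex-post matching is mechanical. A minimal sketch (every figure invented): score a forecaster's calls against what happened, alongside a naive "same as last year" benchmark; a forecaster who cannot beat the naive rule has added no ex-post value.

    import numpy as np

    # Hypothetical annual GDP growth outturns and year-ahead forecasts.
    outcome  = np.array([3.2, 2.8, 1.9, -0.4, -2.1, 1.5])
    forecast = np.array([3.0, 3.1, 2.5,  1.8,  0.9, 1.2])

    # Naive benchmark: predict last year's outturn; score from year 2 on.
    naive = outcome[:-1]

    def mae(pred, actual):
        """Mean absolute forecast error."""
        return float(np.mean(np.abs(pred - actual)))

    print(f"forecaster MAE: {mae(forecast[1:], outcome[1:]):.2f}")
    print(f"naive MAE:      {mae(naive, outcome[1:]):.2f}")

The counterfactual propositions, by contrast, never meet an outcome at all, which is why the logical-coherence test is the only filter available for them.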

For the counterfactual propositions and the forecasts beyond today that survive the test of logical coherence, all we have is the one data point of history to lend some plausibility.

What will be the likely shape of the recovery in the overdeveloped world?

What does history teach us about the likely shape of the recovery in the overdeveloped world following the bottoming out of global GDP? Is the financial collapse phase of the Great Depression a good guide? Probably not, because the banking sector and the financial system of the late 1920s and early 1930s were so different from those that went down the chute starting August 2007. A financial sector that is one layer deep, with banks funding themselves mainly from household deposits and investing mainly in loans to the non-financial business sector, is a very different animal from the multi-layered financial sector in the second half of the first decade of the 21st century.

The modern banking system is part of a multi-layered financial sector. It funds itself to a large degree in financial wholesale markets and has on the asset side of its balance sheet many financial instruments issued by other banks and by non-bank financial institutions, including off-balance-sheet vehicles of the banks. Rapid deleveraging of a 1930s-style single-layer financial system is bound to be massively disruptive for the real economy. Rapid deleveraging of a modern multi-layered financial system need not be massively disruptive for the real economy, although it will be if it is done in an uncoordinated, voluntary manner. Since most liabilities (assets) of banks are assets (liabilities) of other banks and non-bank financial institutions, an intelligent, coordinated netting or write-down of intra-financial sector assets and liabilities is technically possible without significant impact on the exposure (on both sides of the balance sheet) of the financial system as a whole to the non-financial sectors.
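A toy version of that netting, with three fictional banks and invented figures (real coordinated netting would be vastly more involved), shows the arithmetic: offsetting bilateral claims cancel, gross intra-financial exposure collapses, and each bank's net position, and hence the system's position vis-a-vis the non-financial sectors, is untouched.

    import numpy as np

    # claims[i, j]: what bank j owes bank i (all figures invented).
    claims = np.array([
        [ 0., 40., 10.],
        [30.,  0., 25.],
        [20.,  5.,  0.],
    ])

    # Bilateral netting: keep only the net of each offsetting pair of claims.
    net = np.clip(claims - claims.T, 0.0, None)

    # Each bank's net intra-financial position survives the netting intact...
    print("net positions before:", claims.sum(axis=1) - claims.sum(axis=0))
    print("net positions after: ", net.sum(axis=1) - net.sum(axis=0))

    # ...while the gross claims that must be rolled over collapse.
    print(f"gross intra-financial claims: {claims.sum():.0f} -> {net.sum():.0f}")

This is only the bookkeeping half of the argument; the legal and political obstacles the author turns to next are the hard part.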

There are many legal and political obstacles to such a de-layering of financial intermediation – it amounts to the temporary imposition of a form of central planning on the key banks and their financial sector counterparties, but it could be done if the political will were there.

This important difference between the situation of the 1930s and that facing us today makes one more optimistic about the pace of the recovery to come. Against that, much of the tentative insight I have gained about the financial crisis has not come from the lessons of the 1930s but from emerging markets crises since the 1970s. Except for the important qualifier that the US dollar is a global reserve currency, and that the US government (and private sector) has most of its domestic and external liabilities denominated in US dollars, the pathologies of financial boom, bubble and bust in the US, the UK, Iceland, Ireland and Spain (and many of the Central and East European emerging market economies) track those of classical emerging market crises in South America, Asia and CEE in the 1990s, rather well.

The emerging market analogy makes one less optimistic about a robust recovery, as typically, emerging markets whose financial sector was destroyed by a serious financial crisis took many years to recover their pre-crisis growth rates and often never recovered their pre-crisis GDP paths.

But clearly, there are many differences in economic structure, policy regimes and external environment between the overdeveloped world of today and either the industrialised world of the 1930s or the emerging markets of the 1980s and 1990s. For starters, we are now aware of what happened in the 1930s and in the emerging markets (the arrow of history flies only in one direction). Another key difference is that today’s emerging markets and developing countries, whose domestic financial sectors have not been destroyed by the financial crisis, add up to 50 percent of global GDP. Even if China itself cannot be a global locomotive (not even the little engine that could), a sufficient number of emerging markets jointly could lead the global economy out of recession. Controlling for all the things that make ceteris non paribus is a hopeless task. But just because it is hopeless does not mean that we can avoid it. Decisions have to be taken and choices have to be made on the basis of the best available information and analysis, even if the best is pretty crummy. It is, however, key that if it is indeed the case that the best is no good, we admit to this and don’t pretend otherwise. False certainty can be deadly.

Earlier this week, Joseph Stiglitz told Bloomberg that the U.S. economy faces a significant chance of contracting again after emerging from the worst recession since the Great Depression of the 1930s.

“There’s a significant chance of a W, but I don’t think it’s inevitable,” … . The economy “could just bounce along the bottom.” Sure, but how does he know whether it will be a V, a W or double dip, a rotated and reflected L, a triple jump or a quadruple bypass? With a sample of at most size one to underpin one’s forecast, the quality of the argument/analysis that produces and supports the forecast is essential. The forecast itself is indeed likely to be much less important than the reasoning behind it. If the conclusion is open-ended, the story had better be good.

>Forecasting: Relationship of trust is permanently damaged

>
By Chris Giles – Financial Times – October 5 2009

No one, not even the most gloomy economic forecaster, predicted the global economy’s sudden collapse a year ago. Those that said the crisis would get nasty can feel vindicated, but even they did not foresee the global crisis of confidence that followed the collapse of Lehman Brothers.

Arguably, they could not, says Bart van Ark, chief economist of the Conference Board, the global business organisation. “What you cannot forecast is a shock,” he says.

But in a less precise sense, the crisis of the past year has been the most predicted shock in history. For years, economists and officials have warned that many aspects of the world’s economic system were unsustainable.

“Even if nobody could be sure exactly when the timebomb was going to explode, it was at least useful to know we were sitting on one,” says Andrew Smith, chief economist of accountancy firm KPMG.

Most thought a crisis would arise from a loss of confidence in the dollar, resulting from the huge and prolonged US trade deficit. Instead, something similar arose after it was found that the capital flows to the US had been frittered away by banks lending money to unsuitable sub-prime borrowers and keeping those debts in their off-balance-sheet vehicles.

But though the risk of an economic crisis was regularly highlighted, it was impossible to incorporate into economic models and was therefore put to one side – filed in the “interesting and scary, but not very likely” box. That meant economic forecasts have had a terrible recent record.

Just over a year ago, in summer 2008, the International Monetary Fund thought the prospects for both 2008 and 2009 had improved and was forecasting world economic growth of 3.9 per cent in 2009 – far from a recession. The forecast was a reflection of the prevailing consensus and of fears of global inflation.

Then Lehman filed for bankruptcy and economists began a painful process of recognising the likelihood of recession in the advanced world, the spread of the pain to the entire global economy and that the economic crisis was even deeper than had been thought.

By November, the IMF had almost halved its 2009 forecast to 2.2 per cent. And by April this year the Fund was talking about a “slump”, with world output sliding in 2009 by 1.4 per cent, the first drop since the second world war.

By spring 2009, the consensus of private forecasters expected the worst economic performance in the world economy since the Great Depression and little recovery in 2010.

The ever-changing outlook has raised many questions for those making policy. And yet economists have never been so uncertain about the future. It depends crucially on the financial system and its interaction with confidence and trust among people around the world.

Willem Buiter, professor of European political economy at the London School of Economics, argues that it is a fallacy to think economic models, particularly those based on history, can hope to understand the fundamental relationships in a large and complex economy. “So what is the poor economist to do when confronted with the need to make predictions or counterfactual analyses? Fundamentally, you pray and you punt,” he says.

Economists and policymakers are doing just that, alongside telling everyone that their forecasts are dependent on specific events and highly uncertain. Users of forecasts now know they cannot simply take them on trust.

In future there is also likely to be a greater tendency for economists to use their own judgment rather than the outcome of many equations run through a large computer, says Mr van Ark. “Don’t just rely on your model,” he says, “and you want to combine the short-term outlook with a longer term perspective”.

Modesty among economists is also something that will be needed in future, a soft skill that comes with difficulty to many.

But with signs that the global economy is recovering and will grow next year, the world is already hanging on forecasters’ words, however unwise that might be.

Economists will still be stuck with the problem that the unpredictable will happen. And when it does, the vast majority will have much egg on their faces.

* * *

There are very interesting parallels between what economics is going through right now and the daily routine of meteorology. Both live off prognoses about the future.

Economists, who in the United States like to operate on psychology’s turf (and vice versa; although psychologists, as far as I know, do not issue public prognoses), are now feeling first-hand the dilemmas and pressures that until now only meteorologists felt.

And as for using experience and good judgment to complement (or replace) mathematical models (see the final paragraphs of the text above): when a meteorologist does that, he is criticized from all sides (see Gary Alan Fine’s book, Authors of the Storm).