Tag archive: science

Warming conference opens in South Africa with no climate for a deal (Folha de São Paulo)

Email 4393, November 28, 2011.

COP-17, which brings 190 countries together through the 10th, is not aiming for a new agreement.

It has become a cliché to say that climate conferences never achieve the desired goal. COP-17 (the 17th Conference of the Parties to the UN climate convention), which opens today under the sign of the economic crisis, should break that pattern: this time the goal itself has been watered down. The diplomats from 190 countries meeting from today until the 10th in Durban, South Africa, are no longer pursuing a global agreement against greenhouse gas emissions. What is at stake is whether the agreement that exists today, the feeble Kyoto Protocol, survives at all.

For Brazilian diplomacy, the meeting will have been a success if the developed nations agree to extend the protocol’s life until 2020. A failure in Durban, meanwhile, would bring an extra burden for Brazil, which will host the next UN environmental conference, Rio+20.

Kyoto, signed in 1997, required industrialized countries to cut their emissions by 5.2% relative to 1990 by 2012. As is well known, the US stayed out, and the agreement had a virtually null impact on the global concentration of greenhouse gases in the atmosphere, which grew 7% from 1997 to 2011.

There is no agreement on the kind of regime that could broaden the fight against carbon emissions after Kyoto expires. “If we let Kyoto die, the consensus is that no agreement of this kind will ever be reached again,” said ambassador André Corrêa do Lago, Brazil’s chief climate negotiator.

Even more useless – The problem is that there is also a consensus that an eventual second commitment period of Kyoto will be even more useless than the first for the convention’s overriding goal: preventing the planet from warming by more than 2°C. The US, the largest historical emitter, will never ratify Kyoto. And the emerging countries, today the planet’s biggest emitters, have no binding targets under the agreement.

Other industrialized countries with obligations under the agreement, such as Japan and Russia, have already announced that they will not take part in a second period: they say only that they will implement the voluntary emission-cut targets they committed to at the Copenhagen conference in 2009.

Corrêa do Lago concedes that this scenario leaves only the European Union and a few smaller countries inside Kyoto, together accounting for a mere 15% of world emissions. Without Kyoto, however, developing countries fear losing the differentiation that obliges rich countries (which polluted more in the past) to do more.

The developed countries, for their part, are pushing for a single agreement. Last week, Britain’s energy and climate change secretary, Chris Huhne, argued that a legally binding treaty covering the emerging economies as well should be concluded in 2015. Brazil – which has committed itself by law to cutting emissions by 2020 – does not close the door on such an agreement. But first the rich countries will have to deliver Kyoto.

Another impasse is likely to revolve around the money rich countries have promised to spend on fighting climate change in poor ones: US$30 billion between 2010 and 2012, and a Green Fund of US$100 billion a year from 2020. With the US debt crisis and the financial collapse in Europe – the main donors – talking about money for the climate is like mentioning rope in a hanged man’s house.

The crisis has led rich countries to raise doubts about what kind of funds should make up the Green Fund. The rich countries’ line now, Brazilian diplomats say, is that the fund’s money should be mostly private. “It was not the private sector that committed to the money, so it cannot be held to account for it,” said Brazilian diplomat André Odembreit.

Too late to curb warming, analysis says – While diplomats try to lift the international climate negotiations out of irrelevance, scientists warn that it is probably already too late to avoid dangerous climate change.

A report released last week by UNEP (the United Nations Environment Programme) suggests that in 2020 the planet will have, in the best case, 6 billion tonnes of CO₂ “left over” in the air relative to what would be needed to meet the goal of avoiding global warming of more than 2°C this century. To have a better than 66% chance of meeting that goal, greenhouse gas emissions would have to be limited to 44 billion tonnes of CO₂ in 2020.

Today they stand at 50 billion tonnes, and they will remain in that range only if every country strictly fulfils the most ambitious targets it said it could commit to under the 2009 Copenhagen Accord – the EU, for example, said it would cut its emissions by 30% rather than the 20% it pledged, but only if other countries raised their ambition.

If little is done – which looks like the most likely scenario given the current political context – emissions will reach 55 billion tonnes, and the “hole” to be closed to meet the target will be 9 billion rather than 6 billion tonnes of CO₂ in 2020. Even the most benign emissions trajectory puts the planet on course to warm by 2.5°C to 5°C by 2100.
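The report’s “gap” is a straight subtraction of the 44-billion-tonne ceiling from projected 2020 emissions. A minimal sketch of that arithmetic, using only the UNEP figures quoted above (the pathway label is mine, for illustration):

    # Emissions-gap arithmetic from the UNEP figures quoted above.
    # Values are billions of tonnes (Gt) of CO2 per year in 2020.
    TARGET = 44     # ceiling for a >66% chance of staying below 2C
    BEST_CASE = 50  # emissions if the most ambitious pledges are all kept

    gap = BEST_CASE - TARGET
    print(f"Best-case gap in 2020: {gap} Gt")  # 6 Gt of CO2 "left over"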

The UNEP report, titled “Bridging the Gap”, tries to convey a positive message: it says it is “technically possible and economically feasible” to close the 6-billion-tonne gap by 2020 by cutting emissions across several sectors.

The probability of that happening, however, is so small that not even the scientists who produced the document believe in it. “Right up until the eve of the study’s release, we were still divided over whether to send a hopeful or a pessimistic message,” Suzana Kahn Ribeiro, a professor at Coppe-UFRJ and one of the study’s coordinators, told Folha.

In the end, UNEP’s political need to adopt a strategy of hope prevailed, in order to push the negotiators in Durban toward a more ambitious outcome.

Can Climate Science Predict Extreme Weather? (Scientific American)

This year’s rash of severe weather is changing climate science. As policymakers call for better information, scientists are scrambling to understand the link between increasing emissions and natural disasters

By Joshua Zaffos and The Daily Climate  | November 2, 2011

Halloween Weekend Snow Paints a Ghostly Picture in the U.S. Northeast. Image: NASA Goddard Photo and Video

DENVER, Colo. – 2011 may well be remembered as the year of extreme weather in the United States, with drought in Texas, floods along the Mississippi River and a freak October snowstorm on the East Coast. Tornadoes alone would make the year memorable, with some 1,270 twisters causing 544 deaths and $25 billion in damages.

The outbreak is reshaping climate science, as researchers hone their abilities to predict severe weather and link the record-shattering destruction to humanity’s increasing emissions.

The goal: To provide better information to policymakers and local officials who must plan for and adapt to changes. “It’s a rapidly developing field,” said Peter Stott, head of climate monitoring and attribution at the Met Office, Britain’s national weather service.

Stott led one of the first studies attributing a single extreme weather event to climate change: the 2003 European heat wave, which killed 40,000 people and was the hottest summer on record since 1540. The study concluded that human influence more than doubled the event’s likelihood.

Last week, a study published in the journal Proceedings of the National Academy of Sciences concluded there was an 80 percent chance that the killer Russian heat wave of 2010 would not have happened without the added push of global warming.
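Results like these are commonly framed as a fraction of attributable risk (FAR), which compares the probability of an event in a modelled climate without human forcing against its probability in the forced climate we actually have. A minimal sketch of that calculation; the probabilities below are illustrative placeholders, not numbers from either study:

    def fraction_of_attributable_risk(p_natural: float, p_forced: float) -> float:
        """FAR = 1 - P0/P1, where P0 is the event's probability without
        human influence and P1 its probability in the forced climate."""
        return 1.0 - p_natural / p_forced

    # Placeholder numbers: if human forcing doubles a heat wave's
    # likelihood (P1 = 2 * P0), then FAR = 0.5, i.e. half the risk is
    # attributable -- the threshold the 2003 heat-wave study exceeded.
    print(fraction_of_attributable_risk(p_natural=0.05, p_forced=0.10))  # 0.5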

Now, Stott and other researchers are melding weather forecasting skills with pioneering computer models to attribute – or link – individual weather events to climate change. Understanding how climate change influences the weather is increasingly seen as key to predicting natural disasters, Stott said, and the new studies should help policymakers anticipate the conditions and trends associated with weather extremes. “There’s this very strong connection between attribution and prediction,” noted Stott, who spoke on these issues before colleagues last week at the World Climate Research Programme conference here in Denver.

The efforts are steering the next steps of the Intergovernmental Panel on Climate Change, the international body of scientists that reviews and assesses the vast pool of climate research for policymakers worldwide. The next IPCC assessment is due in 2013 – the fifth from the panel since 1990. For the first time it will include “predictions” – near-term and long-term climate forecasts based on actual conditions – instead of “projections” that simulate hypothetical scenarios and carbon emission rates.

The distinction is an advance for climate science and the IPCC, said Kevin Trenberth, who runs the Climate Analysis Section at the National Center for Atmospheric Research in Boulder, Colo.

Previous IPCC assessments have cited projections that are not grounded in current or historical conditions. Instead, policymakers are given “what-if” scenarios, such as how the future climate might react to different greenhouse-gas emissions over the coming decades. The results show changes between two assumed moments in time, Trenberth said, but they lack a starting point tied to observed data and, ultimately, are informed guesses of future carbon dioxide levels and their consequences.

Climate model predictions, on the other hand, are like weather forecasts. They start from a current or historical moment to analyze climate changes. By grabbing more measurements and using new techniques, advanced climate models reveal more clearly how the atmosphere responds to increased water moisture, warmer sea temperatures and melting sea ice, all impacts of increased carbon. Compared to projections, predictions allow scientists to offer near-term climate forecasts, which should help policymakers prepare for potential adaptations in the next few decades.
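The distinction can be made concrete with a toy model: a projection marches forward from an assumed baseline under a hypothetical scenario, while a prediction is initialized from an observed state, much like a weather forecast. A sketch with invented numbers (nothing here comes from a real climate model):

    import random

    YEARS = 10
    TREND = 0.02  # assumed warming per year; a placeholder, not model output

    def simulate(start_anomaly: float) -> list:
        """March a toy climate forward with a fixed trend plus noise."""
        series, x = [], start_anomaly
        for _ in range(YEARS):
            x += TREND + random.gauss(0, 0.05)
            series.append(x)
        return series

    # "Projection": starts from an assumed baseline (anomaly 0.0), so any
    # given year answers a what-if question about a hypothetical scenario.
    projection = simulate(start_anomaly=0.0)

    # "Prediction": initialized from an observed anomaly, so near-term
    # values are anchored to actual current conditions.
    prediction = simulate(start_anomaly=0.4)  # 0.4 = placeholder observation

    print(projection[0], prediction[0])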

The change from projections to predictions is made possible in part by a new generation of more powerful computer models. The last IPCC assessment report, published in 2007, made minor mention of feedbacks – environmental processes and interactions that can intensify extreme climate events, said Sonia Seneviratne, a professor at the Swiss Federal Institute of Technology in Zurich, who specializes in the topic. New data have since opened up scientists’ understanding of their importance.

Seneviratne is the lead author of an IPCC special report to be released later this month that focuses on climate change and extreme events. Research in this area has been “substantial and justifies a separate assessment,” she said, adding that it’s particularly a topic of interest for officials concerned with disaster preparedness and risk reduction.

The special report concludes that scientists are “virtually certain” the world will see more extremes in heat and that some places in the world will become “increasingly marginal as places to live,” according to the Associated Press, which obtained a draft. The draft also concludes there is at least a two-in-three chance that man-made global warming has already worsened weather extremes, according to the AP. The document is subject to change and needs approval from diplomats meeting in Uganda mid-month.

There are some caveats to these new climate predictions, however.

Writing in a scientific journal last year, Trenberth warned colleagues that the promise of more accurate representations of climate change will introduce new scientific uncertainties inherent to modeling more complex and realistic situations. Just as weather forecasting evokes its share of skepticism and doubt, climate predictions will likely represent a new communications challenge – and fodder for controversy and criticism – for climate scientists, said Trenberth.

“It’s about communication,” agreed Stott, who is the lead author of the 2013 IPCC report’s section on attribution and detection of climate change. “An understanding of where extreme weather fits into the longer-term picture of a changing climate helps people put this into context, and [whether] this is something that is going to become more common in the future and therefore we need to give more attention and be more prepared for these things.”

Stott and other scientists at a handful of modeling centers worldwide are focusing on the relation between climate change and extreme weather through a new initiative, Attribution of Climate-related Events. Stott says the project will move scientists further along toward forecasting extreme events and mapping the interactions with climate change.

“The goal is to be able to develop the tools and the skills, so we know when we can be confident and provide trustworthy assessments, and to do this in a timely fashion – in the immediate aftermath of a particular situation,” Stott said. “At the moment, we’re not really geared up for that. It’s very much research mode. We’ve hardly scratched the surface.”

This article originally appeared at The Daily Climate, the climate change news source published by Environmental Health Sciences, a nonprofit media company.

Joshua Zaffos is an independent journalist based in Fort Collins, Colo. His work has also appeared in High Country News, Miller-McCune, and Wired. DailyClimate.org is a foundation-funded news service that covers climate change.

The human cause of climate change: Where does the burden of proof lie? (Wiley)

Dr. Kevin Trenberth advocates reversing the ‘null hypothesis’

Public release date: 3-Nov-2011
Contact: Ben Norman
44-124-377-0375
Wiley-Blackwell

The debate may largely be drawn along political lines, but the human role in climate change remains one of the most controversial questions in 21st century science. Writing in WIREs Climate Change, Dr Kevin Trenberth, from the National Center for Atmospheric Research, argues that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research which seeks to disprove the human role.

In response to Trenberth’s argument, a second review, by Dr Judith Curry, focuses on the concept of a ‘null hypothesis’: the default position that is adopted when research is carried out. Currently the null hypothesis for climate change attribution research is that humans have no influence.

“Humans are changing our climate. There is no doubt whatsoever,” said Trenberth. “Questions remain as to the extent of our collective contribution, but it is clear that the effects are not small and have emerged from the noise of natural variability. So why does the science community continue to do attribution studies and assume that humans have no influence as a null hypothesis?”

To show precedent for his position Trenberth cites the 2007 report by the Intergovernmental Panel on Climate Change which states that global warming is “unequivocal”, and is “very likely” due to human activities.

Trenberth also took aim at climate attribution studies that claim to find no human component, suggesting that their assumptions distort results in the direction of finding no human influence, producing misleading statements about the causes of climate change that can grossly underestimate the role of humans in climate events.

“Scientists must challenge misconceptions in the difference between weather and climate while attribution studies must include a human component,” concluded Trenberth. “The question should no longer be is there a human component, but what is it?”

In a second paper Dr Judith Curry, from the Georgia Institute of Technology, questions this position, but argues that the discussion on the null hypothesis serves to highlight fuzziness surrounding the many hypotheses related to dangerous climate change.

“Regarding attribution studies, rather than trying to reject either hypothesis regardless of which is the null, there should be a debate over the significance of anthropogenic warming relative to forced and unforced natural climate variability,” said Curry.

Curry also suggested that the desire to reverse the null hypothesis may have the goal of seeking to marginalise the climate sceptic movement, a vocal group who have challenged the scientific orthodoxy on climate change.

“The proponents of reversing the null hypothesis should be careful of what they wish for,” concluded Curry. “One consequence may be that the scientific focus, and therefore funding, would also reverse to attempting to disprove dangerous anthropogenic climate change, which has been a position of many sceptics.”

“I doubt Trenberth’s suggestion will find much support in the scientific community,” said Professor Myles Allen from Oxford University, “but Curry’s counter proposal to abandon hypothesis tests is worse. We still have plenty of interesting hypotheses to test: did human influence on climate increase the risk of this event at all? Did it increase it by more than a factor of two?”
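For readers outside the field, the machinery being argued over is an ordinary significance test: under the conventional null hypothesis, one asks how often unforced natural variability alone would produce a trend as large as the one observed. A minimal Monte Carlo sketch of such a test; the AR(1) noise model and every number in it are synthetic placeholders, not climate data:

    import random

    def trend(series):
        """Ordinary least-squares slope of a series against time."""
        n = len(series)
        t_mean = (n - 1) / 2
        y_mean = sum(series) / n
        num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
        den = sum((t - t_mean) ** 2 for t in range(n))
        return num / den

    def natural_variability(n=50, phi=0.6, sigma=0.1):
        """Unforced variability modelled as AR(1) noise (an assumption)."""
        x, out = 0.0, []
        for _ in range(n):
            x = phi * x + random.gauss(0, sigma)
            out.append(x)
        return out

    OBSERVED_TREND = 0.02  # degrees per year; placeholder, not a real datum
    trials = 10000
    hits = sum(trend(natural_variability()) >= OBSERVED_TREND
               for _ in range(trials))
    print(f"p-value under the natural-variability null: {hits / trials:.4f}")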

###

All three papers are free online:

Trenberth, K., “Attribution of climate variations and trends to human influences and natural variability”: http://doi.wiley.com/10.1002/wcc.142

Curry, J., “Nullifying the climate null hypothesis”: http://doi.wiley.com/10.1002/wcc.141

Allen, M., “In defense of the traditional null hypothesis: remarks on the Trenberth and Curry opinion articles”: http://doi.wiley.com/10.1002/wcc.145

Mixed messages on climate ‘vulnerability’ (BBC)

13 November 2011 Last updated at 14:45 GMT

Cyclist in flood: there are concerns that climate change may exacerbate flooding in cities such as Bangkok

One of the most striking new voices on climate change that’s emerged since the UN summit in Copenhagen two years ago is the Climate Vulnerable Forum.

The grouping includes small island states vulnerable to extreme weather events and sea level rise, those with immense spans of low-lying coastline such as Vietnam and Bangladesh, and dry nations of East Africa.

It’s currently holding a meeting in Bangladesh, with UN Secretary-General Ban Ki-moon as the keynote speaker.

These countries feel vulnerable as a result of several types of projected climate impact.

In increasing order of suddenness, there are what you might call “steady-state” impacts such as rising sea levels; increased separation of weather into more concentrated wet periods and dry periods; and a greater occurrence of extreme weather events such as hurricanes, floods, heatwaves and droughts.

But what can science really tell us about these extremes?

While the vulnerable meet in Dhaka, the Intergovernmental Panel on Climate Change (IPCC) will be sitting down in Kampala to answer the question.

For almost a week, government delegates will pore over the summary of the IPCC’s latest report on extreme weather, with the lead scientific authors there as well. They’re scheduled to emerge on Friday with an agreed document.

The draft, which has found its way into my possession, contains a lot more unknowns than knowns.

On the one hand, it says it is “very likely” that the incidence of cold days and nights has gone down and the incidence of warm days and nights has risen globally.

And the human and financial toll of extreme weather events has risen.

Human hand fingered?

But when you get down to specifics, the academic consensus is far less certain.

Glacier, Alaska: enhanced glacier melt could speed up sea level rise in the coming decades

There is “low confidence” that tropical cyclones have become more frequent, “limited-to-medium evidence available” to assess whether climatic factors have changed the frequency of floods, and “low confidence” on a global scale even on whether the frequency has risen or fallen.

In terms of attribution of trends to rising greenhouse gas concentrations, the uncertainties continue.

While it is “likely” that anthropogenic influences are behind the changes in cold days and warm days, there is only “medium confidence” that they are behind changes in extreme rainfall events, and “low confidence” in attributing any changes in tropical cyclone activity to greenhouse gas emissions or anything else humanity has done.

(These terms have specific meanings in IPCC-speak, with “very likely” meaning 90-100% and “likely” 66-100%, for example.)

And for the future, the draft gives even less succour to those seeking here a new mandate for urgent action on greenhouse gas emissions, declaring: “Uncertainty in the sign of projected changes in climate extremes over the coming two to three decades is relatively large because climate change signals are expected to be relatively small compared to natural climate variability”.

It’s also explicit in laying out that the rise in impacts we’ve seen from extreme weather events cannot be laid at the door of greenhouse gas emissions: “Increasing exposure of people and economic assets is the major cause of the long-term changes in economic disaster losses (high confidence).

“Long-term trends in normalized economic disaster losses cannot be reliably attributed to natural or anthropogenic climate change.”

The succour only lasts for so long, however.

If the century progresses without restraints on greenhouse gas emissions, their impacts will come to dominate, it forecasts:

  • “It is very likely that the length, frequency and/or intensity of warm spells, including heat waves, will continue to increase over most land areas…
  • “It is likely that the frequency of heavy precipitation or the proportion of total rainfall from heavy falls will increase in the 21st Century over many areas of the globe…
  • “Mean tropical cyclone maximum wind speed is likely to increase…
  • “There is medium confidence that droughts will intensify in the 21st Century in some seasons and areas…
  • “Low-probability high-impact changes associated with the crossing of poorly understood thresholds cannot be excluded, given the transient and complex nature of the climate system.”

The draft report makes clear that lack of evidence or lack of confidence on a particular impact doesn’t mean it won’t occur; just that it’s hard to tell.

Climate a distraction?

It’s impossible to read the draft without coming away with the impression that with or without anthropogenic climate change, extreme weather impacts are going to be felt more and more, simply because there are more and more people on planet Earth – particularly in the swelling “megacities” of the developing world that overwhelmingly lie on the coast or on big rivers close to the coast.

President Nasheed of the Maldives has warned that climate change may mean the end of his nation

The current Bangkok floods are a case in point.

As UK academic Mike Hulme and others have argued, such events will occur whether exacerbated by climate change or not; and vulnerable societies need protection irrespective of climate change.

He’s argued for a divorce, therefore, between the issues of adaptation, which he says could usefully be added into the overall process of overseas development assistance, and mitigation of emissions.

It’s not proved to be a popular notion with developing world governments, which remain determined to tie the two together in the UN climate process.

Governments of vulnerable countries argue that as developed nations caused the climate change problem, they must compensate those that suffer its impacts with money above and beyond aid.

Developing countries like the fact that under the UN climate process, the rich are committed to funding adaptation for the poor.

Yet as the brief prepared for the Dhaka meeting by the humanitarian charity Dara shows, it isn’t happening anywhere near as fast as it ought to be.

Only 8% of the “fast-start finance” pledged in Copenhagen, it says, has actually found its way to recipients.

It’s possible – no, it’s “very likely” – that the IPCC draft will be amended as the week progresses, and presumably the governments represented at the Climate Vulnerable Forum will be asking their delegates to inject a greater sense of urgency.

Although there are sobering messages, they’re not for everyone.

The warnings that “some local areas will become increasingly marginal as places to live or in which to maintain livelihoods” under increased climate impacts, and that “for locations such as atolls, in some cases it is possible that many residents will have to relocate”, are, in their understated way, quite chilling.

But very few of the world’s seven billion live on atolls; so will this be enough to provide a wake-up call to other countries?

It’s also possible to argue that extreme weather isn’t really the issue for the small island developing states, or for those with long flat coastlines.

The big issue (which the IPCC is much more confident about) is sea level rise – slow, progressive, predictable; capable of being dealt with in some cases (think the Netherlands) provided the will and money are there.

But capable of wiping a country off the map if those two factors are absent.

This is one of the reasons why the Climate Vulnerable Forum established itself.

They felt that although both developed and developing nations understood vulnerability in theory, they didn’t get the message viscerally.

Whether they will by the end of the week when the IPCC releases the final version, I’m not so sure.

Anthropologists Consider a New Code of Ethics (The Chronicle of Higher Education)

November 20, 2011

By Dan Berrett

Today’s anthropologists are apt to work far away from the unspoiled villages that brought fame to the discipline’s early practitioners.

Instead, they might be in a hospital room observing patients, at a construction site gauging its archaeological significance, or in a corporate office examining organizational behavior, among other scenarios.

Those diverse contexts may explain why it has proved to be no easy job for anthropologists to create a new set of ethical guidelines. After three years spent seeking opinion and working on new guidelines, the American Anthropological Association is moving toward changes that some in the discipline fear will water down anthropologists’ obligations to the people they study.

“Dealing with ethics codes is complicated,” said David H. Price, a member of the committee charged with revising the guidelines. The word was echoed last week by fellow committee members at a panel on ethics at the association’s annual meeting here. Basic ethical principles might seem clear at the outset, but then point to different courses of action depending on the context, said Mr. Price, a professor at Saint Martin’s University, in Washington. “You can start with something simple, like ‘Do no harm,’” he said, and then find yourself hamstrung if those guidelines are written too specifically — or lost at sea if they are too vague.

One of the most notable changes in the proposed new code was to remove what many anthropologists call the “prime directive.”

The previous code, which dates to 1998 (though incremental changes have been made since then), told anthropologists that they “have primary ethical obligations to the people, species, and materials they study and to the people with whom they work.”

By many accounts, that directive has meant that an anthropologist’s obligation to his or her research subject can eclipse the goal of acquiring new knowledge. In other words, if research goes against the interests of subjects, then that research ought to be stopped.

The newer version, which the association’s executive board accepted for review at this year’s meeting but did not formally adopt, is more nuanced. It explains that the primary ethical obligation is “to avoid doing harm to the lives, communities, or environments” that anthropologists study.

The shift struck some as important. At other sessions during the annual meeting, several speakers and audience members said they held themselves to a different standard. It was not enough to keep from hurting their subjects. They should advocate for them.

The new code may do little to change that sense of obligation. It persists, in part, because of the assumption that an anthropologist is still that lone researcher closely observing a vulnerable tribe in a remote area, some on the committee said.

“That pure anthropology maybe never existed,” said Dena K. Plemmons, chair of the committee and a research ethicist at the University of California at San Diego. “Our subjects are tremendously diverse and we have diverse responsibilities.”

For example, Simon J. Craddock Lee, an assistant professor of medical anthropology at the University of Texas Southwestern Medical Center at Dallas, said his subjects are “well-paid cancer surgeons who give care to disenfranchised people.”

He has obligations to both groups, he said. “If my subjects are doctors, how do I balance my obligations to the people who are truly vulnerable?”

One audience member suggested that his chief loyalty should be to the person or group who is most at risk of harm among those being studied.

While that might seem straightforward, Mr. Lee replied, everyone—including the poor and vulnerable—has an agenda.

“We can’t assume there’s a David-and-Goliath relationship,” he said. “It’s not clean enough to say you can sort the good sheep from the goats.”

Ethics, or Politics?

The question of clandestine research offered another case in which a seemingly simple principle can become complicated when applied to field work. To some, discouraging clandestine research meant that an anthropologist should never deceive subjects and should always share his or her work publicly.

But Laura A. McNamara, an anthropologist who works for the U.S. Department of Energy’s Sandia National Laboratories, disagreed, saying that some anthropologists study classified information; they cannot make their findings public.

Even deceit can have its place, she added. Nancy Scheper-Hughes, a professor of medical anthropology at the University of California at Berkeley, for example, did research that exposed the organ-trafficking trade. Her work never would have been made public if she had believed that her primary obligation was to her subjects, who were, after all, organ traffickers.

The real problem, Ms. McNamara and her fellow committee members agreed, is not when research is clandestine, but when it is “compartmentalized,” which means a researcher may not know who is using or financing the research, or what the implications will be.

“There is no way you can communicate an informed perspective,” she said.

How anthropologists wield ethical guidelines also came up for scrutiny. Anthropologists push most fervently to revise their ethics when they disagree with the politics underlying controversial research, several speakers noted.

“We go to high Sturm und Drang” about ethics, Ms. McNamara said, when political objections arise about who is doing anthropological research for whom—especially when it’s for the government, corporations, or the rich and powerful. “Ethics becomes conflated with politics in ways that I find profoundly distressing,” she said.

Some anthropologists pushed to revise the ethics code in 2007, said Ms. Plemmons, when a controversy erupted over the Human Terrain System, a program that embedded anthropologists with United States military units. The association’s executive board disapproved of anthropologists’ involvement in the act of making war, calling it “an unacceptable application of anthropological expertise” which should, instead, serve “the humane causes of global peace and social justice.”

Education and Punishment

Committee members said they also heard from anthropologists who wanted an ethics code that could be enforced. That way, anthropologists who act badly could be punished or cast out of the discipline.

The association once held the power to adjudicate claims of ethical breaches, Mr. Price said. But when he reviewed records of the association’s work from that period, he saw that most claims involved what he called “sleaziness,” or cases in which professors harassed students or took credit for their research. While unethical, those breaches were not specific to anthropology and needed no separate code beyond those that already exist, he said.

Assuming responsibility for adjudicating ethical disputes presented another set of problems, said several speakers. It would mean a new mission and structure for the association, which would have to hire investigators to police wrongdoing and claim the power to credential who gets to call him- or herself an anthropologist. Many times, such complaints can be handled through an institutional review board or a university.

The association has seen first-hand how difficult such investigations can be. In 2001 and 2002, it probed claims of wrongdoing and ethical malpractice against anthropologists and geneticists in the Amazon in the 1960s. The association later published a report finding fault with some of the scholars’ conduct in what became known as the Darkness in El Dorado controversy (after a journalist’s account by that name), only to rescind its own report in 2005.

Besides, the ethics committee surveyed members and learned that most anthropologists are not all that interested in using ethical guidelines as a means to punish each other. What most anthropologists wanted, they said, was some form of general guidance, an educational tool to train future anthropologists.

World headed for irreversible climate change in five years, IEA warns (The Guardian)

If fossil fuel infrastructure is not rapidly changed, the world will ‘lose for ever’ the chance to avoid dangerous climate change

Fiona Harvey, environment correspondent
guardian.co.uk, Wednesday 9 November 2011 10.01 GMT

Any fossil fuel infrastructure built in the next five years will cause irreversible climate change, according to the IEA. Photograph: Rex Features

The world is likely to build so many fossil-fuelled power stations, energy-guzzling factories and inefficient buildings in the next five years that it will become impossible to hold global warming to safe levels, and the last chance of combating dangerous climate change will be “lost for ever”, according to the most thorough analysis yet of world energy infrastructure.

Anything built from now on that produces carbon will do so for decades, and this “lock-in” effect will be the single factor most likely to produce irreversible climate change, the world’s foremost authority on energy economics has found. If this is not rapidly changed within the next five years, the results are likely to be disastrous.

“The door is closing,” Fatih Birol, chief economist at the International Energy Agency, said. “I am very worried – if we don’t change direction now on how we use energy, we will end up beyond what scientists tell us is the minimum [for safety]. The door will be closed forever.”

If the world is to stay below 2C of warming, which scientists regard as the limit of safety, then the concentration of carbon dioxide in the atmosphere must be held to no more than 450 parts per million (ppm); the level is currently around 390ppm. But the world’s existing infrastructure is already producing 80% of that “carbon budget”, according to the IEA’s analysis, published on Wednesday. This gives an ever-narrowing gap in which to reform the global economy on to a low-carbon footing.

If current trends continue, and we go on building high-carbon energy generation, then by 2015 at least 90% of the available “carbon budget” will be swallowed up by our energy and industrial infrastructure. By 2017, there will be no room for manoeuvre at all – the whole of the carbon budget will be spoken for, according to the IEA’s calculations.
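The IEA milestones quoted above reduce to a simple accounting exercise. A minimal sketch that interpolates between them (the linear interpolation is my own assumption, for illustration only):

    # Share of the 450-ppm "carbon budget" committed by existing and
    # planned infrastructure, from the IEA milestones quoted above:
    # 80% today (2011), at least 90% by 2015, 100% by 2017.
    MILESTONES = {2011: 0.80, 2015: 0.90, 2017: 1.00}

    def committed_share(year):
        """Linear interpolation between the quoted milestone years."""
        years = sorted(MILESTONES)
        if year <= years[0]:
            return MILESTONES[years[0]]
        if year >= years[-1]:
            return MILESTONES[years[-1]]
        for y0, y1 in zip(years, years[1:]):
            if y0 <= year <= y1:
                f = (year - y0) / (y1 - y0)
                return MILESTONES[y0] + f * (MILESTONES[y1] - MILESTONES[y0])

    for y in range(2011, 2018):
        print(y, f"{committed_share(y):.0%}")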

Birol’s warning comes at a crucial moment in international negotiations on climate change, as governments gear up for the next fortnight of talks in Durban, South Africa, from late November. “If we do not have an international agreement, whose effect is put in place by 2017, then the door to [holding temperatures to 2C of warming] will be closed forever,” said Birol.

But world governments are preparing to postpone a speedy conclusion to the negotiations again. Originally, the aim was to agree a successor to the 1997 Kyoto protocol, the only binding international agreement on emissions, after its current provisions expire in 2012. But after years of setbacks, an increasing number of countries – including the UK, Japan and Russia – now favour postponing the talks for several years.

Both Russia and Japan have spoken in recent weeks of aiming for an agreement in 2018 or 2020, and the UK has supported this move. Greg Barker, the UK’s climate change minister, told a meeting: “We need China, the US especially, the rest of the Basic countries [Brazil, South Africa, India and China] to agree. If we can get this by 2015 we could have an agreement ready to click in by 2020.” Birol said this would clearly be too late. “I think it’s very important to have a sense of urgency – our analysis shows [what happens] if you do not change investment patterns, which can only happen as a result of an international agreement.”

Nor is this a problem of the developing world, as some commentators have sought to frame it. In the UK, Europe and the US, there are multiple plans for new fossil-fuelled power stations that would contribute significantly to global emissions over the coming decades.

The Guardian revealed in May an IEA analysis that found emissions had risen by a record amount in 2010, despite the worst recession for 80 years. Last year, a record 30.6 gigatonnes (Gt) of carbon dioxide poured into the atmosphere from burning fossil fuels, a rise of 1.6Gt on the previous year. At the time, Birol told the Guardian that constraining global warming to moderate levels would be “only a nice utopia” unless drastic action was taken.

The new research adds to that finding, by showing in detail how current choices on building new energy and industrial infrastructure are likely to commit the world to much higher emissions for the next few decades, blowing apart hopes of containing the problem to manageable levels. The IEA’s data is regarded as the gold standard in emissions and energy, and its outlook is considered one of the most conservative – making the warning all the more stark.

The central problem is that most industrial infrastructure currently in existence – the fossil-fuelled power stations, the emissions-spewing factories, the inefficient transport and buildings – is already contributing to the high level of emissions, and will do so for decades. Carbon dioxide, once released, stays in the atmosphere and continues to have a warming effect for about a century, and industrial infrastructure is built to have a useful life of several decades.

Yet, despite intensifying warnings from scientists over the past two decades, the new infrastructure even now being built is constructed along the same lines as the old, which means that there is a “lock-in” effect – high-carbon infrastructure built today or in the next five years will contribute as much to the stock of emissions in the atmosphere as previous generations.

The “lock-in” effect is the single most important factor increasing the danger of runaway climate change, according to the IEA in its annual World Energy Outlook, published on Wednesday.

Climate scientists estimate that global warming of 2C above pre-industrial levels marks the limit of safety, beyond which climate change becomes catastrophic and irreversible. Though such estimates are necessarily imprecise, warming of as little as 1.5C could cause dangerous rises in sea levels and a higher risk of extreme weather – the limit of 2C is now inscribed in international accords, including the partial agreement signed at Copenhagen in 2009, by which the biggest developed and developing countries for the first time agreed to curb their greenhouse gas output.

Another factor likely to increase emissions is the decision by some governments to abandon nuclear energy, following the Fukushima disaster. “The shift away from nuclear worsens the situation,” said Birol. If countries turn away from nuclear energy, the result could be an increase in emissions equivalent to the current emissions of Germany and France combined. Much more investment in renewable energy will be required to make up the gap, but how that would come about is unclear at present.

Birol also warned that China – the world’s biggest emitter – would have to take on a much greater role in combating climate change. For years, Chinese officials have argued that because the country’s emissions per capita were much lower than those of developed countries, it was not required to take such stringent action on emissions. But the IEA’s analysis found that within about four years, China’s per capita emissions were likely to exceed those of the EU.

In addition, by 2035 at the latest, China’s cumulative emissions since 1900 are likely to exceed those of the EU, which will further weaken Beijing’s argument that developed countries should take on more of the burden of emissions reduction as they carry more of the responsibility for past emissions.

In a recent interview with the Guardian, China’s top climate change official, Xie Zhenhua, called on developing countries to take a greater part in the talks, while insisting that developed countries must sign up to a continuation of the Kyoto protocol – something only the European Union is willing to do. His words were greeted cautiously by other participants in the talks.

Continuing its gloomy outlook, the IEA report said: “There are few signs that the urgently needed change in direction in global energy trends is under way. Although the recovery in the world economy since 2009 has been uneven, and future economic prospects remain uncertain, global primary energy demand rebounded by a remarkable 5% in 2010, pushing CO2 emissions to a new high. Subsidies that encourage wasteful consumption of fossil fuels jumped to over $400bn (£250.7bn).”

Meanwhile, an “unacceptably high” number of people – about 1.3bn – still lack access to electricity. If people are to be lifted out of poverty, this must be solved – but providing people with renewable forms of energy generation is still expensive.

Charlie Kronick of Greenpeace said: “The decisions being made by politicians today risk passing a monumental carbon debt to the next generation, one for which they will pay a very heavy price. What’s seriously lacking is a global plan and the political leverage to enact it. Governments have a chance to begin to turn this around when they meet in Durban later this month for the next round of global climate talks.”

One close observer of the climate talks said the $400bn subsidies devoted to fossil fuels, uncovered by the IEA, were “staggering”, and the way in which these subsidies distort the market presented a massive problem in encouraging the move to renewables. He added that Birol’s comments, though urgent and timely, were unlikely to galvanise China and the US – the world’s two biggest emitters – into action on the international stage.

“The US can’t move (owing to Republican opposition) and there’s no upside for China domestically in doing so. At least China is moving up the learning curve with its deployment of renewables, but it’s doing so in parallel to the hugely damaging coal-fired assets that it is unlikely to ever want to turn off in order to meet climate targets in years to come.”

Energy demand (chart). Source: IEA

Christiana Figueres, the UN climate chief, said the findings underlined the urgency of the climate problem, but stressed the progress made in recent years. “This is not the scenario we wanted,” she said. “But making an agreement is not easy. What we are looking at is not an international environment agreement — what we are looking at is nothing other than the biggest industrial and energy revolution that has ever been seen.”

Where Did Global Warming Go? (N.Y. Times)

By ELISABETH ROSENTHAL
Published: October 15, 2011


IN 2008, both the Democratic and Republican candidates for president, Barack Obama and John McCain, warned about man-made global warming and supported legislation to curb emissions. After he was elected, President Obama promised “a new chapter in America’s leadership on climate change,” and arrived cavalry-like at the 2009 United Nations Climate Conference in Copenhagen to broker a global pact.

But two years later, now that nearly every other nation accepts climate change as a pressing problem, America has turned agnostic on the issue.

In the crowded Republican presidential field, most seem to agree with Gov. Rick Perry of Texas that “the science is not settled” on man-made global warming, as he said in a debate last month. Alone among Republicans onstage that night, Jon M. Huntsman Jr. said that he trusted scientists’ view that the problem was real. At the moment, he has the backing of about 2 percent of likely Republican voters.

Though the evidence of climate change has, if anything, solidified, Mr. Obama now talks about “green jobs” mostly as a strategy for improving the economy, not the planet. He did not mention climate in his last State of the Union address. Meanwhile, the administration is fighting to exempt United States airlines from Europe’s new plan to charge them for CO2 emissions when they land on the continent. It also seems poised to approve a nearly 2,000-mile-long pipeline, from Canada down through the United States, that will carry a kind of oil whose extraction puts relatively high levels of emissions into the atmosphere.

“In Washington, ‘climate change’ has become a lightning rod, it’s a four-letter word,” said Andrew J. Hoffman, director of the University of Michigan’s Erb Institute for Sustainable Development.

Across the nation, too, belief in man-made global warming, and passion about doing something to arrest climate change, is not what it was five years or so ago, when Al Gore’s movie had buzz and Elizabeth Kolbert’s book about climate change, “Field Notes From a Catastrophe,” was a best seller. The number of Americans who believe the earth is warming dropped to 59 percent last year from 79 percent in 2006, according to polling by the Pew Research Center. When the British polling firm Ipsos MORI asked Americans this past summer to list their three most pressing environmental worries, “global warming/climate change” garnered only 27 percent, behind even “overpopulation.”

This fading of global warming from the political agenda is a mostly American phenomenon. True, public enthusiasm for legislation to tackle climate change has flagged somewhat throughout the developed world since the recession of 2008. Nonetheless, in many other countries, legislation to control emissions has rolled out apace. Just last Wednesday, Australia’s House of Representatives passed a carbon tax, which is expected to easily clear the country’s Senate. Europe’s six-year-old carbon emissions trading system continues its yearly expansion. In 2010, India passed a carbon tax on coal. Even China’s newest five-year plan contains a limited pilot cap-and-trade system, under which polluters pay for excess pollution.

The United States is the “one significant outlier” on responding to climate change, according to a recent global research report produced by HSBC, the London-based bank. John Ashton, Britain’s special representative for climate change, said in an interview that “in the U.K., in Europe, in most places I travel to” — but not in the United States — “the starting point for conversation is that this is real, there are clear and present dangers, so let’s get a move on and respond.” After watching the Republican candidates express skepticism about global warming in early September, former President Bill Clinton put it more bluntly, “I mean, it makes us — we look like a joke, right?”

Americans — who produce twice the emissions per capita that Europeans do — are in many ways wired to be holdouts. We prefer bigger cars and bigger homes. We value personal freedom, are suspicious of scientists, and tend to distrust the kind of sweeping government intervention required to confront rising greenhouse gas emissions.

“Climate change presents numerous ideological challenges to our culture and our beliefs,” Professor Hoffman of the Erb Institute says. “People say, ‘Wait a second, this is really going to affect how we live!’ ”

There are, of course, other factors that hardened resistance: America’s powerful fossil-fuel industry, whose profits are bound to be affected by any greater control of carbon emissions; a cold American winter in 2010 that made global warming seem less imminent; and a deep recession that made taxes on energy harder to talk about, and job creation a more pressing issue than the environment — as can be seen in the debate over the pipeline from Canada.

But it is also true that Europe has endured a deep recession and has had mild winters. What’s more, some of the loudest climate deniers are English. Yet the European Union is largely on target to meet its goal of reducing emissions by at least 20 percent over 1990 levels by 2020.

Connie Hedegaard, the European Union’s commissioner on climate action, told me recently: “Look, it was not a piece of cake here either.”

In fact, many countries in Europe have come to see combating climate change and the move to a “greener” economy as about “opportunities rather than costs,” Mr. Ashton said. In Britain, the low-carbon manufacturing sector has been one of the few to grow through the economic slump.

“One thing I’ve been pleasantly surprised about in the E.U. is that despite the economic and financial crisis, the momentum on climate change has more or less continued,” Mr. Ashton said.

And Conservatives, rather than posing an obstacle, are directing aggressive climate policies in much of the world. Before becoming the European Union’s commissioner for climate action, Ms. Hedegaard was a well-known Conservative politician in her native Denmark. In Britain, where a 2008 law required deep cuts in emissions, a coalition Conservative government is now championing a Green Deal.

In the United States, the right wing of the Republican Party has managed to turn skepticism about man-made global warming into a requirement for electability, forming an unlikely triad with antiabortion and gun-rights beliefs. In findings from a Pew poll this spring, 75 percent of staunch conservatives, 63 percent of libertarians and 55 percent of Main Street Republicans said there was no solid evidence of global warming.

“This has become a partisan political issue here in a way it has not elsewhere,” said Andrew Kohut, president of the Pew Research Center. “We are seeing doubts in the U.S. largely because the issue has become a partisan one, with Democrats” — 75 percent of whom say they believe there is strong evidence of climate change — “seeing one thing and Republicans another.”

Europeans understand the challenges in the United States, though they sound increasingly impatient. “We are very much aware of the political situation in the United States and we don’t say ‘do this,’ when we know it can’t get through Congress,” said Ms. Hedegaard, when she was in New York for the United Nations General Assembly last month. But she added:

“O.K. if you can’t commit today, when can you? When are you willing to join in? Australia is making a cap-and-trade system. South Korea is introducing one. New Zealand and the E.U. have it already. So when is the time? That’s the question for the U.S.”

MEANWHILE, in the developing world, emerging economies like India and China are now pursuing aggressive climate policies. “Two years ago the assumption was that the developed world would have to lead, but now China, India and Brazil have jumped in with enthusiasm, and are moving ahead,” said Nick Robins of HSBC Global Research.

Buffeted by two years of treacherous weather that they are less able to handle than richer nations — from floods in India to water shortages in China — developing countries are feeling vulnerable. Scientists agree that extreme weather events will be more severe and frequent on a warming planet, and insurance companies have already documented an increase.

So perhaps it is no surprise that regard for climate change as “a very serious problem” has risen significantly in many developing nations over the past two years. A 2010 Pew survey showed that more than 70 percent of people in China, India and South Korea were willing to pay more for energy in order to address climate change. The number in the United States was 38 percent. China’s 12th five-year plan, for 2011-2015, directs intensive investment to low carbon industries. In contrast, in the United States, there is “no prospect of moving ahead” at a national legislative level, Mr. Robins said, although some state governments are addressing the issue.

In private, scientific advisers to Mr. Obama say he and his administration remain committed to confronting climate change and global warming. But Robert E. O’Connor, program director for decision, risk and management sciences at the National Science Foundation in Washington, said a bolder leader would emphasize real risks that, apparently, now feel distant to many Americans. “If it’s such an important issue, why isn’t he talking about it?”

Elisabeth Rosenthal is a reporter and blogger on environmental issues for The New York Times.

Challenges of the “data tsunami” (FAPESP)

Launched by the Microsoft Research-FAPESP Institute for IT Research, the book O Quarto Paradigma (The Fourth Paradigm) discusses the challenges of eScience, a new field dedicated to handling the immense volume of information that characterizes today’s science

November 7, 2011

By Fábio de Castro

Agência FAPESP – If a few years ago the lack of data limited the advance of science, today the problem has been inverted. The development of new data-capture technologies, in the most varied fields and at the most varied scales, has generated such an immense volume of information that the excess has become a bottleneck for scientific progress.

In this context, computer scientists have been joining specialists from different fields to develop new concepts and theories capable of handling the flood of data in contemporary science. The result is called eScience.

That is the subject of O Quarto Paradigma – Descobertas científicas na era da eScience (the Brazilian edition of The Fourth Paradigm: Data-Intensive Scientific Discovery), launched on November 3 by the Microsoft Research-FAPESP Institute for IT Research.

Edited by Tony Hey, Stewart Tansley and Kristin Tolle – all of Microsoft Research – the book was launched at FAPESP’s headquarters, at an event attended by the Foundation’s scientific director, Carlos Henrique de Brito Cruz.

During the launch, Roberto Marcondes Cesar Jr., of the Institute of Mathematics and Statistics (IME) of the University of São Paulo (USP), gave the talk “eScience in Brazil”. “The Fourth Paradigm: data-intensive computing advancing scientific discovery” was the subject of the talk by Daniel Fay, director of Earth, Energy and Environment at MSR.

Brito Cruz highlighted FAPESP’s interest in stimulating the development of eScience in Brazil. “FAPESP is closely connected to this idea, because many of our projects and programs need more capacity to manage large data sets. Our great challenge lies in the science behind this capacity to deal with large volumes of data,” he said.

Initiatives such as the FAPESP Research Program on Global Climate Change (PFPMCG), BIOTA-FAPESP and the FAPESP Bioenergy Research Program (BIOEN) are examples of programs with a pressing need to integrate and process immense volumes of data.

“We know that science advances when new instruments become available. On the other hand, scientists do not usually perceive the computer as a great new instrument that revolutionizes science. FAPESP is interested in actions that make the scientific community aware of the great challenges in eScience,” said Brito Cruz.

The book is a collection of 26 technical essays divided into four sections: “Earth and environment”, “Health and wellbeing”, “Scientific infrastructure” and “Scholarly communication”.

“The book is about the emergence of a new paradigm for scientific discovery. Thousands of years ago, the prevailing paradigm was experimental science, founded on the description of natural phenomena. A few hundred years ago came the paradigm of theoretical science, symbolized by Newton’s laws. A few decades ago computational science arose, simulating complex phenomena. Now we arrive at the fourth paradigm: data-driven science,” said Fay.

Com o advento do novo paradigma, segundo ele, houve uma mudança completa na natureza da descoberta científica. Entraram em cena modelos complexos, com amplas escalas espaciais e temporais, que exigem cada vez mais interações multidisciplinares.

“Os dados, em quantidade incrível, são provenientes de diferentes fontes e precisam também de abordagem multidisciplinar e, muitas vezes, de tratamento em tempo real. As comunidades científicas também estão mais distribuídas. Tudo isso transformou a maneira como se fazem descobertas”, disse Fay.

A ecologia, uma das áreas altamente afetadas pelos grandes volumes de dados, é um exemplo de como o avanço da ciência, cada vez mais, dependerá da colaboração entre pesquisadores acadêmicos e especialistas em computação.

“Vivemos em uma tempestade de sensoriamento remoto, sensores terrestres baratos e acesso a dados na internet. Mas extrair as variáveis que a ciência requer dessa massa de dados heterogêneos continua sendo um problema. É preciso ter conhecimento especializado sobre algoritmos, formatos de arquivos e limpeza de dados, por exemplo, que nem sempre é acessível para o pessoal da área de ecologia”, explicou.

O mesmo ocorre em áreas como medicina e biologia – que se beneficiam de novas tecnologias, por exemplo, em registros de atividade cerebral, ou de sequenciamento de DNA – ou a astronomia e física, à medida que os modernos telescópios capturam terabytes de informação diariamente e o Grande Colisor de Hádrons (LHC) gera petabytes de dados a cada ano.

Instituto Virtual

Segundo Cesar Jr., a comunidade envolvida com eScience no Brasil está crescendo. O país tem 2.167 cursos de sistemas de informação ou engenharia e ciências da computação. Em 2009, houve 45 mil formados nessas áreas e a pós-graduação, entre 2007 e 2009, tinha 32 cursos, mil orientadores, 2.705 mestrandos e 410 doutorandos.

“A ciência mudou do paradigma da aquisição de dados para o da análise de dados. Temos diferentes tecnologias que produzem terabytes em diversos campos do conhecimento e, hoje, podemos dizer que essas áreas têm foco na análise de um dilúvio de dados”, disse o membro da Coordenação da Área de Ciência e Engenharia da Computação da FAPESP.

Em 2006, a Sociedade Brasileira de Computação (SBC) organizou um encontro a fim de identificar os problemas-chave e os principais desafios para a área. Isso levou a diferentes propostas para que o Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) criasse um programa específico para esse tipo de problema.

“Em 2009, realizamos uma série de workshops na FAPESP, reunindo, para discutir essa questão, cientistas de áreas como agricultura, mudanças climáticas, medicina, transcriptômica, games, governo eletrônico e redes sociais. A iniciativa resultou em excelentes colaborações entre grupos de cientistas com problemas semelhantes e originou diversas iniciativas”, disse César Jr.

As chamadas do Instituto Microsoft Research-FAPESP de Pesquisas em TI, segundo ele, têm sido parte importante do conjunto de iniciativas para promover a eScience, assim como a organização da Escola São Paulo de Ciência Avançada em Processamento e Visualização de Imagens Computacionais. Além disso, a FAPESP tem apoiado diversos projetos de pesquisa ligados ao tema.

“A comunidade de eScience em São Paulo tem trabalhado com profissionais de diversas áreas e publicado em revistas de várias delas. Isso é indicação de qualidade adquirida pela comunidade para encarar o grande desafio que teremos nos próximos anos”, disse César Jr., que assina o prefácio da edição brasileira do livro.

  • O Quarto Paradigma
    Organizadores: Tony Hey, Stewart Tansley e Kristin Tolle
    Lançamento: 2011
    Preço: R$ 60
    Páginas: 263
    Mais informações: www.ofitexto.com.br

Scientists Find Evidence of Ancient Megadrought in Southwestern U.S. (Science Daily)

ScienceDaily (Nov. 6, 2011) — A new study at the University of Arizona’s Laboratory of Tree-Ring Research has revealed a previously unknown multi-decade drought in the second century A.D. The findings provide evidence that extended periods of aridity have recurred throughout our past.

A cross section of wood shows the annual growth rings trees add with each growing season. Dark bands of latewood form the boundary between each ring and the next. Counting backwards from the bark reveals a tree’s age. (Credit: Photo by Daniel Griffin/Laboratory of Tree-Ring Research)

Almost 900 years ago, in the mid-12th century, the southwestern U.S. was in the middle of a multi-decade megadrought. It was the most recent extended period of severe drought known for this region. But it was not the first.

The second century A.D. saw an extended dry period of more than 100 years characterized by a multi-decade drought lasting nearly 50 years, says a new study from scientists at the University of Arizona.

UA geoscientists Cody Routson, Connie Woodhouse and Jonathan Overpeck conducted a study of the southern San Juan Mountains in south-central Colorado. The region serves as a primary drainage site for the Rio Grande and San Juan rivers.

“These mountains are very important for both the San Juan River and the Rio Grande River,” said Routson, a doctoral candidate in the environmental studies laboratory of the UA’s department of geosciences and the primary author of the study, which is upcoming in Geophysical Research Letters.

The San Juan River is a tributary of the Colorado River, meaning any climate changes that affect the San Juan drainage would also likely affect the Colorado River and its watershed. Said Routson: “We wanted to develop as long a record as possible for that region.”

Dendrochronology is the science of using trees’ annual growth rings to reconstruct past climate with precision. Because trees add a usually well-defined growth ring around their trunk each year, counting the rings backwards from the bark allows scientists to determine not only the age of the tree, but also which years were good for growth and which years were more difficult.

“If it’s a wet year, they grow a wide ring, and if it’s a dry year, they grow a narrow ring,” said Routson. “If you average that pattern across trees in a region you can develop a chronology that shows what years were drier or wetter for that particular region.”

Darker wood, referred to as latewood because it develops in the latter part of the year at the end of the growing season, forms a usually distinct boundary between one ring and the next. The latewood is darker because growth at the end of the growing season has slowed and the cells are more compact.

To develop their chronology, the researchers looked for indications of climate in the past in the growth rings of the oldest trees in the southern San Juan region. “We drove around and looked for old trees,” said Routson.

Few living things are older than bristlecone pines. Among the oldest and longest-lived species on the planet, these pines are normally found clinging to the bare rocky landscapes of alpine or near-alpine mountain slopes. The trees, the oldest of which are more than 4,000 years old, are capable of withstanding extreme drought conditions.

“We did a lot of hiking and found a couple of sites of bristlecone pines, and one in particular that we homed in on,” said Routson.

To sample the trees without damaging them, the dendrochronologists used a tool like a metal screw that bores a tiny hole in the trunk of the tree and allows them to extract a sample, called a core. “We take a piece of wood about the size and shape of a pencil from the tree,” explained Routson.

“We also sampled dead wood that was lying about the land. We took our samples back to the lab where we used a visual, graphic technique to match where the annual growth patterns of the living trees overlap with the patterns in the dead wood. Once we have the pattern matched we measure the rings and average these values to generate a site chronology.”
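
Routson’s description amounts to a small algorithm: normalize each cross-dated ring-width series, then average the overlapping years into one index per year. Below is a minimal sketch in Python; the data layout, the names and the plain-mean standardization are illustrative assumptions rather than the laboratory’s actual pipeline (real chronologies also detrend each series for tree age and geometry):

    from collections import defaultdict

    def build_chronology(cores):
        """Average ring-width indices across cross-dated cores.

        cores: list of dicts mapping calendar year -> ring width (mm),
        each series already cross-dated against the living-tree pattern.
        Each core is first divided by its own mean width so that slow-
        and fast-growing trees contribute equally to the site average.
        """
        by_year = defaultdict(list)
        for core in cores:
            mean_width = sum(core.values()) / len(core)
            for year, width in core.items():
                by_year[year].append(width / mean_width)  # dimensionless index
        # One averaged index per year; values below 1.0 mark
        # narrower-than-average rings, i.e. drier years.
        return {year: sum(v) / len(v) for year, v in sorted(by_year.items())}

    # Toy usage: a living core and an overlapping dead-wood core.
    living = {2009: 0.8, 2010: 1.2, 2011: 1.0}
    dead = {2009: 0.6, 2010: 1.0}
    print(build_chronology([living, dead]))

In a record built this way, the 50 consecutive below-average years Routson describes next would show up as a long run of indices under 1.0.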

“In our chronology for the south San Juan mountains we created a record that extends back 2,200 years,” said Routson. “It was pretty profound that we were able to get back that far.”

The chronology extends many years earlier than the medieval period, during which two major drought events in that region already were known from previous chronologies.

“The medieval period extends roughly from 800 to 1300 A.D.,” said Routson. “During that period there was a lot of evidence from previous studies for increased aridity, in particular two major droughts: one in the middle of the 12th century, and one at the end of the 13th century.”

“Very few records are long enough to assess the global conditions associated with these two periods of Southwestern aridity,” said Routson. “And the available records have uncertainties.”

But the chronology from the San Juan bristlecone pines showed something completely new:

“There was another period of increased aridity even earlier,” said Routson. “This new record shows that in addition to known droughts from the medieval period, there is also evidence for an earlier megadrought during the second century A.D.”

“What we can see from our record is that it was a period of basically 50 consecutive years of below-average growth,” said Routson. “And that’s within a much broader period that extends from around 124 A.D. to 210 A.D. — about a 100-year-long period of dry conditions.”

“We’re showing that there are multiple extreme drought events that happened during our past in this region,” said Routson. “These megadroughts lasted for decades, which is much longer than our current drought. And the climatic events behind these previous dry periods are really similar to what we’re experiencing today.”

The prolonged drought in the 12th century and the newly discovered event in the second century A.D. may both have been influenced by warmer-than-average Northern Hemisphere temperatures, Routson said: “The limited records indicate there may have been similar La Niña-like background conditions in the tropical Pacific Ocean, which are known to influence modern drought, during the two periods.”

Although natural climate variation has led to extended dry periods in the southwestern U.S. in the past, there is reason to believe that human-driven climate change will increase the frequency of extreme droughts in the future, said Routson. In other words, we should expect similar multi-decade droughts in a future predicted to be even warmer than the past.

Routson’s research is funded by fellowships from the National Science Foundation, the Science Foundation Arizona and the Climate Assessment of the Southwest. His advisors, Woodhouse of the School of Geography and Development and Overpeck of the department of geosciences and co-director of the UA’s Institute of the Environment, are co-authors of the study.

The Human Cause of Climate Change: Where Does the Burden of Proof Lie? (Science Daily)

ScienceDaily (Nov. 3, 2011) — The debate may largely be drawn along political lines, but the human role in climate change remains one of the most controversial questions in 21st century science. Writing in WIREs Climate Change, Dr Kevin Trenberth, from the National Center for Atmospheric Research, argues that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research that seeks to disprove the human role.

Polar bear on melting ice. Experts argue that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research which seeks to disprove the human role. (Credit: iStockphoto/Kristian Septimius Krogh)

In response to Trenberth’s argument, a second review, by Dr Judith Curry, focuses on the concept of a ‘null hypothesis’, the default position adopted when research is carried out. Currently the null hypothesis for climate change attribution research is that humans have no influence.

“Humans are changing our climate. There is no doubt whatsoever,” said Trenberth. “Questions remain as to the extent of our collective contribution, but it is clear that the effects are not small and have emerged from the noise of natural variability. So why does the science community continue to do attribution studies and assume that humans have no influence as a null hypothesis?”

To show precedent for his position Trenberth cites the 2007 report by the Intergovernmental Panel on Climate Change which states that global warming is “unequivocal,” and is “very likely” due to human activities.

Trenberth also examined climate attribution studies that claim to find no human component, suggesting that their assumptions bias results toward finding no human influence and so produce misleading statements about the causes of climate change that grossly underestimate the human role in climate events.

“Scientists must challenge misconceptions in the difference between weather and climate while attribution studies must include a human component,” concluded Trenberth. “The question should no longer be is there a human component, but what is it?”

In a second paper Dr Judith Curry, from the Georgia Institute of Technology, questions this position, but argues that the discussion on the null hypothesis serves to highlight fuzziness surrounding the many hypotheses related to dangerous climate change.

“Regarding attribution studies, rather than trying to reject either hypothesis regardless of which is the null, there should be a debate over the significance of anthropogenic warming relative to forced and unforced natural climate variability,” said Curry.

Curry also suggested that the desire to reverse the null hypothesis may have the goal of seeking to marginalise the climate sceptic movement, a vocal group who have challenged the scientific orthodoxy on climate change.

“The proponents of reversing the null hypothesis should be careful of what they wish for,” concluded Curry. “One consequence may be that the scientific focus, and therefore funding, would also reverse to attempting to disprove dangerous anthropogenic climate change, which has been a position of many sceptics.”

“I doubt Trenberth’s suggestion will find much support in the scientific community,” said Professor Myles Allen from Oxford University, “but Curry’s counter proposal to abandon hypothesis tests is worse. We still have plenty of interesting hypotheses to test: did human influence on climate increase the risk of this event at all? Did it increase it by more than a factor of two?”

Minister attends inauguration of meteorological radar in Ceará (Ascom, government of Ceará)

JC e-mail 4378, November 4, 2011.

Aloizio Mercadante and the governor of Ceará, Cid Gomes, inaugurated the S-band meteorological radar in Quixeramobim (CE). The equipment will help forecast droughts and floods.

Forecasts of droughts and floods, climate change and all weather-related events will now be reported by the Ceará Foundation for Meteorology and Water Resources (Funceme) with greater precision, since the agency now has a new piece of equipment for capturing that information.

The new S-band meteorological radar was inaugurated on Thursday (3) by governor Cid Gomes and the minister of Science and Technology, Aloizio Mercadante. Located on Morro de Santa Maria, in Quixeramobim, in the Sertão Central region, the equipment will operate as part of the Ceará Radar Network (RCR), integrated with the X-band Doppler radar installed in Fortaleza. “It looks like a simple piece of equipment, but behind it lies an unimaginable usefulness. Technology can be an ally in improving the population’s quality of life, which is our commitment,” said Cid Gomes during the inauguration.

As the governor explained, the new equipment can report very specific weather conditions, for example “that five millimeters of rain fell in the municipality of Nova Olinda, in Cariri”. “When information like this is combined with other data, it will help diagnose, for example, a period of drought or flooding. We are a state with nearly 300,000 small farmers, and they need concrete information to look after their harvests. For that, the radar will be very useful,” Cid Gomes stressed.

The S-band radar can estimate precipitation within a radius of 200 kilometers. It can also monitor weather systems operating within a range of up to 400 kilometers. Thanks to its capacity and location, it will provide information not only on Ceará but also on several other northeastern states. “This is an agricultural planning instrument that will also benefit many states of the Northeast, such as Paraíba, Pernambuco, Piauí and Rio Grande do Norte,” noted Aloizio Mercadante. The minister also stressed its importance for the prevention of natural disasters, such as long droughts or well-above-average rainfall. “We need to understand why these events happen and guard against the effects that climate change can cause,” Mercadante explained.

R$ 14 million was invested in the installation of the S-band meteorological radar: R$ 10 million from the federal government, through the Ministry of Science, Technology and Innovation (MCTI), and R$ 4 million from the state government of Ceará. Of the total, R$ 12 million went to purchasing the equipment and the remainder (R$ 2 million) to improving access to the site (road building) and its power supply.

As the state secretary of Science and Technology, René Barreira, recalled, the radar’s installation originated in a budget amendment by Ciro Gomes during his time as a federal deputy, which, together with the sympathetic support of former president Lula, made the project possible. “With this important piece of equipment we will have agricultural zoning and more effective, technical monitoring of high-risk events,” the secretary said.

Fraud Case Seen as a Red Flag for Psychology Research (N.Y. Times)

By BENEDICT CAREY

Published: November 2, 2011

A well-known psychologist in the Netherlands whose work has been published widely in professional journals falsified data and made up entire experiments, an investigating committee has found. Experts say the case exposes deep flaws in the way science is done in a field, psychology, that has only recently earned a fragile respectability.

The psychologist Diederik Stapel in an undated photograph. “I have failed as a scientist and researcher,” he said in a statement after a committee found problems in dozens of his papers. (Credit: Joris Buijs/Pve)

The psychologist, Diederik Stapel, of Tilburg University, committed academic fraud in “several dozen” published papers, many accepted in respected journals and reported in the news media, according to a report released on Monday by the three Dutch institutions where he has worked: the University of Groningen, the University of Amsterdam, and Tilburg. The journal Science, which published one of Dr. Stapel’s papers in April, posted an “editorial expression of concern” about the research online on Tuesday.

The scandal, involving about a decade of work, is the latest in a string of embarrassments in a field that critics and statisticians say badly needs to overhaul how it treats research results. In recent years, psychologists have reported a raft of findings on race biases, brain imaging and even extrasensory perception that have not stood up to scrutiny. Outright fraud may be rare, these experts say, but they contend that Dr. Stapel took advantage of a system that allows researchers to operate in near secrecy and massage data to find what they want to find, without much fear of being challenged.

“The big problem is that the culture is such that researchers spin their work in a way that tells a prettier story than what they really found,” said Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “It’s almost like everyone is on steroids, and to compete you have to take steroids as well.”

In a prolific career, Dr. Stapel published papers on the effect of power on hypocrisy, on racial stereotyping and on how advertisements affect how people view themselves. Many of his findings appeared in newspapers around the world, including The New York Times, which reported in December on his study about advertising and identity.

In a statement posted Monday on Tilburg University’s Web site, Dr. Stapel apologized to his colleagues. “I have failed as a scientist and researcher,” it read, in part. “I feel ashamed for it and have great regret.”

More than a dozen doctoral theses that he oversaw are also questionable, the investigators concluded, after interviewing former students, co-authors and colleagues. Dr. Stapel has published about 150 papers, many of which, like the advertising study, seem devised to make a splash in the media. The study published in Science this year claimed that white people became more likely to “stereotype and discriminate” against black people when they were in a messy environment, versus an organized one. Another study, published in 2009, claimed that people judged job applicants as more competent if they had a male voice. The investigating committee did not post a list of papers that it had found fraudulent.

Dr. Stapel was able to operate for so long, the committee said, in large measure because he was “lord of the data,” the only person who saw the experimental evidence that had been gathered (or fabricated). This is a widespread problem in psychology, said Jelte M. Wicherts, a psychologist at the University of Amsterdam. In a recent survey, two-thirds of Dutch research psychologists said they did not make their raw data available for other researchers to see. “This is in violation of ethical rules established in the field,” Dr. Wicherts said.

In a survey of more than 2,000 American psychologists scheduled to be published this year, Leslie John of Harvard Business School and two colleagues found that 70 percent had acknowledged, anonymously, to cutting some corners in reporting data. About a third said they had reported an unexpected finding as predicted from the start, and about 1 percent admitted to falsifying data.

Also common is a self-serving statistical sloppiness. In an analysis published this year, Dr. Wicherts and Marjan Bakker, also at the University of Amsterdam, searched a random sample of 281 psychology papers for statistical errors. They found that about half of the papers in high-end journals contained some statistical error, and that about 15 percent of all papers had at least one error that changed a reported finding — almost always in opposition to the authors’ hypothesis.

The American Psychological Association, the field’s largest and most influential publisher of results, “is very concerned about scientific ethics and having only reliable and valid research findings within the literature,” said Kim I. Mills, a spokeswoman. “We will move to retract any invalid research as such articles are clearly identified.”

Researchers in psychology are certainly aware of the issue. In recent years, some have mocked studies showing correlations between activity on brain images and personality measures as “voodoo” science, and a controversy over statistics erupted in January after The Journal of Personality and Social Psychology accepted a paper purporting to show evidence of extrasensory perception. In cases like these, the authors being challenged are often reluctant to share their raw data. But an analysis of 49 studies appearing Wednesday in the journal PLoS One, by Dr. Wicherts, Dr. Bakker and Dylan Molenaar, found that the more reluctant that scientists were to share their data, the more likely that evidence contradicted their reported findings.

“We know the general tendency of humans to draw the conclusions they want to draw — there’s a different threshold,” said Joseph P. Simmons, a psychologist at the University of Pennsylvania’s Wharton School. “With findings we want to see, we ask, ‘Can I believe this?’ With those we don’t, we ask, ‘Must I believe this?’ ”

But reviewers working for psychology journals rarely take this into account in any rigorous way. Neither do they typically ask to see the original data. While many psychologists shade and spin, Dr. Stapel went ahead and drew any conclusion he wanted.

“We have the technology to share data and publish our initial hypotheses, and now’s the time,” Dr. Schooler said. “It would clean up the field’s act in a very big way.”

Mathematically Detecting Stock Market Bubbles Before They Burst (Science Daily)

ScienceDaily (Oct. 31, 2011) — From the dotcom bust in the late nineties to the housing crash in the run-up to the 2008 crisis, financial bubbles have been a topic of major concern. Identifying bubbles is important in order to prevent collapses that can severely impact nations and economies.

A paper published this month in the SIAM Journal on Financial Mathematics addresses just this issue. Opening fittingly with a quote from New York Federal Reserve President William Dudley emphasizing the importance of developing tools to identify and address bubbles in real time, authors Robert Jarrow, Younes Kchia, and Philip Protter propose a mathematical model to detect financial bubbles.

A financial bubble occurs when prices for assets, such as stocks, rise far above their actual value. Such an economic cycle is usually characterized by rapid expansion followed by a contraction, or sharp decline in prices.

“It has been hard not to notice that financial bubbles play an important role in our economy, and speculation as to whether a given risky asset is undergoing bubble pricing has approached the level of an armchair sport. But bubbles can have real and often negative consequences,” explains Protter, who has spent many years studying and analyzing financial markets.

“The ability to tell when an asset is or is not in a bubble could have important ramifications in the regulation of the capital reserves of banks as well as for individual investors and retirement funds holding assets for the long term. For banks, if their capital reserve holdings include large investments with unrealistic values due to bubbles, a shock to the bank could occur when the bubbles burst, potentially causing a run on the bank, as infamously happened with Lehman Brothers, and is currently happening with Dexia, a major European bank,” he goes on to explain, citing the significance of such inflated prices.

Using sophisticated mathematical methods, Protter and his co-authors answer the question of whether the price increase of a particular asset represents a bubble in real time. “[In this paper] we show that by using tick data and some statistical techniques, one is able to tell with a large degree of certainty, whether or not a given financial asset (or group of assets) is undergoing bubble pricing,” says Protter.

This question is answered by estimating an asset’s price volatility, which is stochastic, or randomly determined. The authors define the asset’s price process in terms of a standard stochastic differential equation driven by Brownian motion. Brownian motion, based on the natural phenomenon of the erratic, random movement of small particles suspended in gas or liquid, is widely used in mathematical finance; it specifically models situations in which the next change in the value of a variable is independent of past changes.
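
Written out, as a minimal formalization of the setup just described (the paper’s full specification may differ, for example by including a drift term), the model takes the price $S_t$ to satisfy

$$ dS_t = \sigma(S_t)\, dW_t, $$

where $W_t$ is a standard Brownian motion and $\sigma(\cdot)$ is the price-dependent volatility function to be estimated from the data.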

The key characteristic in determining a bubble is the volatility of an asset’s price, which, in the case of bubbles, is very high. The authors estimate the volatility by applying state-of-the-art estimators to real-time tick price data for a given stock. They then obtain the best possible extension of this estimated volatility function to large values of the price using reproducing kernel Hilbert spaces (RKHS), a widely used method for statistical learning.

“First, one uses tick price data to estimate the volatility of the asset in question for various levels of the asset’s price,” Protter explains. “Then, a special technique (RKHS with an optimization addition) is employed to extrapolate this estimated volatility function to large values for the asset’s price, where this information is not (and cannot be) available from tick data. Using this extrapolation, one can check the rate of increase of the volatility function as the asset price gets arbitrarily large. Whether or not there is a bubble depends on how fast this increase occurs (its asymptotic rate of increase).”
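
That “asymptotic rate of increase” can be made precise. As a hedged statement of the test (this is the strict-local-martingale criterion from the mathematical finance literature the paper builds on; the paper’s exact regularity conditions are omitted here), the asset is undergoing bubble pricing exactly when the extrapolated volatility grows fast enough that

$$ \int_{\alpha}^{\infty} \frac{x}{\sigma^{2}(x)}\,dx < \infty \quad \text{for some } \alpha > 0. $$

If, for instance, $\sigma(x)$ behaves like $c\,x^{\beta}$ for large $x$, the integral converges precisely when $\beta > 1$, so volatility growing faster than linearly in the price is the signature of a bubble.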

If it does not increase fast enough, there is no bubble within the model’s framework.
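
The whole pipeline can be compressed into a toy version. The sketch below, in Python, substitutes a plain power-law fit on a log-log scale for the paper’s RKHS extrapolation step (a deliberate simplification, named as such), and the binned volatility estimator, the simulated data and the beta > 1 threshold are illustrative assumptions rather than the authors’ implementation:

    import numpy as np

    def bubble_test(prices, n_bins=20):
        """Crude bubble check on a high-frequency price series.

        Estimates sigma(x) by binning squared price increments by price
        level, then fits sigma(x) ~ c * x**beta by least squares in
        log-log coordinates. Under the power-law reading of the
        convergence criterion above, beta > 1 is bubble-like.
        """
        prices = np.asarray(prices, dtype=float)
        x, dx = prices[:-1], np.diff(prices)
        edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
        levels, sigmas = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (x >= lo) & (x < hi)
            if mask.sum() > 5:
                levels.append(x[mask].mean())
                sigmas.append(np.sqrt(np.mean(dx[mask] ** 2)))  # local volatility proxy
        beta, _ = np.polyfit(np.log(levels), np.log(sigmas), 1)
        return beta, beta > 1.0

    # Toy usage: simulate dS = 0.5 * S**1.5 dW, a volatility growing
    # faster than linearly in the price, so the test should flag it.
    rng = np.random.default_rng(0)
    p = [1.0]
    for _ in range(20000):
        p.append(abs(p[-1] + 0.5 * p[-1] ** 1.5 * 0.01 * rng.standard_normal()))
    beta, flagged = bubble_test(p)
    print(f"estimated beta = {beta:.2f}, bubble-like: {flagged}")

On this simulated series the fitted exponent should come out near the 1.5 used to generate it; real tick data would need the paper’s more careful estimation and extrapolation machinery.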

The authors test their methodology by applying the model to several stocks from the dot-com bubble of the nineties. They find fairly successful rates in their predictions, with higher accuracies in cases where market volatilities can be modeled more efficiently. This helps establish the strengths and weaknesses of the method.

The authors have also used the model to test more recent price increases to detect bubbles. “We have found, for example, that the IPO [initial public offering] of LinkedIn underwent bubble pricing at its debut, and that the recent rise in gold prices was not a bubble, according to our models,” Protter says.

It is encouraging to see that mathematical analysis can play a role in the diagnosis and detection of bubbles, which have significantly impacted economic upheavals in the past few decades.

Robert Jarrow is a professor at the Johnson Graduate School of Management at Cornell University in Ithaca, NY, and managing director of the Kamakura Corporation. Younes Kchia is a graduate student at Ecole Polytechnique in Paris, and Philip Protter is a professor in the Statistics Department at Columbia University in New York.

Professor Protter’s work was supported in part by NSF grant DMS-0906995.

The future of science lies in collaboration (Valor Econômico)

JC e-mail 4376, November 1, 2011.

Text by Michael Nielsen published in The Wall Street Journal and reprinted by Valor Econômico.

In January 2009, a University of Cambridge mathematician named Tim Gowers decided to use his blog to run an unusual social experiment. He chose a difficult mathematical problem and tried to solve it in the open, using the blog to post his ideas and his progress. He invited everyone to contribute ideas, in the hope that many minds together would be more powerful than one. He called the experiment the Polymath Project.

Fifteen minutes after Gowers opened the blog for discussion, a Hungarian-Canadian mathematician posted a comment. Fifteen minutes later, an American high-school mathematics teacher joined the conversation. Three minutes after that, the mathematician Terence Tao, of the University of California, Los Angeles, commented as well. The discussion caught fire, and in just six weeks the problem was solved.

Although other challenges arose, and the network’s collaborators did not always find every solution, they managed to create a new approach to solving problems. Their work is one example of the experiments in collaborative science under way to study everything from galaxies to dinosaurs.

These projects use the internet as a cognitive tool to amplify collective intelligence. Such tools are a means of connecting the right people to the right problems at the right time, activating knowledge that would otherwise remain latent.

Networked collaboration has the potential to accelerate extraordinarily the rate of discovery in science as a whole. We are likely to see a more fundamental change in scientific research in the coming decades than occurred in the past three centuries.

But there are big obstacles to reaching that goal. Although it might seem natural for scientists to adopt these new tools of discovery, they have in fact shown a surprising reluctance. Initiatives like the Polymath Project remain the exception, not the rule.

Consider the simple idea of sharing scientific data online. The best example is the Human Genome Project, whose data can be downloaded by anyone. When you read in the news that a certain gene has been linked to a disease, it is almost certainly a discovery made possible by the project’s policy of opening its data.

Despite the enormous value of releasing data openly, most laboratories make no systematic effort to share their information with other scientists. As one biologist told me, he had been “sitting on the genome” of an entire new species for more than a year. An entire species! Imagine the crucial discoveries other scientists might have made if that genome had been uploaded to an open database.

Why don’t scientists like to share?

If you are a scientist seeking a job or research funding, the biggest factor in your success will be the number of scientific publications you have produced. If your record is brilliant, you will do well. If it is not, you will have problems. So you devote your working days to producing papers for academic journals.

Even if you personally believe it would be far better for science as a whole if you organized and shared your data on the internet, that is time taken away from the “real” work of writing papers. Sharing data is not something your colleagues will give you credit for, except in a few fields.

There are other areas in which scientists still lag in the use of online tools. One example is the “wikis” created by brave pioneers on subjects such as quantum computing, string theory and genetics (a wiki allows the collaborative sharing and editing of a set of interlinked information; Wikipedia is the best-known example).

Specialized wikis can serve as up-to-date reference works on a field’s latest research, like textbooks that evolve at high speed. They can include descriptions of important unsolved scientific problems, and they can serve as tools for finding solutions.

But most of these wikis have not taken off. They suffer from the same problem as data sharing: even if scientists believe in the value of collaboration, they know that writing a single mediocre paper will do far more for their careers. The incentives are completely wrong.

For networked science to reach its potential, scientists must embrace and reward the open sharing of all scientific knowledge, not just what is published in traditional academic journals. Networked science must be open.

Michael Nielsen is a pioneer of quantum computing and the author of “Reinventing Discovery: The New Era of Networked Science,” from which this essay was adapted.

The world at seven billion (BBC)

27 October 2011 Last updated at 23:08 GMT

File photograph of newborn babies in Lucknow, India, in July 2009

As the world population reaches seven billion people, the BBC’s Mike Gallagher asks whether efforts to control population have been, as some critics claim, a form of authoritarian control over the world’s poorest citizens.

The temperature is some 30°C, the humidity stifling, the noise unbearable. In a yard between two enormous tea-drying sheds, a number of dark-skinned women sit patiently, each accompanied by an unwieldy-looking cloth sack. They are clad in colourful saris, but look tired and shabby. This is hardly surprising – they have spent most of the day in nearby plantation fields, picking tea that will net them around two cents a kilo – barely enough to feed their large families.

Vivek Baid thinks he knows how to help them. He runs the Mission for Population Control, a project in eastern India which aims to bring down high birth rates by encouraging local women to get sterilised after their second child.

As the world reaches an estimated seven billion people, people like Vivek say efforts to bring down the world’s population must continue if life on Earth is to be sustainable, and if poverty and even mass starvation are to be avoided.

There is no doubting their good intentions. Vivek, for instance, has spent his own money on the project, and is passionate about creating a brighter future for India.

But critics allege that campaigners like Vivek – a successful and wealthy male businessman – have tended to live very different lives from those they seek to help, who are mainly poor women.

These critics argue that rich people have imposed population control on the poor for decades. And, they say, such coercive attempts to control the world’s population often backfired and were sometimes harmful.

Population scare

Most historians of modern population control trace its roots back to the Reverend Thomas Malthus, an English clergyman born in the 18th Century who believed that humans would always reproduce faster than Earth’s capacity to feed them.

Giving succour to the resulting desperate masses would only imperil everyone else, he said. So the brutal reality was that it was better to let them starve.

‘Plenty is changed into scarcity’

Thomas Malthus

From Thomas Malthus’ Essay on Population, 1803 edition:

A man who is born into a world already possessed – if he cannot get subsistence from his parents on whom he has a just demand, and if the society do not want his labour, has no claim of right to the smallest portion of food.

At nature’s mighty feast there is no vacant cover for him. She tells him to be gone, and will quickly execute her own orders, if he does not work upon the compassion of some of her guests. If these guests get up and make room for him, other intruders immediately appear demanding the same favour. The plenty that before reigned is changed into scarcity; and the happiness of the guests is destroyed by the spectacle of misery and dependence in every part of the hall.

Rapid agricultural advances in the 19th Century proved his main premise wrong, because food production generally more than kept pace with the growing population.

But the idea that the rich are threatened by the desperately poor has cast a long shadow into the 20th Century.

From the 1960s, the World Bank, the UN and a host of independent American philanthropic foundations, such as the Ford and Rockefeller foundations, began to focus on what they saw as the problem of burgeoning Third World numbers.

They believed that overpopulation was the primary cause of environmental degradation, economic underdevelopment and political instability.

Massive populations in the Third World were seen as presenting a threat to Western capitalism and access to resources, says Professor Betsy Hartmann of Hampshire College, Massachusetts, in the US.

“The view of the south is very much put in this Malthusian framework. It becomes just this powerful ideology,” she says.

In 1966, President Lyndon Johnson warned that the US might be overwhelmed by desperate masses, and he made US foreign aid dependent on countries adopting family planning programmes.

Other wealthy countries such as Japan, Sweden and the UK also began to devote large amounts of money to reducing Third World birth rates.

‘Unmet need’

What virtually everyone agreed was that there was a massive demand for birth control among the world’s poorest people, and that if they could get their hands on reliable contraceptives, runaway population growth might be stopped.

But with the benefit of hindsight, some argue that this so-called unmet need theory put disproportionate emphasis on birth control and ignored other serious needs.

Graph of world population figures

“It was a top-down solution,” says Mohan Rao, a doctor and public health expert at Delhi’s Jawaharlal Nehru University.

“There was an unmet need for contraceptive services, of course. But there was also an unmet need for health services and all kinds of other services which did not get attention. The focus became contraception.”

Had the demographic experts worked at the grass-roots instead of imposing solutions from above, suggests Adrienne Germain, formerly of the Ford Foundation and then the International Women’s Health Coalition, they might have achieved a better picture of the dilemmas facing women in poor, rural communities.

“Not to have a full set of health services meant women were either unable to use family planning, or unwilling to – because they could still expect half their kids to die by the age of five,” she says.

India’s sterilisation ‘madness’

File photograph of Sanjay and Indira Gandhi in 1980

Indira Gandhi and her son Sanjay (above) presided over a mass sterilisation campaign. From the mid-1970s, Indian officials were set sterilisation quotas, and sought to ingratiate themselves with superiors by exceeding them. Stories abounded of men being accosted in the street and taken away for the operation. The head of the World Bank, Robert McNamara, congratulated the Indian government on “moving effectively” to deal with high birth rates. Funding was increased, and the sterilising went on.

In Delhi, some 700,000 slum dwellers were forcibly evicted, and given replacement housing plots far from the city centre, frequently on condition that they were either sterilised or produced someone else for the operation. In poorer agricultural areas, whole villages were rounded up for sterilisation. When residents of one village protested, an official is said to have threatened air strikes in retaliation.

“There was a certain madness,” recalls Nina Puri of the Family Planning Association of India. “All rationality was lost.”

Us and them

In 1968, the American biologist Paul Ehrlich caused a stir with his bestselling book, The Population Bomb, which suggested that it was already too late to save some countries from the dire effects of overpopulation, which would result in ecological disaster and the deaths of hundreds of millions of people in the 1970s.

Instead, governments should concentrate on drastically reducing population growth. He said financial assistance should be given only to those nations with a realistic chance of bringing birth rates down. Compulsory measures were not to be ruled out.

Western experts and local elites in the developing world soon imposed targets for reductions in family size, and used military analogies to drive home the urgency, says Matthew Connelly, a historian of population control at Columbia University in New York.

“They spoke of a war on population growth, fought with contraceptive weapons,” he says. “The war would entail sacrifices, and collateral damage.”

Such language betrayed a lack of empathy with their subjects, says Ms Germain: “People didn’t talk about people. They talked of acceptors and users of family planning.”

Emergency measures

Critics of population control had their say at the first ever UN population conference in 1974.

Karan Singh, India’s health minister at the time, declared that “development is the best contraceptive”.

But just a year later, Mr Singh’s government presided over one of the most notorious episodes in the history of population control.

In June 1975, the Indian premier, Indira Gandhi, declared a state of emergency after accusations of corruption threatened her government. Her son Sanjay used the measure to introduce radical population control measures targeted at the poor.

The Indian emergency lasted less than two years, but in 1975 alone, some eight million Indians – mainly poor men – were sterilised.

Yet, for all the official programmes and coercion, many poor women kept on having babies.

And where they did not, it arguably had less to do with coercive population control than with development, just as Karan Singh had argued in 1974, says historian Matt Connelly.

For example, in India, a disparity in birth rates could already be observed between the impoverished northern states and more developed southern regions like Kerala, where women were more likely to be literate and educated, and their offspring more likely to be healthy.

Women there realised that they could have fewer births and still expect to see their children survive into adulthood.

China: ‘We will not allow your baby to live’

Steven Mosher was a Stanford University anthropologist working in rural China who witnessed some of the early, disturbing moments of Beijing’s One Child Policy.

“I remember very well the evening of 8 March, 1980. The local Communist Party official in charge of my village came over waving a government document. He said: ‘The Party has decided to impose a cap of 1% on population growth this year.’ He said: ‘We’re going to decide who’s going to be allowed to continue their pregnancy and who’s going to be forced to terminate their pregnancy.’ And that’s exactly what they did.”

“These were women in the late second and third trimester of pregnancy. There were several women just days away from giving birth. And in my hearing, a party official said: ‘Do not think that you can simply wait until you go into labour and give birth, because we will not allow your baby to live. You will go home alone’.”

Total control

By now, this phenomenon could be observed in another country too – one that would nevertheless go on to impose the most draconian population control of all.

The One Child Policy is credited with preventing some 400 million births in China, and remains in place to this day. In 1983 alone, more than 16 million women and four million men were sterilised, and 14 million women received abortions.

Assessed by numbers alone, it is said to be by far the most successful population control initiative. Yet it remains deeply controversial, not only because of the human suffering it has caused.

A few years after its inception, the policy was relaxed slightly to allow rural couples two children if their first was not a boy. Boy children are prized, especially in the countryside where they provide labour and care for parents in old age.

But modern technology allows parents to discover the sex of the foetus, and many choose to abort if they are carrying a girl. In some regions, there is now a serious imbalance between men and women.

Moreover, since Chinese fertility was already in decline at the time the policy was implemented, some argue that it bears less responsibility for China’s falling birth rate than its supporters claim.

“I don’t think they needed to bring it down further,” says Indian demographer AR Nanda. “It would have happened at its own slow pace in another 10 years.”

Backlash

In the early 1980s, objections to the population control movement began to grow, especially in the United States.

In Washington, the new Reagan administration removed financial support for any programmes that involved abortion or sterilisation.

The broad alliance to stem birth rates was beginning to dissolve, and the debate became more polarised along political lines.

While some on the political right had moral objections to population control, some on the left saw it as neo-colonialism.

Faith groups condemned it as a Western attack on religious values, but women’s groups feared changes would mean poor women would be even less well-served.

By the time of a major UN conference on population and development in Cairo in 1994, women’s groups were ready to strike a blow for women’s rights, and they won.

The conference adopted a 20-year plan of action, known as the Cairo consensus, which called on countries to recognise that ordinary women’s needs – rather than demographers’ plans – should be at the heart of population strategies.

After Cairo

Today’s record-breaking global population hides a marked long-term trend towards lower birth rates, as urbanisation, better health care, education and access to family planning all affect women’s choices.

With the exception of sub-Saharan Africa and some of the poorest parts of India, we are now having fewer children than we once did – in some cases, failing even to replace ourselves in the next generation. And although total numbers are set to rise still further, the peak is now in sight.

Chinese poster from the 1960s of a mother and baby, captioned: “Practicing birth control is beneficial for the protection of the health of mother and child.” China promoted birth control before implementing its one-child policy.

Assuming that this trend continues, total numbers will one day level off, and even fall. As a result, some believe the sense of urgency that once surrounded population control has subsided.

The term population control itself has fallen out of fashion, as it was deemed to have authoritarian connotations. Post-Cairo, the talk is of women’s rights and reproductive rights, meaning the right to a free choice over whether or not to have children.

According to Adrienne Germain, that is the main lesson we should learn from the past 50 years.

“I have a profound conviction that if you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society,” she says.

“If you don’t, then you’ll just be in an endless cycle of trying to exert control over fertility – to bring it up, to bring it down, to keep it stable. And it never comes out well. Never.”

Nevertheless, there remain to this day schemes to sterilise the less well-off, often in return for financial incentives. In effect, say critics, this amounts to coercion, since the very poor find it hard to reject cash.

“The people proposing this argue ‘Don’t worry, everything’s fine now that we have voluntary programmes on the Cairo model’,” says Betsy Hartmann.

“But what they don’t understand is the profound difference in power between rich and poor. The people who provide many services in poor areas are already prejudiced against the people they serve.”

Work in progress

For Mohan Rao, it is an example of how even the Cairo consensus fails to take account of the developing world.

“Cairo had some good things,” he says. “However Cairo was driven largely by First World feminist agendas. Reproductive rights are all very well, but [there needs to be] a whole lot of other kinds of enabling rights before women can access reproductive rights. You need rights to food, employment, water, justice and fair wages. Without all these you cannot have reproductive rights.”

Perhaps, then, the humanitarian ideals of Cairo are still a work in progress.

Meanwhile, Paul Ehrlich has also amended his view of the issue.

If he were to write his book today, “I wouldn’t focus on the poverty-stricken masses”, he told the BBC.

“I would focus on there being too many rich people. It’s crystal clear that we can’t support seven billion people in the style of the wealthier Americans.”

Mike Gallagher is the producer of the radio programme Controlling People on BBC World Service

The world’s population will reach 7 billion at the end of October. Don’t panic (The Economist)

Demography

A tale of three islands

Oct 22nd 2011 | from the print edition

 

In 1950 the whole population of the earth—2.5 billion—could have squeezed, shoulder to shoulder, onto the Isle of Wight, a 381-square-kilometre rock off southern England. By 1968 John Brunner, a British novelist, observed that the earth’s people—by then 3.5 billion—would have required the Isle of Man, 572 square kilometres in the Irish Sea, for their standing room. Brunner forecast that by 2010 the world’s population would have reached 7 billion, and would need a bigger island. Hence the title of his 1968 novel about over-population, “Stand on Zanzibar” (1,554 square kilometres off east Africa).

Brunner’s prediction was only a year out. The United Nations’ population division now says the world will reach 7 billion on October 31st 2011 (America’s Census Bureau delays the date until March 2012). The UN will even identify someone born that day as the world’s 7 billionth living person. The 6 billionth, Adnan Nevic, was born on October 12th 1999 in Sarajevo, in Bosnia. He will be just past his 12th birthday when the next billion clicks over.

That makes the world’s population look as if it is rising as fast as ever. It took 250,000 years to reach 1 billion, around 1800; over a century more to reach 2 billion (in 1927); and 32 years more to reach 3 billion. But to rise from 5 billion (in 1987) to 6 billion took only 12 years; and now, another 12 years later, it is at 7 billion (see chart 1). By 2050, the UN thinks, there will be 9.3 billion people, requiring an island the size of Tenerife or Maui to stand on.

Odd though it seems, however, the growth in the world’s population is actually slowing. The peak of population growth was in the late 1960s, when the total was rising by almost 2% a year. Now the rate is half that. The last time it was so low was in 1950, when the death rate was much higher. The result is that the next billion people, according to the UN, will take 14 years to arrive, the first time that a billion milestone has taken longer to reach than the one before. The billion after that will take 18 years.
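
Those figures are mutually consistent, as a quick back-of-the-envelope check shows (using 1% a year, the article’s “half” of the late-1960s peak of almost 2%):

$$ 7\times 10^{9} \times 0.01 \approx 7\times 10^{7}\ \text{people per year}, \qquad \frac{10^{9}}{7\times 10^{7}\ \text{per year}} \approx 14\ \text{years}, $$

matching the UN’s 14 years for the next billion.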

Once upon a time, the passing of population milestones might have been cause for celebration. Now it gives rise to jeremiads. As Hillary Clinton’s science adviser, Nina Fedoroff, told the BBC in 2009, “There are probably already too many people on the planet.” But the notion of “too many” is more flexible than it seems. The earth could certainly not support 10 billion hunter-gatherers, who used much more land per head than modern farm-fed people do. But it does not have to. The earth might well not be able to support 10 billion people if they had exactly the same impact per person as 7 billion do today. But that does not necessarily spell Malthusian doom, because the impact humans have on the earth and on each other can change.

For most people, the big questions about population are: can the world feed 9 billion mouths by 2050? Are so many people ruining the environment? And will those billions, living cheek-by-jowl, go to war more often? On all three counts, surprising as it seems, reducing population growth any more quickly than it is falling anyway may not make much difference.

Start with the link between population and violence. It seems plausible that the more young men there are, the more likely they will be to fight. This is especially true when groups are competing for scarce resources. Some argue that the genocidal conflict in Darfur, western Sudan, was caused partly by high population growth, which led to unsustainable farming and conflicts over land and water. Land pressure also influenced the Rwandan genocide of 1994, as migrants in search of a livelihood in one of the world’s most densely populated countries moved into already settled areas, with catastrophic results.

But there is a difference between local conflicts and what is happening on a global scale. Although the number of sovereign states has increased almost as dramatically as the world’s population over the past half-century, the number of wars between states fell fairly continuously during the period. The number of civil wars rose, then fell. The number of deaths in battle fell by roughly three-quarters. These patterns do not seem to be influenced either by the relentless upward pressure of population, or by the slackening of that pressure as growth decelerates. The difference seems to have been caused by fewer post-colonial wars, the ending of cold-war alliances (and proxy wars) and, possibly, the increase in international peacekeepers.

More people, more damage?

Human activity has caused profound changes to the climate, biodiversity, oceanic acidity and greenhouse-gas levels in the atmosphere. But it does not automatically follow that the more people there are, the worse the damage. In 2007 Americans and Australians emitted almost 20 tonnes of carbon dioxide each. In contrast, more than 60 countries—including the vast majority of African ones—emitted less than 1 tonne per person.

This implies that population growth in poorer countries (where it is concentrated) has had a smaller impact on the climate in recent years than the rise in the population of the United States (up by over 50% in 1970-2010). Most of the world’s population growth in the next 20 years will occur in countries that make the smallest contribution to greenhouse gases. Global pollution will be more affected by the pattern of economic growth—and especially whether emerging nations become as energy-intensive as America, Australia and China.

Population growth does make a bigger difference to food. All things being equal, it is harder to feed 7 billion people than 6 billion. According to the World Bank, between 2005 and 2055 agricultural productivity will have to increase by two-thirds to keep pace with rising population and changing diets. Moreover, according to the bank, if the population stayed at 2005 levels, farm productivity would have to rise by only a quarter, so more future demand comes from a growing population than from consumption per person.

Increasing farm productivity by a quarter would obviously be easier than boosting it by two-thirds. But even a rise of two-thirds is not as much as it sounds. From 1970-2010 farm productivity rose far more than this, by over three-and-a-half times. The big problem for agriculture is not the number of people, but signs that farm productivity may be levelling out. The growth in agricultural yields seems to be slowing down. There is little new farmland available. Water shortages are chronic and fertilisers are over-used. All these—plus the yield-reductions that may come from climate change, and wastefulness in getting food to markets—mean that the big problems are to do with supply, not demand.
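To see why a rise of two-thirds over half a century is less daunting than it sounds, it helps to annualize the figures in this paragraph. A minimal sketch; the annual rates are my own arithmetic on the numbers quoted above:

```python
# Convert the cumulative productivity increases quoted above into the
# equivalent annual growth rates: g = factor ** (1 / years) - 1.

def annual_rate(total_factor: float, years: int) -> float:
    """Annualized growth rate implied by a cumulative growth factor."""
    return total_factor ** (1 / years) - 1

# World Bank scenario: farm productivity up two-thirds between 2005 and 2055.
print(f"two-thirds over 50 years : {annual_rate(5 / 3, 50):.2%}/yr")  # ~1.03%
# Constant-2005-population scenario: only a one-quarter rise needed.
print(f"one-quarter over 50 years: {annual_rate(5 / 4, 50):.2%}/yr")  # ~0.45%
# Historical benchmark: productivity rose over 3.5-fold in 1970-2010.
print(f"3.5x over 40 years       : {annual_rate(3.5, 40):.2%}/yr")    # ~3.18%
```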

None of this means that population does not matter. But the main impact comes from relative changes—the growth of one part of the population compared with another, for example, or shifts in the average age of the population—rather than the absolute number of people. Of these relative changes, falling fertility is most important. The fertility rate is the number of children a woman can expect to have over her lifetime. At the moment, almost half the world’s population—3.2 billion—lives in countries with a fertility rate of 2.1 or less. That number, the so-called replacement rate, is usually taken to be the level at which the population eventually stops growing.

The world’s decline in fertility has been staggering (see chart 2). In 1970 the total fertility rate was 4.45 and the typical family in the world had four or five children. It is now 2.45 worldwide, and lower in some surprising places. Bangladesh’s rate is 2.16, having halved in 20 years. Iran’s fertility fell from 7 in 1984 to just 1.9 in 2006. Countries with below-replacement fertility include supposedly teeming Brazil, Tunisia and Thailand. Much of Europe and East Asia have fertility rates far below replacement levels.
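A deliberately crude cohort model shows why 2.1 functions as the replacement threshold: scale each generation by fertility divided by replacement, and rates above 2.1 compound upward while rates below it shrink the population. The sketch ignores mortality change and migration entirely:

```python
# Crude cohort model: each generation's size scales by fertility divided by
# the replacement rate. Ignores mortality shifts and migration.

REPLACEMENT = 2.1

def generation_sizes(start: float, fertility: float, generations: int) -> list:
    sizes = [start]
    for _ in range(generations):
        sizes.append(sizes[-1] * fertility / REPLACEMENT)
    return sizes

# At today's world rate of 2.45 the population still compounds upward...
print([round(s) for s in generation_sizes(1000, 2.45, 4)])  # [1000, 1167, 1361, 1588, 1853]
# ...while at Iran's 1.9 each generation is smaller than the last.
print([round(s) for s in generation_sizes(1000, 1.9, 4)])   # [1000, 905, 819, 741, 670]
```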

The fertility fall is releasing wave upon wave of demographic change. It is the main influence behind the decline of population growth and, perhaps even more important, is shifting the balance of age groups within a population.

When gold turns to silver

A fall in fertility sends a sort of generational bulge surging through a society. The generation in question is the one before the fertility fall really begins to bite, which in Europe and America was the baby-boom generation that is just retiring, and in China and East Asia the generation now reaching adulthood. To begin with, the favoured generation is in its childhood; countries have lots of children and fewer surviving grandparents (who were born at a time when life expectancy was lower). That was the situation in Europe in the 1950s and in East Asia in the 1970s.

But as the select generation enters the labour force, a country starts to benefit from a so-called “demographic dividend”. This happens when there are relatively few children (because of the fall in fertility), relatively few older people (because of higher mortality previously), and lots of economically active adults, including, often, many women, who enter the labour force in large numbers for the first time. It is a period of smaller families, rising income, rising life expectancy and big social change, including divorce, postponed marriage and single-person households. This was the situation in Europe between 1945 and 1975 (“les trente glorieuses”) and in much of East Asia in 1980-2010.

But there is a third stage. At some point, the gilded generation turns silver and retires. Now the dividend becomes a liability. There are disproportionately more old people depending upon a smaller generation behind them. Population growth stops or goes into reverse, parts of a country are abandoned by the young and the social concerns of the aged grow in significance. This situation already exists in Japan. It is arriving fast in Europe and America, and soon after that will reach East Asia.

A demographic dividend tends to boost economic growth because a large number of working-age adults increases the labour force, keeps wages relatively low, boosts savings and increases demand for goods and services. Part of China’s phenomenal growth has come from its unprecedentedly low dependency ratio—just 38 (this is the number of dependents, children and people over 65, per 100 working adults; it implies the working-age group is almost twice as large as the rest of the population put together). One study by Australia’s central bank calculated that a third of East Asia’s GDP growth in 1965-90 came from its favourable demography. About a third of America’s GDP growth in 2000-10 also came from its increasing population.
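For readers new to the measure, the dependency ratio is easy to compute. A minimal sketch; the three-way population split below is invented so as to be consistent with China’s quoted figure of 38, and is not actual census data:

```python
# Dependency ratio: dependents (children plus people over 65)
# per 100 working-age adults.

def dependency_ratio(children: float, over_65: float, working_age: float) -> float:
    return 100 * (children + over_65) / working_age

# Invented split consistent with China's quoted ratio of about 38:
# of every 100 people, 20 children, 7.5 over-65s, 72.5 of working age.
print(round(dependency_ratio(children=20.0, over_65=7.5, working_age=72.5), 1))  # 37.9
```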

The world as a whole reaped a demographic dividend in the 40 years to 2010. In 1970 there were 75 dependents for every 100 adults of working age. In 2010 the number of dependents dropped to just 52. Huge improvements were registered not only in China but also in South-East Asia and north Africa, where dependency ratios fell by 40 points. Even “ageing” Europe and America ended the period with fewer dependents than at the beginning.

A demographic dividend does not automatically generate growth. It depends on whether the country can put its growing labour force to productive use. In the 1980s Latin America and East Asia had similar demographic patterns. But while East Asia experienced a long boom, Latin America endured its “lost decade”. One of the biggest questions for Arab countries, which are beginning to reap their own demographic dividends, is whether they will follow East Asia or Latin America.

But even if demography guarantees nothing, it can make growth harder or easier. National demographic inheritances therefore matter. And they differ a lot.

Where China loses

Hania Zlotnik, the head of the UN’s Population Division, divides the world into three categories, according to levels of fertility (see map). About a fifth of the world lives in countries with high fertility—3 or more. Most are Africans. Sub-Saharan Africa, for example, is one of the fastest-growing parts of the world. In 1975 it had half the population of Europe. It overtook Europe in 2004, and by 2050 there will be just under 2 billion people there compared with 720m Europeans. About half of the 2.3 billion increase in the world’s population over the next 40 years will be in Africa.

The rest of the world is more or less equally divided between countries with below-replacement fertility (less than 2.1) and those with intermediate fertility (between 2.1 and 3). The first group consists of Europe, China and the rest of East Asia. The second comprises South and South-East Asia, the Middle East and the Americas (including the United States).

The low-fertility countries face the biggest demographic problems. The elderly share of Japan’s population is already the highest in the world. By 2050 the country will have almost as many dependents as working-age adults, and half the population will be over 52. This will make Japan the oldest society the world has ever known. Europe faces similar trends, less acutely. It has roughly half as many dependent children and retired people as working-age adults now. By 2050 it will have three dependents for every four adults, so will shoulder a large burden of ageing, which even sustained increases in fertility would fail to reverse for decades. This has disturbing policy implications for the provision of pensions and health care, which rely on continuing healthy tax revenues from the working population.

At least these countries are rich enough to make such provision. Not so China. With its fertility artificially suppressed by the one-child policy, it is ageing at an unprecedented rate. In 1980 China’s median age (the point where half the population is older and half younger) was 22 years, a developing-country figure. China will be older than America as early as 2020 and older than Europe by 2030. This will bring an abrupt end to its cheap-labour manufacturing. Its dependency ratio will rise from 38 to 64 by 2050, the sharpest rise in the world. Add in the country’s sexual imbalances—after a decade of sex-selective abortions, China will have 96.5m men in their 20s in 2025 but only 80.3m young women—and demography may become the gravest problem the Communist Party has to face.

Many countries with intermediate fertility—South-East Asia, Latin America, the United States—are better off. Their dependency ratios are not deteriorating so fast and their societies are ageing more slowly. America’s demographic profile is slowly tugging it away from Europe. Though its fertility rate may have fallen recently, it is still slightly higher than Europe’s. In 2010 the two sides of the Atlantic had similar dependency rates. By 2050 America’s could be nearly ten points lower.

But the biggest potential beneficiaries are the two other areas with intermediate fertility—India and the Middle East—and the high-fertility continent of Africa. These places have long been regarded as demographic time-bombs, with youth bulges, poverty and low levels of education and health. But that is because they are moving only slowly out of the early stage of high fertility into the one in which lower fertility begins to make an impact.

At the moment, Africa has larger families and more dependent children than India or Arab countries and is a few years younger (its median age is 20 compared with their 25). But all three areas will see their dependency ratios fall in the next 40 years, the only parts of the world to do so. And they will keep their median ages low—below 38 in 2050. If they can make their public institutions less corrupt, keep their economic policies outward-looking and invest more in education, as East Asia did, then Africa, the Middle East and India could become the fastest-growing parts of the world economy within a decade or so.

Here’s looking at you

Demography, though, is not only about economics. Most emerging countries have benefited from the sort of dividend that changed Europe and America in the 1960s. They are catching up with the West in terms of income, family size and middle-class formation. Most say they want to keep their cultures unsullied by the social trends—divorce, illegitimacy and so on—that also affected the West. But the growing number of never-married women in urban Asia suggests that this will be hard.

If you look at the overall size of the world’s population, then, the picture is one of falling fertility, decelerating growth and a gradual return to the flat population profile of the 18th century. But below the surface societies are being churned up in ways not seen in the much more static pre-industrial world. The earth’s population may never need a larger island than Maui to stand on. But the way it arranges itself will go on shifting for centuries to come.

Brazil Is Already Researching the Effects of Climate Change (Valor Econômico)

JC e-mail 4373, October 27, 2011.

Climate change research in Brazil is beginning to change course. If a few years ago the focus was on efforts to reduce greenhouse-gas emissions, it is now shifting to adaptation to the phenomenon.

“We know that in the next five or ten years there is no prospect of a large-scale international agreement to reduce greenhouse-gas emissions, with cuts of 70% to 80%,” says physicist Paulo Artaxo of USP, a student of the Amazon. “That scenario is ever more remote. So it is essential to study adaptation strategies.”

In other words, research should turn to the effects of climate change on ecosystems, in urban environments and in social contexts. “It is not a question of money, but of the direction of the studies,” says Artaxo, a member of the governing board of the Brazilian Panel on Climate Change, a scientific body linked to the ministries of Science and Technology and of the Environment. “The country needs to prepare more adequately for climate change.”

“We need to do more research, for example, into changes in the hydrological cycle,” says Reynaldo Victoria, coordinator of the FAPESP Research Program on Global Climate Change. “To know where it will rain more and where it will rain less,” he explains. This is one branch of Artaxo’s research in the Amazon. “Because you don’t want to build a hydroelectric dam where it will rain much less in the coming decades,” the physicist illustrates.

FAPESP’s climate change program already counts US$ 30 million in investments in projects in the area. It is one of the foundation’s newest arms, but it is already gaining muscle. It has 21 projects under way, 14 new contracts and two others in partnership with foreign institutions such as Britain’s Natural Environment Research Council (NERC) and France’s Agence Nationale de la Recherche (ANR). Over ten years, the forecast is for investments of more than R$ 100 million.

Research is beginning to turn to little-studied fields. “We are going to analyze questions that are critical for Brazil,” says Artaxo. He cites, for example, the carbon cycle in the Amazon, something far more complex than studying plant photosynthesis and respiration.

Victoria, who is also a professor at the Center for Nuclear Energy in Agriculture (CENA-USP), says the program intends to target new fields, such as understanding the role of the South Atlantic in the climate of southern Brazil and northern Argentina. Another example is obtaining historical records in the area of paleoclimate.

Health impacts will also receive more study. It is already known that climate change makes diseases appear in places where they did not previously occur. Dengue, for example, finds a favorable environment in warmer regions. Among the new research on emerging diseases is the study of a type of leishmaniasis, common in Bolivia and Peru, that did not exist in Brazil and now threatens to appear in Acre. Transmitted by a mosquito, the disease causes a skin infection and can be fatal.

The researchers spoke about their projects during FAPESP Week, an event that is part of the 50th-anniversary celebrations of the Fundação de Amparo à Pesquisa do Estado de São Paulo and ended yesterday in Washington.

The Limit Is Near (Fapesp)

The Amazon is very close to a point of no return for its survival, says Thomas Lovejoy of George Mason University at the international symposium FAPESP Week (photo: JVInfante Photography/Wilson Center)

October 27, 2011

Agência FAPESP – The Amazon is very close to a point of no return for its survival, owing to a combination of factors that include global warming, deforestation and fires that undermine its hydrogeological system.

The warning was made by Thomas Lovejoy, currently a professor at George Mason University in the state of Virginia, USA, on the first day of the international symposium FAPESP Week, in Washington, on Monday.

The biologist Lovejoy, one of the world’s foremost experts on the Amazon, began working in the Brazilian forest in 1965, “just three years after FAPESP was founded,” he recalled.

Although much that is positive has happened in these 47 years (“when I first set foot in Belém, there was only one national forest and one demarcated indigenous area, and almost no Brazilian scientist was interested in studying the Amazon; today that situation is completely reversed”), several worrying factors also emerged over the period.

Lovejoy believes that five years remain to reverse the trends in time to avoid more serious problems. The warming of the planet’s average temperature already stands at around 0.8 degrees Celsius. He believes the acceptable limit is 2 degrees Celsius and that it could be reached by 2016 if nothing is done to effectively reduce warming.

The goal set at the most recent climate meetings, in Cancún and Copenhagen, of limiting the average rise in global mean temperature to 2 degrees Celsius may be insufficient, in Lovejoy’s opinion, because of this combination of factors.

Similarly, Lovejoy believes that deforestation of 20% of the Amazon’s original extent is the maximum it can withstand, and the current figure is already 17% (in 1965 the rate was 3%).

The good news, says the biologist, is that there is plenty of abandoned land in the Amazon with no prospect of economic use that could somehow be reforested, which could provide a certain margin of safety.

In his talk, Lovejoy hailed several Brazilian scientists as exemplary for the excellence of their research, among them Eneas Salati, Carlos Nobre and Carlos Joly.

Weathering Fights – Science: What’s It Up To? (The Daily Show with Jon Stewart)

http://media.mtvnservices.com/mgid:cms:video:thedailyshow.com:400760

Science claims it’s working to cure disease, save the planet and solve the greatest human mysteries, but Aasif Mandvi finds out what it’s really up to. (05:47) – Comedy Central

Global Warming May Worsen Effects of El Niño, La Niña Events (Climate Central)

Published: October 12th, 2011

By Michael D. Lemonick

Does this mean Texas is toast?

As just about everyone knows, El Niño is a periodic unusual warming of the surface water in the eastern and central tropical Pacific Ocean. Actually, that’s pretty much a lie. Most people don’t know the definition of El Niño or its mirror image, La Niña, and truthfully, most people don’t much care.

What you do care about if you’re a Texan suffering through the worst one-year drought on record, or a New Yorker who had to dig out from massive snowstorms last winter (tied in part to La Niña), or a Californian who has ever had to deal with the torrential rains that trigger catastrophic mudslides (linked to El Niño), is that these natural climate cycles can elevate the odds of natural disasters where you live.

We’re now entering the second year of the La Niña part of the cycle. La Niña is one key reason why the Southwest was so dry last winter and through the spring and summer, and since La Niña is projected to continue through the coming winter, Texas and nearby states aren’t likely to get much relief.

Precipitation outlook for winter 2011-12, showing the likelihood of below average precipitation in Texas and other drought-stricken states.

But Niñas and Niños (the broader cycle, for you weather/climate geeks, is known as the “El Niño-Southern Oscillation,” or “ENSO”) don’t just operate in isolation. They’re part of the broader climate system, which means that climate change could theoretically change how they operate — make them develop more frequently, for example, or less frequently, or be more or less pronounced. Climate change could also intensify the effects of El Niño and La Niña events.

Climate scientists have been wrestling with the first question for a while now, and they still don’t really have a definitive answer. Some climate models have suggested that global warming has already begun to cause subtle changes in ENSO cycles, and that the changes will become more pronounced later this century. But a new study, published in the Journal of Climate, doesn’t find much evidence for that.

But on the second question, the new study is a lot more definitive. “Due to a warmer and moister atmosphere,” said co-author Baylor Fox-Kemper, of the University of Colorado, in a press release, “the impacts of El Niño are changing even though El Niño itself doesn’t change.”

That’s because global warming has begun to change the playing field on which El Niño and La Niña operate, just as it’s changing the background conditions that give rise to our everyday weather. The Texas drought is a prime example. Its most likely cause is reduced rainfall from La Niña-related weather patterns. But however dry Texas and Oklahoma might have been otherwise, the killer heat wave that plagued the region this past summer — the sort of heat wave global warming is already making more commonplace — baked much of the remaining moisture out of both the soil and vegetation. No wonder large parts of the Lone Star State have gone up in smoke.

A map of sea surface temperature anomalies, showing a swath of cooler than average waters in the central and eastern tropical Pacific Ocean – a telltale sign of La Niña conditions.

When the next El Niño occurs in a year or two, it will probably bring heavy rains to places like Southern California, whose unstable hillsides tend to slide when soggy. Except now, thanks to global warming, the typical El Niño-related storms that roll in off the Pacific may well be turbocharged, since a warmer atmosphere can hold more water. This is the reason, say many climate scientists, that downpours have become heavier in recent decades across broad geographical areas.
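The physics behind “a warmer atmosphere can hold more water” is the Clausius-Clapeyron relation, under which saturation vapor pressure rises by roughly 7% per degree Celsius near typical surface temperatures. A rough sketch of that rule of thumb; the 7% figure is the standard approximation, not a number from this article:

```python
import math

# Clausius-Clapeyron rule of thumb: near typical surface temperatures,
# the atmosphere's water-holding capacity grows about 7% per degree C.
def extra_moisture(warming_c: float, rate: float = 0.07) -> float:
    """Fractional increase in saturation vapor pressure for a given warming."""
    return math.exp(rate * warming_c) - 1

print(f"{extra_moisture(1.0):.1%}")  # ~7.3% more moisture after 1 C of warming
print(f"{extra_moisture(2.0):.1%}")  # ~15.0% after 2 C
```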

La Niña, plus the added moisture in the air from global warming, have also been partially implicated in the massive snowstorms that struck the Northeast and Mid-Atlantic states during the last two winters. Those could get worse as well, suggests the new analysis. “What we see,” says Fox-Kemper, “is that certain atmospheric patterns, such as the blocking high pressure south of Alaska typical of La Niña winters, strengthen…so, the cooling of North America expected in a La Niña winter would be stronger in future climates.” So to pre-answer the question that will inevitably be asked next winter: no, more snow does NOT contradict the idea that the planet is warming. Quite the contrary.

Finally, for those who really do want to know what El Niño and La Niña actually are, as opposed to what they do, you can go to NOAA’s El Niño page. But be warned: there will be a quiz, and the word “thermocline” will appear.


Vital Details of Global Warming Are Eluding Forecasters (Science)

Science 14 October 2011:
Vol. 334 no. 6053 pp. 173-174
DOI: 10.1126/science.334.6053.173

PREDICTING CLIMATE CHANGE

Richard A. Kerr

Decision-makers need to know how to prepare for inevitable climate change, but climate researchers are still struggling to sharpen their fuzzy picture of what the future holds.

Seattle Public Utilities officials had a question for meteorologist Clifford Mass. They were planning to install a quarter-billion dollars’ worth of storm-drain pipes that would serve the city for up to 75 years. “Their question was, what diameter should the pipe be? How will the intensity of extreme precipitation change?” Mass says. If global warming means that the past century’s rain records are no guide to how heavy future rains will be, he was asked, what could climate modeling say about adapting to future climate change? “I told them I couldn’t give them an answer,” says the University of Washington (UW), Seattle, researcher.

Climate researchers are quite comfortable with their projections for the world under a strengthening greenhouse, at least on the broadest scales. Relying heavily on climate modeling, they find that on average the globe will continue warming, more at high northern latitudes than elsewhere. Precipitation will tend to increase at high latitudes and decrease at low latitudes.

But ask researchers what’s in store for the Seattle area, the Pacific Northwest, or even the western half of the United States, and they’ll often demur. As Mass notes, “there’s tremendous uncertainty here,” and he’s not just talking about the Pacific Northwest. Switching from global models to models focusing on a single region creates a more detailed forecast, but it also “piles uncertainty on top of uncertainty,” says meteorologist David Battisti of UW Seattle.

First of all, there are the uncertainties inherent in the regional model itself. Then there are the global model’s uncertainties at the regional scale, which it feeds into the regional model. As the saying goes, if the global model gives you garbage, regional modeling will only give you more detailed garbage. And still more uncertainties are created as data are transferred from the global to the regional model.

Although uncertainties abound, “uncertainty tends to be downplayed in a lot of [regional] modeling for adaptation,” says global modeler Christopher Bretherton of UW Seattle. But help is on the way. Regional modelers are well into their first extensive comparison of global-regional model combinations to sort out the uncertainties, although that won’t help Seattle’s storm-drain builders.

Most humble origins

Policymakers have long asked for regional forecasts to help them adapt to climate change, some of which is now unavoidable. Even immediate, rather drastic action to curb emissions of greenhouse gases would probably not hold global warming to 2°C, generally considered the threshold above which “dangerous” effects set in. And nothing at all can be done about the warming expected in the next several decades; it is already locked into the climate system.

Sharp but true? Feeding a global climate model’s prediction for midcentury (top) into a regional model gives more details (bottom), but modelers aren’t sure how accurate the details are. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

So scientists have been doing what they can for decision-makers. Early on, it wasn’t much. A U.S. government assessment released in 2000, Climate Change Impacts on the United States, relied on the most rudimentary regional forecasting technique (Science, 23 June 2000, p. 2113). Expert committee members divided the country into eight regions and then considered what two of their best global climate models had to say about each region over the next century. The two models were somewhat consistent in the far southwest, where the report’s authors found it was likely that warmer and drier conditions would eliminate alpine ecosystems and shorten the ski season.

But elsewhere, there was far less consistency. Over the eastern two-thirds of the contiguous 48 states, for example, the two models couldn’t agree on how much moisture soils would hold in the summer. Kansas corn would either suffer severe droughts more frequently, as one model had it, or enjoy even more moisture than it currently does, as the other indicated. But at least the uncertainties were plain for all to see.

The uncertainties of regional projections nearly faded from view in the next U.S. effort, Global Climate Change Impacts in the United States. The 2009 study drew on not two but 15 global models melded into single projections. In a technique called statistical downscaling, its authors assumed that local changes would be proportional to changes on the larger scales. And they adjusted regional projections of future climate according to how well model simulations of past climate matched actual climate.

Statistical downscaling yielded a broad warming across the lower 48 states with less warming across the southeast and up the West Coast. Precipitation was mostly down, especially in the southwest. But discussion of uncertainties in the modeling fell largely to a footnote (number 110), in which the authors cite a half-dozen papers to support their assertion that statistical downscaling techniques are “well-documented” and thoroughly corroborated.
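The proportionality assumption described here is easy to state concretely. Below is a minimal sketch of one common variant, the “delta method,” in which the model’s simulated change, rather than its absolute values, is applied to the observed local baseline; the function and the numbers are invented for illustration, not the assessment’s actual procedure:

```python
# "Delta method" statistical downscaling: trust the model's simulated
# *change*, not its absolute values, and apply that change to the
# observed local baseline. Names and numbers are invented for illustration.

def downscale(obs_local_baseline: float,
              model_baseline: float,
              model_future: float) -> float:
    delta = model_future - model_baseline   # large-scale change signal
    return obs_local_baseline + delta       # projected local value

# A coarse grid cell warms from 14.0 to 16.2 C in the model, while the
# weather station inside it observed a 12.5 C baseline climate.
print(f"{downscale(obs_local_baseline=12.5, model_baseline=14.0, model_future=16.2):.1f}")  # 14.7
```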

The other sort of downscaling, known as dynamical downscaling or regional modeling, has yet to be fully incorporated into a U.S. national assessment. But an example of state-of-the-art regional modeling appeared 30 June in Environmental Research Letters. To investigate what will happen in the U.S. wine industry, regional modeler Noah Diffenbaugh of Purdue University in West Lafayette, Indiana, and his colleagues embedded a detailed model that spanned the lower 48 states in a climate model that spanned the globe. The global model’s relatively fuzzy simulation of evolving climate from 1950 to 2039—calculated at points about 150 kilometers apart—then fed into the embedded regional model, which calculated a sharper picture of climate change at points only 25 kilometers apart.

Closely analyzing the regional model’s temperature projections on the West Coast, the group found that the projected warming would decrease the area suitable for production of premium wine grapes by 30% to 50% in parts of central and northern California. The loss in Washington state’s Columbia Valley would be more than 30%. But adaptation to the warming, such as the introduction of heat-tolerant varieties of grapes, could sharply reduce the losses in California and turn the Washington loss into a 150% gain.

Not so fast

A rapidly growing community of regional modelers is turning out increasingly detailed projections of future climate, but many researchers, mostly outside the downscaling community, have serious reservations. “Many regional modelers don’t do an adequate job of quantifying issues of uncertainty,” says Bretherton, who is chairing a National Academy of Sciences study committee on a national strategy for advancing climate modeling. “We’re not confident predicting the very things people are most interested in being predicted,” such as changes in precipitation.

Regional models produce strikingly detailed maps of changed climate, but they might be far off base. “The problem is that precision is often mistaken for accuracy,” Bretherton says. Battisti just doesn’t see the point of downscaling. “I would never use one of these products,” he says.

The problems start with the global models, as critics see it. Regional models must fill in the detail in the fuzzy picture of climate provided by global models, notes atmospheric scientist Edward Sarachik, professor emeritus at UW Seattle. But if the fuzzy picture of the region is wrong, the details will be wrong as well. And global models aren’t very good at painting regional pictures, he says. A glaring example, according to Sarachik, is the way global models place the cooler waters of the tropical Pacific farther west than they are in reality. Such ocean temperature differences drive weather and climate shifts in specific regions halfway around the world, but with the cold water in the wrong place, the global models drive climate change in the wrong regions.

Gregory Tripoli’s complaint about the global models is that they can’t create the medium-size weather systems that they should be sending into any embedded regional model. Tripoli, a meteorologist and modeler at the University of Wisconsin, Madison, cites the case of summertime weather disturbances that churn down off the Rocky Mountains and account for 80% of the Midwest’s summer rainfall. If a regional model forecasting for Wisconsin doesn’t extend to the Rockies, Wisconsin won’t get the major weather events that add up to be climate. And some atmospheric disturbances travel from as far away as Thailand to wreak havoc in the Midwest, he says, so they could never be included in the regional model.

A tougher nut. Predicting the details of precipitation using a regional model (bottom) fed by a global model (top) is even more uncertain than projecting regional temperature change. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

Even the things the global models get right have a hard time getting into regional models, critics say. “There are a lot of problems matching regional and global models,” Tripoli says. In one problem area, global and regional models usually have different ways of accounting for atmospheric processes such as individual cloud development that neither model can simulate directly, creating further clashes. Even the different philosophies involved in building global models and regional models can lead to mismatches that create phantom atmospheric circulations, Tripoli says. “It’s not straightforward you’re going to get anything realistic,” he says.

Redeeming regional modeling

“You could say all the global and regional models are wrong; some people do say that,” notes regional modeler Filippo Giorgi of the Abdus Salam International Centre for Theoretical Physics in Trieste, Italy. “My personal opinion is we do know something now. A few reports ago, it was really very, very difficult to say anything about regional climate change.”

But Giorgi says that in recent years he has been seeing increasingly consistent regional projections coming from combinations of many different models and from successive generations of models. “This means the projections are more and more reliable,” he says. “I would be confident saying the Mediterranean area will see a general decrease in precipitation in the next decades. I’ve seen this in several generations of models, and we understand the processes underlying this phenomenon. This is fairly reliable information, qualitatively. Saying whether the decrease will be 10% or 50% is a different issue.”

The skill of regional climate forecasting also varies from region to region and with what is being forecast. “Temperature is much, much easier” than precipitation, Giorgi notes. Precipitation depends on processes like atmospheric convection that operate on scales too small for any model to render in detail. Trouble simulating convection also means that higher-latitude climate is easier to project than that of the tropics, where convection dominates.

Regional modeling does have a clear advantage in areas with complex terrain such as mountainous regions, notes UW’s Mass, who does regional forecasting of both weather and climate. In the Pacific Northwest, the mountains running parallel to the coast direct onshore winds upward, predictably wringing rain and snow from the air without much difficult-to-simulate convection.

The downscaling of climate projections should be getting a boost as the Coordinated Regional Climate Downscaling Experiment (CORDEX) gets up to speed. Begun in 2009, CORDEX “is really the first time we’ll get a handle on all these uncertainties,” Giorgi says. Various groups will take on each of the world’s continent-size regions. Multiple global models will be matched with multiple regional models and run multiple times to tease out the uncertainties in each. “It’s a landmark for the regional climate modeling community,” Giorgi says.

 

Science 23 June 2000:
Vol. 288 no. 5474 p. 2113
DOI: 10.1126/science.288.5474.2113

GREENHOUSE WARMING

Dueling Models: Future U.S. Climate Uncertain

Richard A. Kerr

When Congress started funding a global climate change research program in 1990, it wanted to know what all this talk about greenhouse warming would mean for United States voters. Ten years later, a U.S. national assessment, drawing on the best available climate model predictions, concludes that the United States will indeed warm, affecting everything from the western snowpacks that supply California with water to New England’s fall foliage. But on a more detailed level, the assessment often draws a blank. Whether the cornfields of Kansas will be gripped by frequent, severe droughts, as one climate model has it, or blessed with more moisture than they now enjoy, as another predicts, the report can’t say. As much as policy-makers would like to know exactly what’s in store for Americans, the rudimentary state of regional climate science will not soon allow it, and the results of this 3-year effort brought the point home.

“This is the first time we’ve tried to take the physical [climate] system and see what effect it might have on ecosystems and socioeconomic systems,” says Thomas Karl, director of the National Oceanic and Atmospheric Administration’s (NOAA’s) National Climatic Data Center in Asheville, North Carolina, and a co-chair of the committee of experts that pulled together the assessment report “Climate Change Impacts on the United States” (available at http://www.nacc.usgcrp.gov/). “We don’t say we know there’s going to be catastrophic drought in Kansas,” he says. “What we do say is, ‘Here’s the range of our uncertainties.’ This document should get people to think.” If anything is certain, Karl says, it’s that “the past isn’t going to be a very good guide to future climate.”

By chance, the assessment had a handy way to convey the range of uncertainty that regional modeling serves up. The report, which divides the country into eight regions, is based on a pair of state-of-the-art climate models—one from the Canadian Climate Center and one from the U.K. Hadley Center for Climate Research and Prediction—that couple a simulated atmosphere and ocean. The two models solved the problems of simplifying a complex world in different ways, leading to very different predicted U.S. climates. “In terms of temperature, the Canadian model is at the upper end of the warming by 2100” predicted by a range of models, says modeler Eric Barron of Pennsylvania State University, University Park, and a member of the assessment team. “The Hadley model is toward the lower end. The Canadian model is on the dry side, and the Hadley model is on the wet side. We’re capturing a substantial portion of the range of simulations. We tried hard to convey that uncertainty.”

On a broad scale, the report can conclude: “Overall productivity of American agriculture will likely remain high, and is projected to increase throughout the 21st century,” although there will be winners and losers from place to place, and adapting agricultural practice to climate change will be key. Where the models are somewhat consistent, as in the far southwest, the report ventures what could be construed as predictions: “It is likely that some ecosystems, such as alpine ecosystems, will disappear entirely from the region,” or “Higher temperatures are likely to mean … a shorter season for winter activities, such as skiing.” Where the models clash, as on summer soil moisture over the eastern two-thirds of the lower 48 states, it explains the alternatives and suggests ways to adapt, such as switching crops.

The range of possible climate impacts laid out by the models “fairly reflects where we are in the science,” says Karl. But he notes that the effort did lack one important input: Congress mandated the assessment without funding it. “You get what you pay for,” says climatologist Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado. “A lot of it was done hastily.” Karl concedes that everyone involved would have liked to have had more funding delivered more reliably.

Even given more time and money, however, the assessment may not have come up with much better small-scale predictions, given the inherent limitations of the science. Even the best models today can say little that’s reliable about climate change at the regional level, never mind at the scale of a congressional district. Their picture of future climate is fuzzy—they might lump together San Francisco and Los Angeles because the models have such coarse geographic resolution—and the realism of such meteorological phenomena as clouds and precipitation is compromised by the inevitable simplifications of simulating the world in a computer.

“For the most part, these sorts of models give a warming,” says modeler Filippo Giorgi, “but they tend to give very different predictions, especially at the regional level, and there’s no way to say one should be believed over another.” Giorgi and his colleague Raquel Francisco of the Abdus Salam International Center for Theoretical Physics in Trieste, Italy, recently evaluated the uncertainties in five coupled climate models—including the two used in the national assessment—within 23 regions, the continental United States comprising roughly three regions. Giorgi concludes that as the scale of prediction shrinks, reliability drops until for small regions “the model data are not believable at all.”

Add in uncertainties external to the models, such as population and economic growth rates, says modeler Jerry D. Mahlman, director of NOAA’s Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, and the details of future climate recede toward unintelligibility. Some people in Congress and the policy community had “almost silly expectations there would be enormously useful, small-scale specifics, if you just got the right model. But the right model doesn’t exist,” says Mahlman.

Still, even though the national assessment does not offer the list of region-by-region impacts that Congress might have hoped for, it does show “where we are adaptable and where we are vulnerable,” says global change researcher Stephen Schneider of Stanford University. In 10 years, modelers say, they’ll do better.

The Post-Normal Seduction of Climate Science (Forbes)

By William Pentland. Published: October 14, 2011, 12:22AM

In early 2002, during a televised press conference, former U.S. Defense Secretary Donald Rumsfeld explained why the lack of evidence linking Saddam Hussein with terrorist groups did not mean there was no connection.

“[T]here are known ‘knowns’ – there are things we know we know,” said Rumsfeld. “We also know there are known ‘unknowns’ – that is to say we know there are some things we do not know. But there are also unknown ‘unknowns’ – the ones we don’t know we don’t know . . . it is the latter category that tend to be the difficult ones.”

Rumsfeld turned out to be wrong about Hussein, but what if he had been talking about global warming?  Well, he probably would have been on to something there.  Unknowns of any ilk are a real pickle in climate science.

Indeed, uncertainty in climate science has induced a state of severe political paralysis. The trouble is that nobody really knows why. A rash of recent surveys and studies has exonerated most of the usual suspects – scientific illiteracy, industry distortions, skewed media coverage.

Now, the climate-science community is scrambling to crack the code on the “uncertainty” conundrum. Exhibit A: the October 2011 issue of the journal Climatic Change, the closest thing in climate science to gospel truth, which is devoted entirely to the subject of uncertainty.

While I have yet to digest all of the dozen or so essays, I suspect they are only the opening salvo in what will soon become a robust debate about the significance of uncertainty in climate-change science. The first item up on the chopping block is called post-normal science (PNS).

PNS is a model of the scientific process pioneered by Jerome Ravetz and Silvio Funtowicz, which describes the peculiar challenges science encounters where “facts are uncertain, values in dispute, stakes high and decisions urgent.” Unlike “normal” science in the sense described by the philosopher of science Thomas Kuhn, post-normal science commonly crosses disciplinary lines and involves new methods, instruments and experimental systems.

Judith Curry, a professor at Georgia Tech, weighs the wisdom of taking the plunge on PNS in an excellent piece called “Reasoning about climate uncertainty.” Drawing on the work of the Dutch wunderkind Jeroen van der Sluijs, Curry calls on the Intergovernmental Panel on Climate Change to stop marginalizing uncertainty and get real about bias in the consensus-building process. Curry writes:

The consensus approach being used by the IPCC has failed to produce a thorough portrayal of the complexities of the problem and the associated uncertainties in our understanding . . . Better characterization of uncertainty and ignorance and a more realistic portrayal of confidence levels could go a long way towards reducing the “noise” and animosity portrayed in the media that fuels the public distrust of climate science and acts to stymie the policy process.

PNS is especially seductive in the context of uncertainty. Not surprisingly, Curry suggests that instituting PNS-like strategies at the IPCC “could go a long way towards reducing the ‘noise’ and animosity” surrounding climate-change science.

While I personally believe PNS is persuasive, the PNS model provokes something closer to revulsion in many people. Last year, members of the U.S. House of Representatives who filed a petition challenging the U.S. Environmental Protection Agency’s greenhouse-gas endangerment finding seemed less sanguine about post-normal science:

. . . the conclusions of organizing bodies, especially the IPCC, cannot be said to reflect scientific “consensus” in any meaningful sense of that word. Instead, they reflect a political movement that has commandeered science to the service of its agenda. This is “post-normal science”: the long-dreaded arrival of deconstructionism to the natural sciences, according to which scientific quality is determined not by its fidelity to truth, but by its fidelity to the political agenda.

It seems unlikely that taking the PNS plunge would appreciably improve the U.S. public’s perception of the credibility, legitimacy and salience of climate-change assessments. This probably says more about Americans than it does about the analytic force of the PNS model.

Let’s face it. Americans do not agree on a whole hell of a lot. And they never have. Many U.S. institutions were deliberately designed to tolerate the coexistence of free states and slave-owning states. Ironically, Americans appear to agree more on climate-change science than other high-profile scientific controversies like the safety of genetically-modified organisms.


While it pains me to admit this, I am increasingly convinced that the IPCC’s role in assessing the science of climate change needs to be scaled back. The IPCC was an overly optimistic experiment in international governance designed for a world that never materialized. The U.N. General Assembly established the IPCC in the months immediately preceding the fall of the Berlin Wall. Only a few years later, the IPCC’s first assessment report and the creation of the U.N. Framework Convention on Climate Change coincided with the collapse of the Soviet Union and the end of the Cold War.

A new world order seemed to be dawning in those days, which is probably why it seemed like a good idea to ask scientists to tell us what constitutes “dangerous climate change.”   Two decades and two world trade towers later, the world is a decidedly less hospitable place for institutions like the IPCC.

The proof is in the pudding – or, in this case, the atmosphere.

Climate Change Tumbles Down Europe’s Political Agenda as Economic Worries Take the Stage (N.Y. Times)

By JEREMY LOVELL of ClimateWire. Published: October 13, 2011

LONDON — Climate change has all but fallen off the political agenda across Europe as the resurging economic crisis empties national coffers and shakes economic confidence, and the public and the press turn their attention to more immediate issues of rising fuel bills and joblessness, analysts say.

Sputtering economies, a shift of attention to looming elections and the prospect of little or no movement in the December climate talks in Durban, South Africa, have combined to take the political momentum out of an issue that was a major cause in Europe.

“It is way down the agenda and will not feature in elections,” said Edward Cameron, director of the World Resources Institute think tank’s international climate initiative, on the sidelines of a meeting on climate change at London’s Chatham House think tank. “At a time of joblessness and fiscal crises, it is very difficult to advance the climate change issue.”

That is as true for next year’s presidential elections in the United States as it will be in France, despite the fact that there has been a series of environmental disasters, from the Texas drought this year to Russia’s heat wave and consequent steep rise in wheat prices last year.

According to acclaimed NASA scientist James Hansen, who has been warning of impending climatic doom for decades, the lack of focus on these events is in no small part due to the fact that scientists are poor communicators while the climate change skeptics have mounted a smoothly run campaign to capitalize on any mistakes and admissions of uncertainty.

“There is a strong campaign by those people who want to continue the fossil fuel business as usual. Climate contrarians … have managed in the public’s eye to muddy the waters enough that there is uncertainty: why should we do anything yet?” he said on a visit to London’s Royal Society for a meeting on lessons to be learned from past climate change battles.

“They have been winning the argument in the last several years, even though the science has become clearer,” he added.

Nuclear power issue distracts Berlin

In Germany, where a generous feed-in tariff scheme has produced some 28 gigawatts of wind power capacity and more than 18 GW of solar photovoltaic capacity, Chancellor Angela Merkel’s coalition government was forced into an abrupt U-turn on a controversial move to extend the lives of the country’s fleet of nuclear power plants. There was a political revolt after the March 11 nuclear disaster at Fukushima in Japan.

The oldest seven of Germany’s nuclear plants were closed immediately after Fukushima and will now never reopen, while the remainder will close by 2022.

This has had the perverse effect, in a country proud of its renewable energy efforts, of increasing the use of coal-fired power plants and increasing the likelihood of new coal- or gas-fired plants being built. The price tag will include higher carbon emissions at exactly the time that Germany, along with the rest of the European Union, is pledged to cut emissions.

While political observers believe the climate change issue will come back to the fore at some point in Germany — a country where the Greens have played a pivotal political role — the nuclear power issue is so politically charged that it is off the agenda for now.

Even in the United Kingdom, which has a huge wind energy program and where the Conservative-Liberal Democrat coalition came to power 15 months ago pledging to be the “greenest government ever,” there are major signs of backsliding. A long-awaited energy bill has been shelved, and renewable energy support costs and carbon emission reduction targets are either under review or about to be.

At the Conservative Party’s annual conference earlier this month, climate change was consigned to a brief debate on the opening Sunday, when delegates were mostly just arriving and finding their way around or still traveling to get there.

Damned by faint praise in London

Prime Minister David Cameron did not mention the issue in his speech to the conference — a performance that usually sets the broad agenda for the following year — and Chancellor of the Exchequer George Osborne outraged environmentalists but satisfied the party’s right wing by pledging that the United Kingdom would not go any faster than its E.U. neighbors on emission cuts.

This is despite the fact that the United Kingdom has a legal target to cut its carbon emissions by at least 80 percent below 1990 levels by 2050, with cuts of 35 percent by 2022 and 50 percent by 2025, whereas the European Union’s goal is 20 percent by 2020.

It was widely reported that the 2022 target was only agreed to after a major battle in the Cabinet between supporters of Conservative Osborne and those of Liberal Democrat Energy and Climate Change Minister Chris Huhne. It has since been announced that the carbon targets will be reviewed in 2014.

Even in London, where charismatic Conservative Mayor Boris Johnson came to power in 2008 in part on a green ticket, the issue has largely been parked and replaced by transport in the run-up to next year’s mayoral elections. The city’s aging transport system is feared likely to come under massive strain during the 2012 Olympic Games.

Then there is the strange case of a strategic plan on adapting London to climate change, the draft of which was launched with great fanfare and declarations of urgency in February 2010. It was on the brink of publication in September 2010, but after that, it appeared to have vanished without trace.

At the same time, most members of City Hall’s climate change team, set up under the previous Labour administration, have been moved to other jobs.

‘Too difficult — and not a vote winner’

“Political leaders get it, but the treasuries don’t. The men with the money don’t want to be first movers,” said Nick Mabey, co-founder of environmental think tank E3G. “But the political froth has gone. It has become too difficult — and not a vote winner.”

Compounding that problem, at least in the United Kingdom, has been a series of reports underscoring the likely high cost to households of green energy policies at a time when the prices of domestic electricity and gas are already rising sharply.

A recent opinion poll found that the climate change issue has been replaced by concerns over rising fuel bills and energy security.

But Mabey is not too concerned. While the subject may be off the immediate political agenda, behind the scenes, the more enlightened corporate leaders and investment fund managers have been making their own calculations. They are moving their money into the low-carbon economic transformation that in some cases is already profitable and in many eyes essential and inevitable.

The main danger, they say, is that if climate change as a driver of action is allowed to languish too long and become too invisible while energy becomes the main motivator, it will become far harder to move climate change back up the agenda.

For Mabey and WRI’s Cameron, while the deep and seemingly returning global economic crisis has proved a serious distraction internationally as well as domestically, all is not lost.

For a number of reasons, including the rise of a new and major climate player — China — and a series of new scientific reports on climate change due over the next two or three years, 2015 will be the next pivotal moment for the world to take collective action, they say.

“Climate change doesn’t keep people awake at night. Our task for the next few years is to move it back up the political agenda again,” said WRI’s Cameron.

Copyright 2011 E&E Publishing. All Rights Reserved.

Group Urges Research Into Aggressive Efforts to Fight Climate Change (N.Y. Times)

By CORNELIA DEAN, Published: October 4, 2011

With political action on curbing greenhouse gases stalled, a bipartisan panel of scientists, former government officials and national security experts is recommending that the government begin researching a radical fix: directly manipulating the Earth’s climate to lower the temperature.

Members said they hoped that such extreme engineering techniques, which include scattering particles in the air to mimic the cooling effect of volcanoes or stationing orbiting mirrors in space to reflect sunlight, would never be needed. But in its report, to be released on Tuesday, the panel said it is time to begin researching and testing such ideas in case “the climate system reaches a ‘tipping point’ and swift remedial action is required.”

The 18-member panel was convened by the Bipartisan Policy Center, a research organization based in Washington founded by four senators — Democrats and Republicans — to offer policy advice to the government. In interviews, some of the panel members said they hoped that the mere discussion of such drastic steps would jolt the public and policy makers into meaningful action in reducing greenhouse gas emissions, which they called the highest priority.

The idea of engineering the planet is “fundamentally shocking,” David Keith, an energy expert at Harvard and the University of Calgary and a member of the panel, said. “It should be shocking.”

In fact, it is an idea that many environmental groups have rejected as misguided and potentially dangerous.

Jane Long, an associate director of the Lawrence Livermore National Laboratory and the panel’s co-chairwoman, said that by spewing greenhouse gases into the atmosphere, human activity was already engaged in climate modification. “We are doing it accidentally, but the Earth doesn’t know that,” she said, adding, “Going forward in ignorance is not an option.”

The panel, the Task Force on Climate Remediation Research, suggests that the White House Office of Science and Technology Policy begin coordinating research and estimates that a valuable effort could begin with a few million dollars in financing over the next few years.

One reason that the United States should embrace such research, the report suggests, is the threat of unilateral action by another country. Members say research is already under way in Britain, Germany and possibly other countries, as well as in the private sector.

“A conversation about this is going to go on with us or without us,” said David Goldston, a panel member who directs government affairs at the Natural Resources Defense Council and is a former chief of staff of the House Committee on Science. “We have to understand what is at stake.”

In interviews, panelists said again and again that the continuing focus of policy makers and experts should be on reducing emissions of carbon dioxide and other greenhouse gases. But several acknowledged that significant action remained a political nonstarter. Last month, for example, the Obama administration told the federal Environmental Protection Agency to hold off on tightening ozone standards, citing complications related to the weak economy.

According to the United Nations Intergovernmental Panel on Climate Change, greenhouse gas emissions have contributed to raising the global average surface temperature by about 1.3 degrees Fahrenheit in the past 100 years.

It is impossible to predict how much impact the report will have. But given the panelists’ varied political and professional backgrounds, they seem likely to achieve one major goal: starting a broader conversation on the issue. Some climate experts have been working on it for years, but they have largely kept their discussions to themselves, saying they feared giving the impression that there might be quick fixes for climate change.

“Climate adaptation went through the same period of concern,” Mr. Goldston said, referring to the onetime reluctance of some researchers to discuss ways in which people, plants and animals might adjust to climate change. Now, he said, similar reluctance to discuss geoengineering is giving way, at least in part because “it’s possible we may have to do this no matter what.”

Although the techniques, which fall into two broad groups, are more widely known as geoengineering, the panel prefers “climate remediation.”

The first is carbon dioxide removal, in which the gas is absorbed by plants, trapped and stored underground or otherwise removed from the atmosphere. The methods are “generally uncontroversial and don’t introduce new global risks,” said Ken Caldeira, a climate expert at Stanford University and a panel member. “It’s mostly a question of how much do these things cost.”

Controversy arises more with the second group of techniques, solar radiation management, which involves increasing the amount of solar energy that bounces back into space before it can be absorbed by the Earth. They include seeding the atmosphere with reflective particles, launching giant mirrors above the Earth or spewing ocean water into the air to form clouds.

These techniques are thought to pose a risk of upsetting the Earth’s natural rhythms. With them, Dr. Caldeira said, “the real question is what are the unknown unknowns: Are you creating more risk than you are alleviating?”

At the influential blog Climate Progress, Joe Romm, a fellow at the Center for American Progress, has made a similar point, likening geoengineering to a dangerous course of chemotherapy and radiation to treat a condition curable through diet and exercise — or, in this case, emissions reduction.

The panel rejected any immediate application of climate remediation techniques, saying too little is known about them. In 2009, the Royal Society in Britain said much the same, assessing geoengineering technologies as “technically feasible” but adding that their potential costs, effectiveness and risks were unknown.

Similarly, in a 2010 review of federal research that might be relevant to climate remediation, the federal Government Accountability Office noted that “major uncertainties remain on the efficacy and potential consequences” of the approach. Its report also recommended that the White House Office of Science and Technology Policy “establish a clear strategy for geoengineering research.”

John P. Holdren, who heads that office, declined interview requests. He issued a statement reiterating the Obama administration’s focus on “taking steps to sensibly reduce pollution that is contributing to climate change.”

Yet in an interview with The Associated Press in 2009, Dr. Holdren said the possible risks and benefits of geoengineering should be studied very carefully because “we might get desperate enough to want to use it.”

In a draft plan made public on Friday, the U.S. Global Change Research Program, a coordinating effort administered by his office, outlined its own climate change research agenda, including studies of the impacts of rapid climate change.

The plan said that climate-related projections would be crucial to future studies of the “feasibility, effectiveness and unintended consequences of strategies for deliberate, large-scale manipulations of Earth’s environment,” including carbon dioxide removal and solar radiation management.

Many countries fault the United States for government inaction on climate change, especially given its longtime role as a chief contributor to the problem.

Frank Loy, a panelist and former chief climate negotiator for the United States, suggested that people around the world would see past those issues if the United States embraced geoengineering studies, provided that it was “very clear about what kind of research is undertaken and what the safeguards are.”

This article has been revised to reflect the following correction:

Correction: October 4, 2011

An earlier version of this article mistakenly referred to Frank Loy as the nation’s chief climate negotiator; he is a former chief climate negotiator. It also misstated the name of a federal agency that reported on the potential effectiveness of climate remediation. It is the Government Accountability Office, not the General Accountability Office.