Tag archive: science

Biased but Brilliant (N.Y. Times)

GRAY MATTER
Biased but Brilliant

By CORDELIA FINE
Published: July 30, 2011

Cordelia Fine, a senior research associate at the Melbourne Business School, is the author of “A Mind of Its Own: How Your Brain Distorts and Deceives.”

HOW’S this for a cynical view of science? “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Scientific truth, according to this view, is established less by the noble use of reason than by the stubborn exertion of will. One hopes that the Nobel Prize-winning physicist Max Planck, the author of the quotation above, was writing in an unusually dark moment.

And yet a large body of psychological data supports Planck’s view: we humans quickly develop an irrational loyalty to our beliefs, and work hard to find evidence that supports those opinions and to discredit, discount or avoid information that does not. In a classic psychology experiment, people for and against the death penalty were asked to evaluate the different research designs of two studies of its deterrent effect on crime. One study showed that the death penalty was an effective deterrent; the other showed that it was not. Which of the two research designs the participants deemed the most scientifically valid depended mostly on whether the study supported their views on the death penalty.

In the laboratory, this is labeled confirmation bias; observed in the real world, it’s known as pigheadedness.

Scientists are not immune. In another experiment, psychologists were asked to review a paper submitted for journal publication in their field. They rated the paper’s methodology, data presentation and scientific contribution significantly more favorably when the paper happened to offer results consistent with their own theoretical stance. Identical research methods prompted a very different response in those whose scientific opinion was challenged.

This is a worry. Doesn’t the ideal of scientific reasoning call for pure, dispassionate curiosity? Doesn’t it positively shun the ego-driven desire to prevail over our critics and the prejudicial urge to support our social values (like opposition to the death penalty)?

Perhaps not. Some academics have recently suggested that a scientist’s pigheadedness and social prejudices can peacefully coexist with — and may even facilitate — the pursuit of scientific knowledge.

Let’s take pigheadedness first. In a much discussed article this year in Behavioral and Brain Sciences, the cognitive scientists Hugo Mercier and Dan Sperber argue that our reasoning skills are really not as dismal as they seem. They don’t deny that irrationalities like the confirmation bias are common. Instead, they suggest that we stop thinking of the primary function of reasoning as being to improve knowledge and make better decisions. Reasoning, they claim, is for winning arguments. And an irrational tendency like pigheadedness can be quite an asset in an argumentative context. A engages with B and proposes X. B disagrees and counters with Y. Reverse roles, repeat as desired — and what in the old days we might have mistaken for an exercise in stubbornness turns out instead to be a highly efficient “division of cognitive labor” with A specializing in the pros, B in the cons.

It’s salvation of a kind: our apparently irrational quirks start to make sense when we think of reasoning as serving the purpose of persuading others to accept our point of view. And by way of a positive side effect, these heated social interactions, when they occur within a scientific community, can lead to the discovery of the truth.

And what about scientists’ prejudices? Clearly, social values should never count as evidence for or against a particular hypothesis — abhorrence of the death penalty does not count as data against its crime-deterrent effects. However, the philosopher of science Heather Douglas has argued that social values can safely play an indirect role in scientific reasoning. Consider: The greater we judge the social costs of a potential scientific error, the higher the standard of evidence we will demand. Professor A, for example, may be troubled by the thought of an incorrect discovery that current levels of a carcinogen in the water are safe, fearing the “discovery” will cost lives. But Professor B may be more anxious about the possibility of an erroneous conclusion that levels are unsafe, which would lead to public panic and expensive and unnecessary regulation.

Both professors may scrutinize a research paper with these different costs of error implicitly in mind. If the paper looked at cancer rates in rats, did the criteria it used to identify the presence of cancer favor over- or under-diagnosis? Did the paper assume a threshold of exposure below which there is no cause for concern, or did it assume that any level of exposure increases risk? Deciding which are the “better” criteria or the “better” background assumptions is not, Ms. Douglas argues, solely a scientific issue. It also depends on the social values you bring to bear on the research. So when Professor A concludes that a research study is excellent, while Professor B declares it seriously mistaken, it may be that neither is irrationally inflating or discounting the strength of the evidence; rather, each is tending to a different social concern.
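
Douglas’s point about differing costs of error can be made concrete with a toy decision-theoretic sketch (our construction, not from her work; all costs and probabilities are invented): the reviewer who fears a missed carcinogen more than overregulation rationally demands far less evidence before flagging a chemical than the reviewer who weighs the errors the other way around.

```python
# Illustrative sketch of Douglas's point (our construction, not hers):
# two reviewers who judge the social costs of error differently will
# rationally demand different amounts of evidence. All numbers invented.

def evidence_threshold(cost_missed_hazard, cost_false_alarm):
    """Minimum probability of 'unsafe' at which flagging the chemical
    minimizes expected social cost: flag once
    p * cost_missed_hazard > (1 - p) * cost_false_alarm."""
    return cost_false_alarm / (cost_missed_hazard + cost_false_alarm)

# Professor A fears a missed carcinogen far more than overregulation.
threshold_a = evidence_threshold(cost_missed_hazard=100, cost_false_alarm=10)
# Professor B weighs public panic and needless regulation more heavily.
threshold_b = evidence_threshold(cost_missed_hazard=10, cost_false_alarm=100)

print(f"A flags the chemical once P(unsafe) > {threshold_a:.2f}")  # 0.09
print(f"B flags the chemical once P(unsafe) > {threshold_b:.2f}")  # 0.91
```

Neither professor need be fudging the evidence; each is applying a defensible threshold given the error he or she most wants to avoid.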

Science often makes important contributions to debates that involve clashes of social values, like the protection of public health versus the protection of private industry from overregulation. Yet Ms. Douglas suggests that, with social values denied any legitimate role in scientific reasoning, “debates often dance around these issues, attempting to hide them behind debates about the interpretation of data.” Professors A and B are left with no other option but to conclude that the other is a stubborn, pigheaded excuse for a scientist.

For all its imperfections, science continues to be a stunning success. Yet maybe progress would be even faster and smoother if scientists would admit, and even embrace, their humanity.

A version of this op-ed appeared in print on July 31, 2011, on page SR12 of the New York edition with the headline: Biased But Brilliant.

Climate Chaos (Against the Grain)

Tues 6.28.11 | Climate Chaos

Christian Parenti speaking at a KPFA benefit on July 14th, on Tropic of Chaos: Climate Change and the New Geography of Violence, Nation Books, 2011

Residents of the Global North may be justly wringing their hands about flooding, droughts, and freak weather, but the most worrying effects of climate change are expected to hit the countries of the Global South, especially those in the broad regions on either side of the equator. Christian Parenti has reported from that vast area and discusses the shape that climate-related social dislocation is already taking, as well as the militarized plans of the rich countries to keep poor climate refugees out.

© Against the Grain, a program of KPFA Radio, 94.1fm Berkeley CA and online at KPFA.org.

I am, therefore I’m right (Christian Science Monitor)

By Jim Sollisch / July 29, 2011

If you’ve ever been on a jury, you might have noticed that a funny thing happens the minute you get behind closed doors. Everybody starts talking about themselves. They say what they would have done if they had been the plaintiff or the defendant. They bring up anecdote after anecdote. It can take hours to get back to the points of law that the judge has instructed you to consider.

Being on a jury (I recently served on my fourth) reminds me why I can’t stomach talk radio. We Americans seem to have lost the ability to talk about anything but our own experiences. We can’t seem to generalize without stereotyping or to consider evidence that goes against our own experience.

I heard a doctor on a radio show the other day talking about a study that found that exercise reduces the incidence of Alzheimer’s. And caller after caller couldn’t wait to make essentially the opposite point: “Well, my grandmother never exercised and she lived to 95, sharp as a tack.” We are in an age summed up by the aphorism: “I experience, therefore I’m right.”

This isn’t a new phenomenon, except by degree. Historically, the hallmarks of an uneducated person were the inability to think critically, to use deductive reasoning, to distinguish the personal from the universal. Now that seems an apt description of many Americans. The culture of “I” is everywhere you look, from the iPod/iPhone/iPad to the fact that memoir is the fastest-growing literary genre.

How’d we get here? The same way we seem to get everywhere today: the Internet. The Internet has allowed us to segregate ourselves based on our interests. All cat lovers over here. All people who believe President Obama wasn’t born in the United States over there. For many of us, what we believe has become the most important organizing element in our lives. Once we all had common media experiences: Walter Cronkite, Ed Sullivan, a large daily newspaper. Now each of us can create a personal media network – call it the iNetwork – fed by the RSS feeds of our choosing.

But the Internet doesn’t just cordon us off in our own little pods. It also makes us dumber, as Nicholas Carr points out in his excellent book, “The Shallows: What the Internet Is Doing to Our Brains.” He argues that the way we consume media changes our brains, not just our behaviors. The Internet rewards shallow thinking: One search leads to thousands of results that skim over the surface of a subject.

Of course, we could dive deeply into any one of the listings, but we don’t. Studies show that people skim online; they don’t read. The experience has been designed to reward speed and variety, not depth. And there is tangible evidence, based on studies of brain scans, that the medium is changing our physical brains, strengthening the synapses and areas used for referential thinking while weakening the areas used for critical thinking.

And when we diminish our ability to think critically, we, in essence, become less educated. Less capable of reflection and meaningful conversation. Our experience, reinforced by a web of other gut instincts and experiences that match our own, becomes evidence. Case in point: the polarization of our politics. Exhibit A: the debt ceiling impasse.

Ironically, the same medium that helped mobilize people in the Arab world this spring is helping create a more rigid, dysfunctional democracy here: one that’s increasingly polarized, where each side is isolated and capable only of sound bites that skim the surface, a culture where deep reasoning and critical thinking aren’t rewarded.

The challenge for most of us isn’t to go backwards: We can’t disconnect from the Internet. Nor would we want to. But we can work harder to make “search” the metaphor it once was: to discover, not just to skim. The Internet lets us find facts in an instant. But it doesn’t stop us from finding insight, if we’re willing to really search.

Jim Sollisch is creative director at Marcus Thomas Advertising.

Why Global Warming Slowed in the 2000’s: Another Possible Explanation (Climate Central)

Published: July 21st, 2011
By Michael D. Lemonick

The world is getting progressively warmer, and the vast majority of evidence points to greenhouse gases spewed into the atmosphere by humans — carbon dioxide (CO2), especially — as the main culprit. But while the buildup of greenhouse gases has been steadily increasing, the warming goes in fits and starts. From one year to the next it might get a little warmer or a lot warmer, or even cooler.

That’s because greenhouse gases aren’t the whole story. Natural variations in sunlight and ocean currents; concentrations of particles in the air, manmade and otherwise; and even plain old weather variations can speed the warming up or slow it down, even as the underlying temperature trend continues upward. And while none of those factors is likely to change that trend over the long haul, scientists really want to understand how they affect projections of where our climate is heading.

The latest attempt to do so just appeared in Science Express, the online counterpart of the journal Science, where a team of climate scientists is reporting on their investigations of airborne particles, or aerosols, in the stratosphere. It’s well known, says co-author John Daniel, of the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo., that these particles have a cooling effect, since they reflect sunlight that would otherwise warm the planet.

Mt. Pinatubo’s eruption in the Philippines, in 1991. Credit: USGS.

It’s also well known that major volcanic eruptions, like Mt. Pinatubo’s in the Philippines in 1991, can pump lots of aerosols into the stratosphere — and indeed, Pinatubo alone temporarily cooled the planet for about two years. The explosion of Mt. Tambora in 1815 had even more catastrophic effects, which you can imagine given that 1816 came to be known as “the year without a summer.” But what lots of people thought, says Daniel, “is that since there haven’t been any eruptions on that scale recently, aerosols have become relatively unimportant for climate.”

That, says the study, is not true: even without major eruptions, aerosols in the stratosphere increased by about 7 percent per year from 2000 to 2010. Plug that figure into climate models, and they predict that the warming you’d otherwise expect from the rise in greenhouse gases is reduced by up to 20 percent.
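
As a rough check of what a 7 percent annual increase means (our own back-of-the-envelope arithmetic, not a figure from the study), compounding it over the decade roughly doubles the stratospheric aerosol load:

```python
# Back-of-the-envelope check (our arithmetic, not a figure from the
# study): 7 percent annual growth, compounded over 2000-2010, roughly
# doubles the stratospheric aerosol load.

annual_growth = 0.07
years = 10
factor = (1 + annual_growth) ** years
print(f"Aerosol load after {years} years: {factor:.2f}x the 2000 level")  # ~1.97x
```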

In the real world, as it happens, the rise in temperature slowed during that same decade. “That,” says Daniel, “was the motivation for doing this research. It could have just been natural climate variability, but we wondered if it could be something else.” Some climate scientists attribute the slowdown to heat being temporarily stored in the deep oceans, but stratospheric aerosols could clearly be part of the answer as well.

Whether these aerosols are natural or manmade, however, is something the scientists didn’t address. Just last week, a paper in Proceedings of the National Academy of Sciences (PNAS) suggested the cause was a construction boom of coal-fired power plants in China over the same decade. The new study doesn’t necessarily contradict that. “Human emissions could play a role,” says Daniel, although the PNAS study was talking about aerosols in the lower atmosphere, not the stratosphere. “But even in the absence of colossal volcanic eruptions,” he says, “smaller eruptions could still add up.”

The other difference between the two studies is that the one from last week looked at the relatively slow temperature rise over the most recent decade and tried to tease out what might have changed since the previous decades, when the warming was faster. The new one took actual observations of aerosols and tried to predict what the temperature rise should be. That sort of approach tends to produce more credible results, since an incorrect prediction would stick out like a sore thumb.

Where the two studies emphatically agree is that if the level of aerosols goes down — due to a lull in eruptions, or a reduction in coal-plant pollution, or both — the pace of warming would likely pick up. That would mean that current projections for up to a 4.5°C increase in global average surface temperatures by the end of the century might turn out to be an underestimate. And if aerosol levels increase, the temperature in 2100 could be lower than everyone expects.

80 Percent of World Climate Data Are Not Computerized and Readily Available (Science Daily)

Science News

ScienceDaily (July 20, 2011) — In order to gain a better knowledge of climate variations, such as those caused by global warming, and be able to tackle them, we need to understand what happened in the recent past. This is the conclusion of a research study led by the Rovira i Virgili University (URV), which shows that the scientific community today is only able to access and analyse 20% of the recorded climate information held. The remaining data are not accessible in digital format.

Some climate data in Europe go back to the 17th Century, but “not even 20% of the information recorded in the past is available to the scientific community,” Manola Brunet, lead author of the study and a researcher at the URV’s Centre for Climate Change, said.

This situation is even worse in continents such as Africa and South America, where weather observations did not begin until the middle of the 19th Century. These are the results of a study published in Climate Research, which highlights the need to urgently recover all the information recorded in perishable formats.

“Failure to decipher the messages in the climate records of the past will result in socioeconomic problems, because we will be unable to deal with the current and future impacts of climate change and a hotter world,” says Brunet.

Spain, along with the USA, Canada, Holland and Norway, is one of a small number of countries which allows partial access to its historic climate data. The rest of the world does not make these data available to the scientific community or the general public, despite recommendations to this effect by the World Meteorological Organization (WMO).

In order to overcome the political and legal hurdles posed by this currently poor access, “governments should adopt a resolution within the United Nations on opening up their historical climate data,” the researcher suggests.

Predicting heat waves

Weather services in all countries are faced with the overwhelming job of converting all their paper-based historical climate information, which is stored in archives, libraries and research centres, into digital format. Access is made harder by the wide range of forms in which the information is held, and by the purpose for which each meteorological service was originally created.

“The main objective is to provide a weather service to the public, who want to know what the weather will be like the next day,” explains Brunet. This has led to climate science (which studies the range of atmospheric conditions characterising a region rather than focusing on weather forecasting) becoming the great ‘victim’, receiving fewer funds with which to digitise, develop and standardise data.

However, climate services do play a significant role in some European countries, the United States and Canada. It was these services that were able to explain last summer’s heat wave in Eastern Europe and put it into context, as well as the high temperatures recorded on the Old Continent in 2003.

“If we had access to all the historical data recorded, we would be able to evaluate the frequency with which these phenomena are likely to occur in the future with a higher degree of certainty,” the expert explains.

This kind of information is of scientific, social and economic interest, with insurance companies setting their premiums according to expected climate changes, for example. City councils and governments also “want to understand climate conditions and how these will change in future in order to improve land zoning and prevent urban development from taking place in areas likely to be affected by flooding,” concludes Brunet.

Science and truth have been cast aside by our desire for controversy (Guardian)

Last week’s report into media science coverage highlighted an over-reliance on pointless dispute

Robin McKie
The Observer, Sunday 24 July 2011

Thomas Huxley, the British biologist who so vociferously, and effectively, defended Darwin’s theory of natural selection in the 19th century, had a basic view of science. “It is simply common sense at its best – rigidly accurate in observation and merciless to fallacy in logic.”

It is as neat a description as you can get and well worth remembering when considering how science is treated by the UK media and by the BBC in particular. Last week, a study, written by geneticist Steve Jones, warned that far too often the corporation had failed to appreciate the nature of science and to make a distinction “between well-established fact and opinion”. In doing so, the corporation had given free publicity to marginal belief, he said.

Jones was referring to climate change deniers, anti-MMR activists, GM crop opponents and other fringe groups who have benefited from wide coverage despite the paucity of evidence that supports their beliefs. By contrast, scientists, as purveyors of common sense, have found themselves sidelined because producers wanted to create controversy and so skewed discussions to hide researchers’ near unanimity of views in these fields. In this way, the British public has been misled into thinking there is a basic division among scientists over global warming or MMR.

It is a problem that can be blamed on the media that believe, with some justification, that adversarial dispute is the best way to cover democracy in action. It serves us well with politics and legal affairs, but falls down badly when it comes to science because its basic processes, which rely heavily on internal criticism and disproof, are so widely misunderstood.

Yet there is nothing complicated about the business, says Robert May, the former UK government science adviser. “In the early stages of research, ideas are like hillocks on a landscape. So you design experiments to discriminate among them. Most hillocks shrink and disappear until, in the end, you are left with a single towering pinnacle of virtual certitude.”
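
May’s landscape of shrinking hillocks is, in effect, a picture of Bayesian updating: as discriminating observations accumulate, probability drains away from most hypotheses and piles up on one. A toy sketch (our construction, with invented numbers):

```python
import random

# Toy illustration of May's "hillocks" metaphor (our construction):
# three rival hypotheses assign different probabilities to an observable
# outcome; repeated experiments generated by the true hypothesis make
# probability pile up on it while the other hillocks shrink away.

random.seed(0)
likelihood = {"H1": 0.3, "H2": 0.5, "H3": 0.8}   # P(outcome | hypothesis)
true_p = likelihood["H3"]                         # H3 is actually true
posterior = {h: 1 / 3 for h in likelihood}        # flat prior: equal hillocks

for _ in range(50):                               # 50 discriminating experiments
    outcome = random.random() < true_p
    for h, p in likelihood.items():
        posterior[h] *= p if outcome else (1 - p)
    norm = sum(posterior.values())
    posterior = {h: v / norm for h, v in posterior.items()}

print(posterior)  # H3 ends up as the single towering pinnacle
```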

The case of manmade climate change is a good example, adds May. “A hundred years ago, scientists realised carbon dioxide emissions could affect climate. Twenty years ago, we thought they were now having an impact. Today, after taking more and more measurements, we can see there is no other explanation for the behaviour of the climate. Humans are changing it. Of course, deniers disagree, but that’s because they hold fixed positions that have nothing to do with science.”

It is the scientist, not the denier, who is the real sceptic, adds Paul Nurse, president of the Royal Society. “When you carry out research, you cannot afford to cherry-pick data or ignore inconvenient facts. You have to be brutal. You also have to be sceptical about your own ideas and attack them. If you don’t, others will.”

When an idea reaches the stage where it’s almost ready to become a paper, it has therefore been subjected to savage scrutiny by its own authors and by their colleagues – and that is before writing has started. Afterwards, the paper goes to peer review where there is a further round of critical appraisal by a separate group of researchers. What emerges is a piece of work that has already been robustly tested – a point that is again lost in the media.

Over the centuries, this process has been honed to near perfection. By proposing and then attacking ideas and by making observations to test them, humanity has built up a remarkable understanding of the universe. The accuracy of Einstein’s theories of relativity, Crick and Watson’s double helix structure of DNA and plate tectonics were all revealed this way, though no scientist would claim these discoveries are the last word. As the palaeontologist Stephen Jay Gould once put it: “In science, ‘fact’ can only mean ‘confirmed to such a degree that it would be perverse to withhold provisional assent’.”

Certainly, things can go wrong, as Huxley acknowledged. Science may be organised common sense but all too often a beautiful theory created this way has been skewered by “a single ugly fact”, as he put it. Think of Fred Hoyle’s elegant concept of a steady state universe that is gently expanding and eternal. The idea was at one time considered to be philosophically superior to its rival, the big bang theory that proposed the cosmos erupted into existence billions of years ago. The latter idea explained the expansion of the universe by recourse to a vast explosion. The former accounted for this expansion in more delicate, intriguing terms.

The steady state theory continued to hold its own until, in 1964, radio-astronomers Arno Penzias and Robert Woodrow Wilson noted interference on their radio telescope at the Bell Labs in New Jersey and tried to eliminate it. The pair went as far as shovelling out the pigeon droppings in the telescope and had the guilty pigeons shot (each blamed the other for giving the order). Yet the noise persisted. Only later did the two scientists realise what they were observing. The static hiss they were picking up was caused by a microwave radiation echo that had been set off when the universe erupted into existence after its big bang birth.

That very ugly fact certainly ruined Hoyle’s beautiful theory and, no doubt, his breakfast when he read about it in his newspaper. But then the pursuit of truth has always been a tricky and cruel business. “It is true that some things come along like that to throw scientists into a tizz but it doesn’t happen very often,” adds Jones. “The trouble is, the BBC thinks it happens every day.”

And this takes us to the nub of the issue: how should science be reported and recorded? How can you take a topic such as climate change, about which there is virtual unanimity of views among scientists, and keep it in the public eye? The dangers of rising greenhouse gas emissions have dramatic implications after all. But simply reporting every tiny shrinkage in polar ice sheets or rise in sea levels will only alienate readers or viewers, a point acknowledged by May. “Newspapers, radio and TV have a duty to engage and there is no point in doing a lot of excellent reporting on a scientific issue if it is boring or trivial. The alternative is to trivialise or distort, thus subordinating substance in the name of attraction. It is a paradox for which I can see no answer.”

Jones agrees. “What we don’t want to do is go back to the days when fawning reporters asked great figures to declaim on scientific issues – or political ones, for that matter. On the other hand, we cannot continue to distort views in the name of balance.” It is a tricky business, but as former Times editor Charlie Wilson once told a member of staff upset at a task’s complexity: “Of course, it’s hard. If it was easy we would get an orang-utan to do it.”

Jones, in highlighting a specific problem for the BBC, has opened up a far wider, far more important issue – the need to find ways to understand how science works and to appreciate its insights and complexities. It certainly won’t be easy.

Can a Candid Climate Modeler Convince Contrarians? (Scientific American)

Intrepid British climate scientist sets out to win over global warming doubters

By Jeremy Lovell and ClimateWire | July 19, 2011

CONVINCING CONTRARIANS: Scientists attempt to win over climate change doubters. Image: Courtesy of NOAA

LONDON — David Stainforth is a brave man. His mission is to try to remove some of the confusion over the climate debate by explaining why uncertainty has to be a part of the computerized climate models that scientists use to forecast the expected impacts of climate change, including more violent storms as well as more flooding and droughts.

Stainforth, a climate modeler and senior research fellow at the London School of Economics, hopes that by coming clean on the degree of difficulty in making such predictions, he and his fellow climate scientists will find it easier to make — and win — the argument that prompt action now is not only necessary but the far cheaper alternative to inaction.

“Governments and people want certainty about what will happen with climate change, so scientists tend to turn to climate modeling. But the models are wrong in so many ways because there are so many uncertainties and unknowns built into them,” Stainforth told ClimateWire here at the Royal Academy’s recent annual Summer Science Exhibition.

“The reason is that they are just that, models, not reality. The bottom line is that they give a quite useful message from science to the adaptation community. But it is all relative and hedged about with qualifications. They give likelihoods not certainties, ranges of probabilities, not absolutes. That is where the discussion then must start, not end,” he added.

It is a bold step to take at a time when the climate skeptics appear to be making the most of the continuing public confusion and denial over the issues shown in repeated polls in the United States and United Kingdom. Skeptics have taken advantage of the revelations of scientific infighting with the leaked emails from the United Kingdom’s University of East Anglia in late 2009. They have also pointed to evidence of some sloppy science by the Intergovernmental Panel on Climate Change to assert that the feared results of climate change may be more fiction than science.

Take that, add the diplomatic bickering and backsliding in international climate change talks, then fold in the news of the continuing global economic crisis and reports that renewable energy will drive up energy costs. You will get a sense that what Stainforth is attempting is a very hard sell.

The ‘trouble’ with climate models

“You can explain in five or 10 minutes why we need to do something about climate change — and do it without using climate models. But it is far harder to persuade people of the degree and speed of what needs to be done without the models, and that is where the trouble starts,” said Stainforth.

“Governments and the media demand certainty. They don’t want uncertainties and probabilities. For example, all our models predict wetter winters and warmer summers, but they are far less certain about wetter or drier summers, and that has major implications for the siting and size of flood defenses,” he explained, referring to dams and levees.

“Climate scientists have moved a long way beyond discussing whether climate change is a threat to our societies and economies. That is settled. But that is not to say they do not still disagree about a lot of things like the design of the models and the degree of change,” he added.

He remains hopeful that the non-scientific public will understand the strong consensus among climate scientists that makes the remaining bickering look small. “There is uncertainty, but there is also probability. By showing and discussing the degree of each in public and with the public, we hope to involve them and therefore get out of the loop and move forward.”

Stainforth’s mission is backed by an array of groups including the United Kingdom’s Natural Environment Research Council, the Economic and Social Research Council and the Centre for Climate Change Economics and Policy as well as the London School of Economics. There is also the Grantham Research Institute on Climate Change and the Environment — headed by Lord Nicholas Stern, whose report on the economics of climate change in 2006 electrified governments worldwide on the issue.

Trying some interactive games

Using literature and interactive games at the Confidence in Climate website, the project sets out to show how probabilities work and why different models may come up with quite widely differing predictions. It then applies this to a composite of theories and observations on the climate conundrum.

“When you make a decision about the future — whether it is based on theory or observation — it is a sort of gamble. You can never know what is going to happen. When we make decisions about how to tackle climate change it is no different,” the website says.

“Because of the uncertainty we can’t be sure exactly what degree of challenge we will face. None the less, some things are clear — uncertainty doesn’t mean ignorance. … We also know that bigger increases in atmospheric greenhouse gas levels are likely to lead to much bigger impacts; the impact of a 4 degree warming is likely to be more than twice the impact of a 2 degree warming,” it adds.
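
A minimal sketch of the sort of exercise the site describes (the model and the parameter range are invented for illustration): running even a trivially simple climate model across a plausible range for one uncertain parameter produces a spread of projections, a range of probabilities rather than a single answer.

```python
import random

# Minimal sketch of the exercise the site describes (model and numbers
# invented): the same trivially simple model, run across a plausible
# range for one uncertain parameter, yields a spread of projections.

random.seed(42)

def warming(sensitivity_per_doubling, co2_doublings=1.0):
    # Equilibrium warming (deg C), deliberately oversimplified.
    return sensitivity_per_doubling * co2_doublings

samples = sorted(warming(random.uniform(2.0, 4.5)) for _ in range(10_000))
low, median, high = samples[500], samples[5_000], samples[9_500]
print(f"5th-95th percentile: {low:.1f} to {high:.1f} C (median {median:.1f} C)")
```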

As for Stainforth, he thinks the debate urgently needs to be widened considerably from the rather restricted inner core of scientists, modelers, meteorologists and statisticians who have monopolized it to date.

“We need ecologists, farmers, doctors, anthropologists, sociologists, engineers, psychologists, hydrologists, social scientists. The climate change problem involves everyone and should therefore include everyone,” he said.

“We have to grasp the nettle here and communicate openly the uncertainty, to explain what is uncertain, where, why and to what degree. We don’t want it split into ‘believers’ and ‘unbelievers’; we want people to understand.”

Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC. http://www.eenews.net, 202-628-6500

On Experts and Global Warming (N.Y. Times)

July 12, 2011, 4:01 PM
By GARY GUTTING

Experts have always posed a problem for democracies. Plato scorned democracy, rating it the worst form of government short of tyranny, largely because it gave power to the ignorant many rather than to knowledgeable experts (philosophers, as he saw it). But, if, as we insist, the people must ultimately decide, the question remains: How can we, nonexperts, take account of expert opinion when it is relevant to decisions about public policy?

To answer this question, we need to reflect on the logic of appeals to the authority of experts. First of all, such appeals require a decision about who the experts on a given topic are. Until there is agreement about this, expert opinion can have no persuasive role in our discussions. Another requirement is that there be a consensus among the experts about points relevant to our discussion. Precisely because we are not experts, we are in no position to adjudicate disputes among those who are. Finally, given a consensus on a claim among recognized experts, we nonexperts have no basis for rejecting the truth of the claim.

These requirements may seem trivially obvious, but they have serious consequences. Consider, for example, current discussions about climate change, specifically about whether there is long-term global warming caused primarily by human activities (anthropogenic global warming or A.G.W.). All creditable parties to this debate recognize a group of experts designated as “climate scientists,” whom they cite in either support or opposition to their claims about global warming. In contrast to enterprises such as astrology or homeopathy, there is no serious objection to the very project of climate science. The only questions are about the conclusions this project supports about global warming.

There is, moreover, no denying that there is a strong consensus among climate scientists on the existence of A.G.W. — in their view, human activities are warming the planet. There are climate scientists who doubt or deny this claim, but even they show a clear sense of opposing a view that is dominant in their discipline. Nonexpert opponents of A.G.W. usually base their case on various criticisms that a small minority of climate scientists have raised against the consensus view. But nonexperts are in no position to argue against the consensus of scientific experts. As long as they accept the expert authority of the discipline of climate science, they have no basis for supporting the minority position. Critics within the community of climate scientists may have a cogent case against A.G.W., but, given the overall consensus of that community, we nonexperts have no basis for concluding that this is so. It does no good to say that we find the consensus conclusions poorly supported. Since we are not experts on the subject, our judgment has no standing.

It follows that a nonexpert who wants to reject A.G.W. can do so only by arguing that climate science lacks the scientific status needed to be taken seriously in our debates about public policy. There may well be areas of inquiry (e.g., various sub-disciplines of the social sciences) open to this sort of critique. But there does not seem to be a promising case against the scientific authority of climate science. As noted, opponents of the consensus on global warming themselves argue from results of the discipline, and there is no reason to think that they would have had any problem accepting a consensus of climate scientists against global warming, had this emerged.

Some nonexpert opponents of global warming have made much of a number of e-mails written and circulated among a handful of climate scientists that they see as evidence of bias toward global warming. But unless this group is willing to argue from this small (and questionable) sample to the general unreliability of climate science as a discipline, they have no alternative but to accept the consensus view of climate scientists that these e-mails do not undermine the core result of global warming.

I am not arguing the absolute authority of scientific conclusions in democratic debates. It is not a matter of replacing Plato’s philosopher-kings with scientist-kings in our polis. We the people still need to decide (perhaps through our elected representatives) which groups we accept as having cognitive authority in our policy deliberations. Nor am I denying that there may be a logical gap between established scientific results and specific policy decisions. The fact that there is significant global warming due to human activity does not of itself imply any particular response to this fact. There remain pressing questions, for example, about the likely long-term effects of various plans for limiting CO2 emissions, the more immediate economic effects of such plans, and, especially, the proper balance between actual present sacrifices and probable long-term gains. Here we still require the input of experts, but we must also make fundamental value judgments, a task that, pace Plato, we cannot turn over to experts.

The essential point, however, is that once we have accepted the authority of a particular scientific discipline, we cannot consistently reject its conclusions. To adapt Schopenhauer’s famous remark about causality, science is not a taxi-cab that we can get in and out of whenever we like. Once we board the train of climate science, there is no alternative to taking it wherever it may go.

New York Times Publishes a Searing Drought Story, But Completely Misses the Climate Change Angle (Climate Central)

Published: July 12th, 2011, Last Updated: July 13th, 2011
By Andrew Freedman

In Monday’s New York Times, Kim Severson and Kirk Johnson wrote an eloquent story on the intense drought that is maintaining a tight grip on a broad swath of America’s southern tier, from Arizona to Florida. Reporting from Georgia, Severson and Johnson detailed the plight of farmers struggling to make ends meet as the parched soil makes it nearly impossible for them to grow crops and feed livestock.

The piece is a great example of how emotionally moving storytelling from a local perspective can convey the consequences of broad issues and trends, in this case, a major drought that has enveloped 14 states. In that sense, it served Times readers extraordinarily well.

However, when it came to providing readers with a thorough understanding of the drought’s causes and aggravating factors, Severson and Johnson left out any mention of the elephant in the room — global climate change — and pinned the entire drought on one factor, La Niña. In this respect, the story was overly simplistic, even downright inaccurate.

Here’s how the story framed the drought’s causes:

From a meteorological standpoint, the answer is fairly simple. “A strong La Niña shut off the southern pipeline of moisture,” said David Miskus, who monitors drought for the National Oceanic and Atmospheric Administration.

The La Niña “lone gunman” theory is problematic from a scientific standpoint. Just last week, Marty Hoerling, the federal government’s top researcher tasked with examining how climate change may be influencing extreme weather and climate events, told reporters that “we cannot reconcile it [the drought] with just the La Niña impact alone, at least not at this time.”

Instead, the causal factors are more nuanced than that, and they do include global warming, since it is changing the background conditions in which such extreme events occur.

During a press conference last week from a drought management meeting in the parched city of Austin, Texas, Hoerling made clear that climate change is already increasing average temperatures across the drought region, and is expected to lead to more frequent and intense droughts in the Southwest. Other research indicates the trend towards a drier Southwest is already taking place. “There are recent regional tendencies toward more severe droughts in the southwestern United States, parts of Canada and Alaska, and Mexico,” stated a 2008 report from the U.S. Global Change Research Program.

As is the case with any extreme weather or climate event now, one cannot truly separate climate change from the mix, considering that droughts, floods, and other extreme events now occur in an environment that has been profoundly altered by human emissions of greenhouse gases, such as carbon dioxide. This doesn’t mean that climate change is causing all of these extreme events, but it does mean that climate change may be increasing the likelihood that some types of events will occur, and may be changing the characteristics of some extreme events, such as by making heat waves more intense.

The fact that the Times story detailed both the drought and the record heat accompanying it, yet left out any mention of climate change, was a particularly puzzling error of omission. Hoerling, for one, pointed to the extreme heat seen during this drought as a possible sign of things to come, as climate change helps produce dangerous combinations of heat and drought.

“We haven’t necessarily dealt with drought and heat at the same time in such a persistent way, and that’s a new condition,” Hoerling said, noting that higher temperatures only hasten the drying of soils.

Many ponds in Texas, such as this one in Rusk County, were nearly dry by late June 2011. Credit: agrilifetoday/flickr.

Texas had its warmest June on record, for example, and on June 26th, Amarillo, Texas recorded its warmest temperature on record for any month, at 111°F. According to the Weather Channel, parts of Oklahoma and Texas have already exceeded their yearly average number of days at or above 100 degrees, including Oklahoma City, Dallas, and Austin. The heat is related to the drought, because when soil moisture is so low, more of the sun’s energy goes towards heating the air directly.
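
That last mechanism can be sketched with a crude surface energy balance (the linear dependence on soil moisture and the numbers are our simplification, not Hoerling’s): the drier the soil, the less absorbed sunlight goes into evaporating water and the more goes directly into heating the air.

```python
# Crude surface energy balance illustrating the drought-heat feedback
# (the linear soil-moisture dependence and numbers are our simplification).

def energy_partition(net_radiation_wm2, soil_moisture_frac):
    """Split absorbed solar energy between evaporating water (latent
    heat) and directly heating the air (sensible heat)."""
    latent = net_radiation_wm2 * soil_moisture_frac
    sensible = net_radiation_wm2 - latent
    return latent, sensible

for moisture in (0.6, 0.3, 0.05):                 # wet soil -> parched soil
    latent, sensible = energy_partition(500, moisture)
    print(f"soil moisture {moisture:.0%}: {sensible:.0f} W/m^2 heats the air")
```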

It’s unfortunate that the Times story, which was a searing portrayal of how a drought can impact communities that are already down on their luck due to economic troubles, did not include at least some discussion on climate change. As I’ve shown here, and climate blogger Joe Romm has also pointed out, there was sufficient evidence to justify raising the climate change topic in that story, and many others like it. After all, if the media doesn’t make an effort to evaluate the evidence on the links between extreme weather and climate change, then how can we expect the public to understand how global warming may affect their lives?

At Climate Central, our scientists are working to better understand whether and how climate change is increasing the likelihood of certain extreme weather events, such as heat waves, while at the same time, our journalists are covering the Southern drought and wildfire situation with the goal of making sure our readers understand what scientific studies show about global warming and extreme events.

This is not an easy task, but it need not be such a lonely one.

Update, July 13: The Times published an editorial on the drought today, which also blames the drought squarely on La Niña-related weather patterns, and makes no mention of climate change impacts or projections.

* * *

EDITORIAL (New York Times)
Suffering in the Parched South
Published: July 12, 2011

Right now, the official drought map of the United States looks as if it has been set on fire and scorched at the bottom edge. Scorched is how much of the Southeast and Southwest feel, in the midst of a drought that is the most extreme since the 1950s and possibly since the Dust Bowl of the 1930s. The government has classified much of this drought as D4, which means exceptional. The outlook through late September shows possible improvement in some places, but in most of Texas, Oklahoma, southern Arkansas, and northern Louisiana and Mississippi the drought is expected to worsen.

Dry conditions began last year and have only intensified as temperatures rose above 100 in many areas. Rain gauges have been empty for months, causing a region-wide search for new underground sources of water as streams and lakes dry up. The drought is produced by a pattern of cooling in the Pacific called La Niña. A cooler ocean means less moisture in the atmosphere, which shuts down the storms shuttling east across the region.

Droughts are measured in dollars as well as degrees. The prospects for cattle and wheat, corn and cotton crops across the South are dire. There is no way yet to estimate the ultimate cost of this drought because there is no realistic estimate of when it will end. Farmers have been using crop insurance payments, and federal relief is available in disaster areas, including much of Texas. But the only real relief will be the end of the dry, hot winds and the beginning of long, settled rains.

* * *

Drought Spreads Pain From Florida to Arizona

Buster Haddock, an agricultural scientist at the University of Georgia, in a field where cotton never had the chance to grow. Credit: Grant Blankenship for The New York Times.

By KIM SEVERSON and KIRK JOHNSON
Published: July 11, 2011

COLQUITT, Ga. — The heat and the drought are so bad in this southwest corner of Georgia that hogs can barely eat. Corn, a lucrative crop with a notorious thirst, is burning up in fields. Cotton plants are too weak to punch through soil so dry it might as well be pavement.

[Multimedia: “Waiting for Rain.” Dangerously Dry – nearly a fifth of the contiguous United States has been faced with the worst drought in recent years. “The Dry Season” – Oklahoma: a simple, if plaintive, message from the residents of Hough, in the panhandle, late last month. Credit: Shawn Yorks/The Guymon Daily Herald, via Associated Press.]

Farmers with the money and equipment to irrigate are running wells dry in the unseasonably early and particularly brutal national drought that some say could rival the Dust Bowl days.

“It’s horrible so far,” said Mike Newberry, a Georgia farmer who is trying to grow cotton, corn and peanuts on a thousand acres. “There is no description for what we’ve been through since we started planting corn in March.”

The pain has spread across 14 states, from Florida, where severe water restrictions are in place, to Arizona, where ranchers could be forced to sell off entire herds of cattle because they simply cannot feed them.

In Texas, where the drought is the worst, virtually no part of the state has been untouched. City dwellers and ranchers have been tormented by excessive heat and high winds. In the Southwest, wildfires are chewing through millions of acres.

Last month, the United States Department of Agriculture designated all 254 counties in Texas natural disaster areas, qualifying them for varying levels of federal relief. More than 30 percent of the state’s wheat fields might be lost, adding pressure to a crop in short supply globally.

Even if weather patterns shift and relief-giving rain comes, losses will surely head past $3 billion in Texas alone, state agricultural officials said.

Most troubling is that the drought, which could go down as one of the nation’s worst, has come on extra hot and extra early. It has its roots in 2010 and continued through the winter. The five months from this February to June, for example, were so dry that they shattered a Texas record set in 1917, said Don Conlee, the acting state climatologist.

Oklahoma has had only 28 percent of its normal summer rainfall, and the heat has blasted past 90 degrees for a month.

“We’ve had a two- or three-week start on what is likely to be a disastrous summer,” said Kevin Kloesel, director of the Oklahoma Climatological Survey.

The question, of course, becomes why. In a spring and summer in which weather news has been dominated by epic floods and tornadoes, it is hard to imagine that more than a quarter of the country is facing an equally daunting but very different kind of natural disaster.

From a meteorological standpoint, the answer is fairly simple. “A strong La Niña shut off the southern pipeline of moisture,” said David Miskus, who monitors drought for the National Oceanic and Atmospheric Administration.

The weather pattern called La Niña is an abnormal cooling of Pacific waters. It usually follows El Niño, which is an abnormal warming of those same waters.

Although a new forecast from the National Weather Service’s Climate Prediction Center suggests that this dangerous weather pattern could revive in the fall, many in the parched regions find themselves in the unlikely position of hoping for a season of heavy tropical storms in the Southeast and drenching monsoons in the Southwest.

Climatologists say the great drought of 2011 is starting to look a lot like the one that hit the nation in the early to mid-1950s. That, too, dried a broad part of the southern tier of states into leather and remains a record breaker.

But this time, things are different in the drought belt. With states and towns short on cash and unemployment still high, the stress on the land and the people who rely on it for a living is being amplified by political and economic forces, state and local officials say. As a result, this drought is likely to have the cultural impact of the great 1930s drought, which hammered an already weakened nation.

“In the ’30s, you had the Depression and everything that happened with that, and drought on top,” said Donald A. Wilhite, director of the school of natural resources at the University of Nebraska in Lincoln and former director of the National Drought Mitigation Center. “The combination of those two things was devastating.”

Although today’s economy is not as bad, many Americans ground down by prolonged economic insecurity have little wiggle room to handle the effects of a prolonged drought. Government agencies are in the same boat.

“Because we overspent, the Legislature overspent, we’ve been cut back and then the drought comes along and we don’t have the resources and federal government doesn’t, and so we just tighten our belt and go on,” said Donald Butler, the director of the Arizona Department of Agriculture.

The drought is having some odd effects, economically and otherwise.

“One of the biggest impacts of the drought is going to be the shrinking of the cattle herd in the United States,” said Bruce A. Babcock, an agricultural economist at Iowa State University in Ames. And that will have a paradoxical but profound impact on the price of a steak.

Ranchers whose grass was killed by drought cannot afford to sustain cattle with hay or other feed, which is also climbing in price. Their response will most likely be to send animals to slaughter early. That glut of beef would lower prices temporarily.

But America’s cattle supply will ultimately be lower at a time when the global supply is already low, potentially resulting in much higher prices in the future.

There are other problems. Fishing tournaments have been canceled in Florida and Mississippi, just two of the states where low water levels have kept recreational users from lakes and rivers. In Texas, some cities are experiencing blackouts because airborne deposits of salt and chemicals are building up on power lines, triggering surges that shut down the system. In times of normal weather, rain usually washes away the environmental buildup. Instead, power company crews in cities like Houston are being dispatched to spray electrical lines.

In this corner of Georgia, where temperatures have been over 100 and rainfall has been off by more than half, fish and wildlife officials are worried over the health of the shinyrayed pocketbook and the oval pigtoe mussels, both freshwater species on the endangered species list.

The mussels live in Spring Creek, which is dangerously low and borders Terry Pickle’s 2,000-acre farm here. He pulls his irrigation from wells that tie into the water system of which Spring Creek is a part.

Whether nature or agriculture is to blame remains a debate in a state that for 20 years has been embroiled in a water war with Alabama and Florida. Meanwhile, Colquitt has allowed the state to drill a special well to pump water back into the creek to save the mussels from extinction.

Most farmers here are much more worried about the crops than the mussels. With cotton and corn prices high, they had high hopes for the season. But many have had to replant fields several times to get even one crop to survive. Others, like Mr. Pickle, have relied on irrigation so expensive that it threatens to eat into any profits.

The water is free, but the system used to get it from the ground runs on diesel fuel. His bill for May and June was an unheard-of $88,442.

Thousands of small stories like that will all contribute to the ultimate financial impact of the drought, which will not be known until it is over. And no one knows when that will be.

The United States Department of Agriculture’s Farm Service Agency has already provided over $75 million in assistance to ranchers nationwide, with most of it going to Florida, New Mexico and Texas. An additional $62 million in crop insurance indemnities have already been provided to help other producers.

Economists say that adding up the effects of drought is far more complicated than, say, those of a hurricane or tornado, which destroy structures that have set values. With drought, a shattered wheat or corn crop is a loss to one farmer, and it has a specific price tag. But all those individual losses punch a hole in the food supply and drive prices up. That is good news for a farmer who manages to get a crop in. The final net costs down the line are thus dispersed, and mostly passed along.

That means grocery shoppers will feel the effects of the drought at the dinner table, where the cost of staples like meat and bread will most likely rise, said Michael J. Roberts, an associate professor of agricultural and resource economics at North Carolina State University in Raleigh, N.C. “The biggest losers are consumers,” he said.
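
A toy calculation shows why consumers end up the biggest losers (the elasticity value is illustrative, not from Roberts): because demand for food staples is inelastic, even a modest supply shortfall forces a much larger price rise to clear the market.

```python
# Toy supply-shock arithmetic (the elasticity value is illustrative,
# not from Roberts): with inelastic demand for staples, a modest supply
# shortfall forces a much larger price rise to clear the market.

def price_rise(supply_cut_frac, demand_elasticity):
    """Fractional price rise after a supply cut, using the
    constant-elasticity relation dQ/Q = elasticity * dP/P."""
    return -supply_cut_frac / demand_elasticity

rise = price_rise(supply_cut_frac=0.10, demand_elasticity=-0.25)
print(f"A 10% supply shortfall implies roughly a {rise:.0%} price rise")  # 40%
```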

Kim Severson reported from Colquitt, Ga., and Kirk Johnson from Denver. Dan Frosch contributed reporting from Denver.

Our Extreme Future: Predicting and Coping with the Effects of a Changing Climate (Scientific American)

Adapting to extreme weather calls for a combination of restoring wetlands and building drains and sewers that can handle the water. But leaders and the public are slow to catch on. Final part of a three-part series

By John Carey | Thursday, June 30, 2011

Editor’s note: This article is the last of a three-part series by John Carey. Part 1, “Storm Warning: Extreme Weather Is a Product of Climate Change,” was posted on June 28. Part 2, “Global Warming and the Science of Extreme Weather,” was posted on June 29.

Extreme weather events have become both more common and more intense. And increasingly, scientists have been able to pin at least part of the blame on humankind’s alteration of the climate. What’s more, the growing success of this nascent science of climate attribution (finding the telltale fingerprints of climate change in extreme events) means that researchers have more confidence in their climate models—which predict that the future will be even more extreme.

Are we prepared for this future? Not yet. Indeed, the trend is in the other direction, especially in Washington, D.C., where a number of members of Congress even argue that climate change itself is a hoax.

Scientists hope that rigorously identifying climate change’s contribution to individual extreme events can indeed wake people up to the threat. As the research advances, it should be possible to say that two extra inches (five centimeters) of rain poured down in a Midwestern storm because of greenhouse gases, or that a California heat wave was 10 times more likely to occur thanks to humans’ impacts on climate. So researchers have set up rapid response teams to assess climate change’s contribution to extreme events while the events are still fresh in people’s minds. In addition, the Intergovernmental Panel on Climate Change (IPCC) is preparing a special report on extreme events and disasters, due out by the end of 2011. “It is important for us to emphasize that climate change and its impacts are not off in the future, but are here and now,” explained Rajendra Pachauri, chair of the IPCC, during a briefing at United Nations climate talks in Cancún last December.

The message is beginning to sink in. The Russian government, for instance, used to doubt the existence of climate change, or argue that it might be beneficial for Russia. But now, government officials have realized that global warming will not bring a gradual and benign increase in temperatures. Instead, they’re likely to see more crippling heat waves. As Russian President Dmitry Medvedev told the Security Council of the Russian Federation last summer: “Everyone is talking about climate change now. Unfortunately, what is happening now in our central regions is evidence of this global climate change, because we have never in our history faced such weather conditions.”

Doubts persist despite evidence

Among the U.S. public, the feeling is different. Opinion polls and anecdotal reports show that most Americans do not perceive a threat from climate change. And a sizable number of Americans, including many newly elected members of Congress, do not even believe that climate change exists. Extreme weather? Just part of nature, they say. After all, disastrous floods and droughts go back to the days of Noah and Moses. Why should today’s disasters be any different? Was the July 23, 2010, storm that spawned Les Scott’s record hailstone evidence of a changing climate, for instance? “Not really,” Scott says. “It was just another thunderstorm. We get awful bad blizzards that are a lot worse.”

And yes, 22 of Maryland’s 23 counties were declared natural disaster areas after record-setting heat and drought in 2010. “It was the worst corn crop I ever had,” says fourth-generation farmer Earl “Buddy” Hance. But was it a harbinger of a more worrisome future? Probably not, says Hance, the state’s secretary of agriculture. “As farmers we are skeptical, and we need to see a little more. And if it does turn out to be climate change, farmers would adapt.” By then, adaptation could be really difficult, frets Minnesota organic farmer Jack Hedin, who laments that his efforts to raise the alarm are “falling on deaf ears.”

Many scientists share Hedin’s worry. “The real honest message is that while there is debate about how much extreme weather climate change is inducing now, there is very little debate about its effect in the future,” says Michael Wehner, staff scientist at Lawrence Berkeley National Laboratory and member of the lead author teams of the interagency U.S. Climate Change Science Program’s Synthesis and Assessment reports on climate extremes. For instance, climate models predict that by 2050 Russia will have warmed up so much that every summer will be as warm as the disastrous heat wave it just experienced, says Richard Seager of Columbia University’s Lamont–Doherty Earth Observatory. In other words, many of today’s extremes will become tomorrow’s everyday reality. “Climate change will throw some significant hardballs at us,” says Martin Hoerling, a research meteorologist at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo. “There will be a lot of surprises that we are not adapted to.”

A dusty future

One of the clearest pictures of this future is emerging for the U.S. Southwest and a similar meteorological zone that stretches across Italy, Greece and Turkey. Work by Tim Barnett of the Scripps Institution of Oceanography, Seager and others predicts that these regions will get hotter and drier—and, perhaps more important, shows that the change has already begun. “The signal of a human influence on climate pops up in 1985, then marches on getting stronger and stronger,” Barnett says. By the middle of the 21st century, the models predict, the climate will be as dry as the seven-year-long Dust Bowl drought of the 1930s or the damaging 1950s drought centered in California and Mexico, Seager says: “In the future the drought won’t last just seven years. It will be the new norm.”

That spells trouble. In the Southwest the main worry is water—water that makes cities like Los Angeles and Las Vegas possible and that irrigates the enormously productive farms of California’s Central Valley. Supplies are already tight. During the current 11-year dry spell, the demand for water from the vast Colorado River system, which provides water to 30 million people and irrigates four million acres (1.6 million hectares) of cropland, has exceeded the supply. The result: water levels in the giant Lake Mead reservoir dropped to a record low in October (before climbing one foot, or 30 centimeters, after torrential winter rains in California reduced the demand for Colorado River water). Climate change will just make the problem worse. “The challenge will be great,” says Terry Fulp, deputy regional director of the U.S. Department of the Interior’s Bureau of Reclamation’s Lower Colorado Region. “I rank climate change as probably my largest concern. When I’m out on my boat on Lake Mead, it’s on my mind all the time.”

The Southwest is just a snapshot of the challenges ahead. Imagine the potential peril to regions around the world, scientists say. “Our civilization is based on a stable base climate—it doesn’t take very much change to raise hell,” Scripps’s Barnett says. And given the lag in the planet’s response to the greenhouse gases already in the atmosphere, many of these changes are coming whether we like them or not. “It’s sort of like that Kung Fu guy who said, ‘I’m going to kick your head off now, and there’s not a damn thing you can do about it,'” Barnett says.

Grassroots action

Although efforts to fight climate change are now stalled in Washington, many regions do see the threat and are taking action both to adapt to the future changes and to try to limit the amount of global warming itself. The Bureau of Reclamation’s Lower Colorado Region office, for instance, has developed a plan to make “manageable” cuts in the amounts of water that the river system supplies, which Fulp hopes will be enough to get the region through the next 15 years. In Canada, after experiencing eight extreme storms (of more than one-in-25-year intensity) between 1986 and 2006, Toronto has spent hundreds of millions of dollars to upgrade its sewer and storm water system for handling deluges. “Improved storm drains are the cornerstone of our climate adaptation policy,” explains Michael D’Andrea, Toronto’s director of water infrastructure management.

In Iowa, even without admitting that climate change is real, farmers are acting as if it is, spending millions of dollars to alter their practices. They are adding tile drainage to their fields to cope with increased floods, buying bigger machinery to move more quickly because their planting window has become shorter, planting a month earlier than they did 50 years ago, and sowing twice as many corn plants per acre to exploit the additional moisture, says Gene Takle, professor of meteorology at Iowa State University in Ames. “Iowa’s floods are in your face—and in your basement—evidence that the climate has changed, and the farmers are adapting,” he says.

Local officials have seen the connection, too. After the huge floods of 2008, the Iowa town of Cedar Falls passed an ordinance requiring that anyone who lives in the 500-year flood plain must have flood insurance—up from the previous 200-year flood requirement. State Sen. Robert Hogg wants to make the policy statewide. He also is pushing to restore wetlands that can help soak up floodwaters before they devastate cities. “Wetland restoration costs money, but it’s cheaper than rebuilding Cedar Rapids,” he says. “I like to say that dealing with climate change is not going to require the greatest sacrifices, but it is going to require the greatest foresight Americans have ever had.”
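A “500-year flood plain,” like the “100-year flood” standard elsewhere in this series, is a statement of annual probability (a 1-in-500 chance of flooding in any given year), not a schedule. The short Python sketch below is our own illustration, assuming independent years and stationary odds; it shows how likely at least one such flood is over a 30-year mortgage, and how quickly the odds grow if a wetter climate shortens the return period.

def prob_at_least_one(return_period_years, span_years):
    """Chance of at least one 1-in-N-year flood over a span of years,
    assuming each year is independent with annual probability 1/N."""
    return 1 - (1 - 1.0 / return_period_years) ** span_years

for rp in (500, 200, 100):
    p = prob_at_least_one(rp, 30)
    print(f"1-in-{rp}-year flood, 30-year window: {p:.0%} chance of at least one")
# Roughly 6% for a 500-year plain, 14% for 200, 26% for 100: shrinking
# return periods sharply raise the odds a homeowner will see a flood.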

Right now, that foresight looks more like myopia, many scientists worry. So when and how will people finally understand that far more is needed? It may require more flooded basements, more searing heat waves, more water shortages or crop failures, more devastating hurricanes or other examples of the increases in extreme weather that climate change will bring. “I don’t want to root for bad things to happen, but that’s what it will take,” says one government scientist who asked not to be identified. Or as Nashville resident Rich Hays says about his own experience with the May 2010 deluge: “The flood was definitely a wake-up call. The question is: How many wake-up calls do we need?”

Reporting for this story was funded by the Pew Center on Global Climate Change.

Global Warming and the Science of Extreme Weather (Scientific American)

How rising temperatures change weather and produce fiercer, more frequent storms. Second of a three-part series

By John Carey | Wednesday, June 29, 2011

HURRICANE KATRINA battered New Orleans in 2005. Image: NOAA

Editor’s note: This article is the second of a three-part series by John Carey. Part 1, posted on June 28, is “Storm Warnings: Extreme Weather Is a Product of Climate Change”.

Extreme floods, prolonged droughts, searing heat waves, massive rainstorms and the like don’t just seem like they’ve become the new normal in the last few years—they have become more common, according to data collected by reinsurance company Munich Re (see Part 1 of this series). But has this increase resulted from human-caused climate change or just from natural climatic variations? After all, recorded floods and droughts go back to the earliest days of mankind, before coal, oil and natural gas made the modern industrial world possible.

Until recently scientists had only been able to say that more extreme weather is “consistent” with climate change caused by greenhouse gases that humans are emitting into the atmosphere. Now, however, they can begin to say that the odds of having extreme weather have increased because of human-caused atmospheric changes—and that many individual events would not have happened in the same way without global warming. The reason: The signal of climate change is finally emerging from the “noise”—the huge amount of natural variability in weather.

Scientists compare the normal variation in weather with rolls of the dice. Adding greenhouse gases to the atmosphere loads the dice, increasing odds of such extreme weather events. It’s not just that the weather dice are altered, however. As Steve Sherwood, co-director of the Climate Change Research Center at the University of New South Wales in Australia, puts it, “it is more like painting an extra spot on each face of one of the dice, so that it goes from 2 to 7 instead of 1 to 6. This increases the odds of rolling 11 or 12, but also makes it possible to roll 13.”
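Sherwood’s analogy is easy to check by brute force. The following sketch is ours, not the article’s: a minimal Python simulation comparing a fair pair of dice with a pair in which one die has an extra spot painted on every face, so it reads 2 through 7.

import random

def roll_pair(loaded=False):
    """Roll two dice; if loaded, one die reads 2-7 instead of 1-6."""
    a = random.randint(1, 6)
    b = random.randint(1, 6)
    if loaded:
        b += 1  # the extra spot painted on each face of one die
    return a + b

def tail_probability(threshold, loaded, trials=200_000):
    """Estimate the chance that a roll totals at least `threshold`."""
    hits = sum(roll_pair(loaded) >= threshold for _ in range(trials))
    return hits / trials

for label, loaded in (("fair", False), ("loaded", True)):
    print(label,
          "P(total >= 11):", round(tail_probability(11, loaded), 3),
          "P(total >= 13):", round(tail_probability(13, loaded), 3))
# The loaded pair roughly doubles the odds of rolling 11 or better
# (6/36 versus 3/36) and makes 13 possible at all (1/36 versus 0).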

Why? Basic physics is at work: The planet has already warmed roughly 1 degree Celsius since preindustrial times, thanks to CO2 and other greenhouse gases emitted into the atmosphere. And for every 1-degree C (1.8 degrees Fahrenheit) rise in temperature, the amount of moisture that the atmosphere can contain rises by 7 percent, explains Peter Stott, head of climate monitoring and attribution at the U.K. Met Office’s Hadley Center for Climate Change. “That’s quite dramatic,” he says. In some places, the increase has been much larger. Data gathered by Gene Takle, professor of meteorology at Iowa State University in Ames, show a 13 percent rise in summer moisture over the past 50 years in the state capital, Des Moines.
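The 7 percent figure follows from the Clausius-Clapeyron relation. As a rough illustration, the few lines of Python below use Bolton’s standard approximation to that relation (our choice of formula, not one given in the article) to reproduce the scaling.

import math

def saturation_vapor_pressure(t_celsius):
    """Bolton's approximation to the Clausius-Clapeyron relation (hPa)."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# Moisture-holding capacity gained per degree of warming, at several
# starting temperatures:
for t in (5.0, 15.0, 25.0):
    gain = saturation_vapor_pressure(t + 1) / saturation_vapor_pressure(t) - 1
    print(f"{t:.0f} C -> {t + 1:.0f} C: +{gain * 100:.1f}% capacity")
# Prints roughly +6 to +7 percent per degree, matching Stott's figure.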

The physics of too much rain

The increased moisture in the atmosphere inevitably means more rain. That’s obvious. But not just any kind of rain, the climate models predict. Because of the large-scale energy balance of the planet, “the upshot is that overall rainfall increases only 2 to 3 percent per degree of warming, whereas extreme rainfall increases 6 to 7 percent,” Stott says. The reason again comes from physics. Rain happens when the atmosphere cools enough for water vapor to condense into liquid. “However, because of the increasing amount of greenhouse gases in the troposphere, the radiative cooling is less efficient, as less radiation can escape to space,” Stott explains. “Therefore the global precipitation increases less, at about 2 to 3 percent per degree of warming.” But because of the extra moisture, when precipitation does occur (in both rain and snow), it’s more likely to be in bigger events.
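Compounded over several degrees of warming, Stott’s two rates diverge quickly. A toy calculation (our own, using the midpoints of the ranges he quotes) makes the gap explicit.

# Midpoints of the rates quoted above: 2.5% per degree C for mean
# rainfall, 6.5% per degree C for extreme rainfall (assumed values).
MEAN_RATE, EXTREME_RATE = 0.025, 0.065

for warming in (1, 2, 3):
    mean_gain = (1 + MEAN_RATE) ** warming - 1
    extreme_gain = (1 + EXTREME_RATE) ** warming - 1
    print(f"+{warming} C: mean rainfall +{mean_gain:.0%}, "
          f"extreme rainfall +{extreme_gain:.0%}")
# At +3 C the midpoint rates give mean +8% but extremes +21%: more and
# more of the rain arrives in the biggest events.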

Iowa is one of many places that fit the pattern. Takle documented a three- to sevenfold increase in high rainfall events in the state, including the 500-year Mississippi River flood in 1993, the 2008 Cedar Rapids flood, as well as the 500-year event in 2010 in Ames, which inundated the Hilton Coliseum basketball court in eight feet (2.5 meters) of water. “We can’t say with confidence that the 2010 Ames flood was caused by climate change, but we can say that the dice are loaded to bring more of these events,” Takle says.

And more events seem to be in the news every month, from unprecedented floods in Riyadh, Saudi Arabia, to massive snowstorms that crippled the U.S. Northeast in early 2011, to the November 2010 to January 2011 torrents in Australia that flooded an area the size of Germany and France. This “disaster of biblical proportions,” as local Australian officials called it, even caused global economic shock waves: The flooding of the country’s enormously productive coal mines sent world coal prices soaring.

More stormy weather

More moisture and energy in the atmosphere, along with warmer ocean temperatures, also mean more intense hurricanes, many scientists say. In fact, 2010 was the first year in decades in which two simultaneous category 4 hurricanes, Igor and Julia, formed in the Atlantic Ocean. In addition, the changed conditions bring an increased likelihood of more powerful thunderstorms with violent updrafts, like a July 23, 2010, tempest in Vivian, S.D., that produced hailstones that punched softball-size holes through roofs—and created a behemoth ball of ice measured at a U.S. record 8 inches (20 centimeters) in diameter even after it had partially melted. “I’ve never seen a storm like that before—and hope I’ll never go through anything like it,” says Les Scott, the Vivian farmer and rancher who found the hailstone.

Warming the planet alters large-scale circulation patterns as well. Scientists know that the sun heats moist air at the equator, causing the air to rise. As it rises, the air cools and sheds most of its moisture as tropical rain. Once six to 10 miles (9.5 to 16 kilometers) aloft, the now dry air travels toward the poles, descending when it reaches the subtropics, normally at the latitude of the Baja California peninsula. This circulation pattern, known as a Hadley cell, contributes to desertification, trade winds and the jet stream.

On a warmer planet, however, the dry air will travel farther north and south from the equator before it descends, climate models predict, making areas like the U.S. Southwest and the Mediterranean even drier. Such an expanded Hadley cell would also divert storms farther north. Are the models right? Richard Seager of Columbia University’s Lamont–Doherty Earth Observatory has been looking for a climate change–induced drying trend in the Southwest, “and there seems to be some tentative evidence that it is beginning to happen,” he says. “It gives us confidence in the models.” In fact, other studies show that the Hadley cells have not only expanded, they’ve expanded more than the models predicted.

Such a change in atmospheric circulation could explain both the current 11-year drought in the Southwest and Minnesota’s status as the number one U.S. state for tornadoes last year. On October 26, 2010, the Minneapolis area even experienced record low pressure in what Paul Douglas, founder and CEO of WeatherNation in Minnesota, dubbed a “landicane”—a hurricanelike storm that swept across the country. “I thought the windows of my home would blow in,” Douglas recalls. “I’ve chased tornadoes and flown into hurricanes but never experienced anything like this before.” Yet it makes sense in the context of climate change, he adds. “Every day, every week, another piece of the puzzle falls into place,” he says. “More extreme weather seems to have become the rule, not just in the U.S. but in Europe and Asia.”

The rise of climate attribution

Is humankind really responsible? That’s where the burgeoning field of climate attribution, pioneered by Hadley’s Peter Stott and other scientists, comes in. The idea is to look for trends in the temperature or precipitation data that provide evidence of overall changes in climate. When those trends exist, it then becomes possible to calculate how much climate change has contributed to extreme events. Or in more technical terms, the probability of a particular temperature or rainfall amount is shaped roughly like a bell curve. A change in climate shifts the whole curve. That, in turn, increases the likelihood of experiencing the more extreme weather at the tail end of the bell curve. Whereas day-to-day weather remains enormously variable, the underlying human-caused shift in climate increases the power and number of the events at the extreme. The National Oceanic and Atmospheric Administration’s (NOAA) Deke Arndt puts it more colorfully: “Weather throws the punches, but climate trains the boxer,” he says. By charting the overall shift, then, it’s possible to calculate the increased chances of extreme events due to global warming.
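The shifted-bell-curve argument is simple to quantify. In the sketch below the numbers are purely illustrative (ours, not NOAA’s): summer temperatures at a hypothetical station are taken to be normally distributed with a mean of 25 degrees C and a standard deviation of 2 degrees C, and the whole curve is then shifted 1 degree warmer.

from statistics import NormalDist

baseline = NormalDist(mu=25, sigma=2)  # climate before the shift
shifted = NormalDist(mu=26, sigma=2)   # whole curve moved 1 C warmer

threshold = 30.0  # an "extreme" reading, 2.5 sigma above the old mean
p_before = 1 - baseline.cdf(threshold)
p_after = 1 - shifted.cdf(threshold)
print(f"P(T > {threshold} C): {p_before:.4f} before, {p_after:.4f} after, "
      f"a factor of {p_after / p_before:.1f}")
# A modest 1 C shift of the mean makes this particular extreme almost
# four times more likely: the tail effect described above.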

This idea was already in the air in 2003 when Stott traveled through the worst heat wave in recorded European history on a wedding anniversary trip to Italy and Switzerland. One of the striking consequences he noticed was that the Swiss mountains were missing their usual melodious tinkling of cowbells. “There was no water in the mountains, and the farmers had to take all their cows down in the valley,” he says. He decided to see if he could pin part of the blame on climate change after he returned to his office in Exeter, England. “I didn’t expect to get a positive result,” he says.

But he did. In fact, the signal of a warming climate was quite clear in Europe, even using data up to only 2000. In a landmark paper in Nature, Stott and colleagues concluded that the chances of a heat wave like the 2003 event have more than doubled because of climate change. (Scientific American is part of Nature Publishing Group.) Data collected since then show that the odds are at least four times higher compared with preindustrial days. “We are very aware of the risks of misattribution,” Stott says. “We don’t want to point to specific events and say that they are part of climate change when they really are due to natural variability. But for some events, like the 2003 heat wave, we have the robust evidence to back it up.”
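Risk ratios like these map directly onto the “fraction of attributable risk” commonly used in attribution work, FAR = 1 - 1/RR. The calculation below applies that standard formula to the numbers quoted above; the formula is added by us for context, not spelled out in the article.

def fraction_attributable_risk(risk_ratio):
    """FAR = 1 - P(event without warming) / P(event with warming)."""
    return 1 - 1 / risk_ratio

# Risk ratios quoted for the 2003 European heat wave:
for rr in (2, 4):
    far = fraction_attributable_risk(rr)
    print(f"odds x{rr}: {far:.0%} of the event's risk attributable "
          "to climate change")
# Doubled odds put half the risk on climate change; quadrupled odds, 75%.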

Case in point: Hurricane Katrina

Another event with a clear global warming component, says Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research (NCAR) in Boulder, Colo., was Hurricane Katrina. Trenberth calculated that the combination of overall planetary warming, elevated moisture in the atmosphere and higher sea-surface temperatures meant that “4 to 6 percent of the precipitation—an extra inch [2.5 centimeters] of rain—in Katrina was due to global warming,” he says. “That may not sound like much, but it could be the straw that breaks the camel’s back or causes a levee to fail.” It was also a very conservative estimate. “The extra heat produced as moisture condenses can invigorate a storm, and at a certain point, the storm just takes off,” he says. “That would certainly apply to Nashville.” So climate change’s contribution to Katrina could have been twice as high as his calculations show, he says. Add higher winds to the extra energy, and it is easy to see how storms can become more damaging.
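Trenberth’s percentages can be sanity-checked by inverting them: if one extra inch of rain was 4 to 6 percent of Katrina’s total, the implied storm total follows from simple division. The arithmetic below is ours, not a figure from the article.

# If 1 extra inch of rain represents 4-6% of the storm total,
# the implied total rainfall is 1 / fraction.
EXTRA_INCHES = 1.0
for fraction in (0.04, 0.06):
    total = EXTRA_INCHES / fraction
    print(f"{fraction:.0%} share -> total of about {total:.0f} inches "
          f"({total * 2.54:.0f} cm)")
# A 4% share implies roughly 25 inches of rain; 6% implies roughly 17.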

This science of attribution is not without controversies. Another case in point: the 2010 Russian heat wave, which wiped out one quarter of the nation’s wheat crop and darkened the skies of Moscow with smoke from fires. The actual meteorological cause is not in doubt. “There was a blocking of the atmospheric circulation,” explains Martin Hoerling, a research meteorologist at NOAA’s Earth System Research Laboratory, also in Boulder. “The jet stream shifted north, bringing a longer period of high pressure and stagnant weather conditions.” But what caused the blocking? Hoerling looked for an underlying long-term temperature trend in western Russia that might have increased the odds of a heat wave, as Stott had done for the 2003 European event. He found nothing. “The best explanation is a rogue black swan—something that came out of the blue,” he says.

Wrong, retorts NCAR’s Trenberth. He sees a clear expansion of the hot, dry Mediterranean climate into western Russia that is consistent with climate change predictions—and that also intensified the Pakistan monsoon. “I completely repudiate Marty—and it doesn’t help to have him saying you can’t attribute the heat wave to climate change,” he says. “What we can say is that, as with Katrina, this would not have happened the same way without global warming.”

Yet even this dispute is smaller than it first appears. What is not in doubt is that the Russian heat wave is a portent—a glimpse of the future predicted by climate models. Even Hoerling sees it as a preview of coming natural disasters. By 2080, such events are expected to happen, on average, once every five years, he says: “It’s a good wake-up call. This type of phenomenon will become radically more common.”

Storm Warnings: Extreme Weather Is a Product of Climate Change (Scientific American)

More violent and frequent storms, once merely a prediction of climate models, are now a matter of observation. Part 1 of a three-part series

By John Carey | Tuesday, June 28, 2011

DROWNING: The Souris River overflowed levees in Minot, N.D., as seen here on June 23. Image: Patrick Moes/U.S. Army Corps of Engineers

In North Dakota the waters kept rising. Swollen by more than a month of record rains in Saskatchewan, the Souris River topped its all-time record high, set back in 1881. The floodwaters poured into Minot, North Dakota’s fourth-largest city, and spread across thousands of acres of farms and forests. More than 12,000 people were forced to evacuate. Many lost their homes to the floodwaters.

Yet the disaster unfolding in North Dakota might be making even bigger headlines if such extreme events hadn’t suddenly begun to seem so common. In this year alone massive blizzards have struck the U.S. Northeast, tornadoes have ripped through the nation, mighty rivers like the Mississippi and Missouri have flowed over their banks, and floodwaters have covered huge swaths of Australia as well as displaced more than five million people in China and devastated Colombia. And this year’s natural disasters follow on the heels of a staggering litany of extreme weather in 2010, from record floods in Nashville, Tenn., and Pakistan, to Russia’s crippling heat wave.

These patterns have caught the attention of scientists at the National Climatic Data Center in Asheville, N.C., part of the National Oceanic and Atmospheric Administration (NOAA). They’ve been following the recent deluges’ stunning radar pictures and growing rainfall totals with concern and intense interest. Normally, floods of the magnitude now being seen in North Dakota and elsewhere around the world are expected to happen only once in 100 years. But one of the predictions of climate change models is that extreme weather—floods, heat waves, droughts, even blizzards—will become far more common. “Big rain events and higher overnight lows are two things we would expect with [a] warming world,” says Deke Arndt, chief of the center’s Climate Monitoring Branch. Arndt’s group had already documented a stunning rise in overnight low temperatures across the U.S. So are the floods and spate of other recent extreme events also examples of predictions turned into cold, hard reality?

Increasingly, the answer is yes. Scientists used to say, cautiously, that extreme weather events were “consistent” with the predictions of climate change. No more. “Now we can make the statement that particular events would not have happened the same way without global warming,” says Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research (NCAR) in Boulder, Colo.

That’s a profound change—the difference between predicting something and actually seeing it happen. The reason is simple: The signal of climate change is emerging from the “noise”—the huge amount of natural variability in weather.

Extreme signals

There are two key lines of evidence. First, it’s not just that we’ve become more aware of disasters like the North Dakota floods or last year’s Nashville flood, which caused $13 billion in damage, or the massive 2010 summer monsoon in Pakistan that killed 1,500 people and left 20 million more homeless. The data show that the number of such events is rising. Munich Re, one of the world’s largest reinsurance companies, has compiled the world’s most comprehensive database of natural disasters, reaching all the way back to the eruption of Mount Vesuvius in A.D. 79. Researchers at the company, which obviously has a keen financial interest in trends that increase insurance risks, add 700 to 1,000 natural catastrophes to the database each year, explains Mark Bove, senior research meteorologist in Munich Re’s catastrophe risk management office in Princeton, N.J. The data indicate a small increase in geologic events like earthquakes since 1980 because of better reporting. But the increase in the number of climate disasters is far larger. “Our figures indicate a trend towards an increase in extreme weather events that can only be fully explained by climate change,” says Peter Höppe, head of Munich Re’s Geo Risks Research/Corporate Climate Center. “It’s as if the weather machine had changed up a gear.”

The second line of evidence comes from a nascent branch of science called climate attribution. The idea is to examine individual events like a detective investigating a crime, searching for telltale fingerprints of climate change. Those fingerprints are showing up—in the autumn floods of 2000 in England and Wales that were the worst on record, in the 2003 European heat wave that caused 14,000 deaths in France, in Hurricane Katrina—and, yes, probably even in Nashville. This doesn’t mean that the storms or hot spells wouldn’t have happened at all without climate change, but as scientists like Trenberth say, they wouldn’t have been as severe if humankind hadn’t already altered the planet’s climate.

This new science is still controversial. There’s an active debate among researchers about whether the Russian heat wave bears the characteristic signature of climate change or whether it was just natural variability, for instance. Some scientists worry that trying to attribute individual events to climate change is counterproductive in the larger political debate, because it’s so easy to dismiss the claim by saying that the planet has always experienced extreme weather. And some researchers who privately are convinced of the link are reluctant to say so publicly, because global warming has become such a target of many in Congress.

But the evidence is growing for a link between the emissions of modern civilization and extreme weather events. And that has the potential to profoundly alter the perception of the threats posed by climate change. No longer is global warming an abstract concept, affecting faraway species, distant lands or generations far in the future. Instead, climate change becomes personal. Its hand can be seen in the ruined corn crop of a Maryland farmer, when soaring temperatures shut down pollination, or in the $13 billion in damage in Nashville, with the Grand Ole Opry flooded and sodden homes reeking of rot. “All of a sudden we’re not talking about polar bears or the Maldives any more,” says Nashville-based author and environmental journalist Amanda Little. “Climate change translates into mold on my baby’s crib. We’re talking about homes and schools and churches and all the places that got hit.”

Drenched in Nashville

Indeed, the record floods in Nashville in May 2010 show how quickly extreme weather can turn ordinary life into a nightmare. The weekend began innocuously. The forecast was a 50 percent chance of rain. Musician Eric Normand and his wife, Kelly, were grateful that the weather event they feared, a tornado, wasn’t anticipated. Eric’s Saturday concert in a town south of Nashville should go off without a hitch, he figured.

He was wrong. On Saturday, it rained—and rained. “It was a different kind of rain than any I had experienced in my whole life,” says Nashville resident Rich Hays. Imagine the torrent from an intense summer thunderstorm, the sort of deluge that prompts you to duck under an underpass for a few minutes until the rain stops and it’s safe to go on, Little says. It was like that, she recalls—except that on this weekend in May 2010 it didn’t stop. Riding in the bus with his fellow musicians, Normand “looked through a window at a rain-soaked canopy of green and gray,” he wrote later. Scores of cars were underwater on the roads they had just traveled. A gig that should have been a short bus trip turned into a 14-hour ordeal, “one of the most stressful and terrifying we had ever experienced,” Normand says.

And still it rained—more than 13 inches (33 centimeters) that weekend. The water rose in Little’s basement—one foot, two feet, three feet (one meter) deep. “You get this panicky feeling that things are out of control,” she says. Over at Hays’s home, fissures appeared in the basement floor, and streams of water turned into a “full-on river,” Hays recalls. Then in the middle of the night, “I heard this massive crack, almost like an explosion,” he says. The force of the water had fractured the house’s concrete foundation. He and his wife spent the rest of the night in fear that the house might collapse.

Sunday morning, Normand went out in the deluge to ask his neighbor if he knew when the power might go back on—it was then he realized that his normal world had vanished. A small creek at the bottom of the hill was now a lake one-half mile (0.8 kilometer) wide, submerging homes almost up to their second stories. “My first reaction was disbelief,” Normand says. He and his family were trapped, without power and surrounded by flooded roads. “We were just freaked out,” he recalls.

And all across the flooded city the scenes were surreal, almost hallucinatory, Little says. “There were absurdities heaped upon absurdities. Churches lifted off foundations and floating down streets. Cars floating in a herd down highways.” In her own basement her family’s belongings bobbed like debris in a pond.

By the time the deluge ended, more than 13 inches (33 centimeters) of rain had fallen, as recorded at Nashville’s airport. The toll: 31 people dead, more than $3 billion in damage—and an end to the cherished perception that Nashville was safe from major weather disasters. “A community that had never been vulnerable to this incredible force of nature was literally taken by storm,” Little says.

But can the Nashville deluge, the North Dakota floods and the many other extreme weather events around the world be connected with the greenhouse gases that humans have spewed into the atmosphere? Increasingly the answer seems to be yes. Whereas it will never be possible to say that any particular event was caused by climate change, new science is teasing out both the contributions that it makes to individual events—and the increase in the odds of extreme weather occurring as a result of climate change.

Northeast loses a fifth of its water reservoirs in 2010 (FSP)

JC e-mail 4304, July 20, 2011.

Report singles out the basins of the semiarid region as the most critical.

Between October 2009 and October 2010, Brazil’s Northeast region lost 20% of the reservoir water it held in the previous period, according to ANA (Agência Nacional de Águas, the National Water Agency). The figure comes from a report on the country’s water resources, published yesterday and available at http://bit.ly/pnZBqo. According to the agency, the region’s loss of stored water is due to lower rainfall.

The region is home to the basins of the semiarid zone, one of the critical spots for water resources, according to the report. Also classified as critical are the basin of the Meia Ponte River, in the Center-West, and that of the Tietê, in the Southeast.

The classification takes into account the availability and use of water, along with the presence or absence of native vegetation and how solid waste is handled locally. According to Environment Minister Izabella Teixeira, the idea is to use the report’s data to “focus efforts on the critical areas.”

The minister singled out the expansion of sanitation services as a priority, especially in towns of up to 50,000 inhabitants. The worst water-quality index is found in densely urbanized areas.
(Folha de São Paulo)

Rivers in dire condition (O Globo)

JC e-mail 4304, July 20, 2011.

Brazil has only 4% of its water resources rated excellent in quality, report finds.

Although it holds 12% of the planet’s water supply, Brazil has only 4% of its water resources rated excellent in quality, a share that fell six percentage points from 2008 to 2009. According to the “Informe 2011 da Conjuntura dos Recursos Hídricos do Brasil” (2011 Report on the State of Brazil’s Water Resources), released yesterday by the Agência Nacional de Águas (ANA), one hundred rivers are in bad or dire condition.

To calculate the water-quality index, the agency uses nine parameters, which mainly capture the contamination of rivers by sewage discharge. Those hundred rivers in precarious condition can no longer naturally break down the volume of waste they have been receiving. Although the government argues that it has been investing in public sanitation policies, more than half of the country’s municipalities (2,926 of them) have no sewage treatment. The report notes that R$21.4 billion was invested in sanitation and water management in 2009, R$13.2 billion of it in sewage-treatment works.

The worst-quality water is concentrated near the metropolitan regions of São Paulo, Curitiba, Belo Horizonte, Porto Alegre, Rio de Janeiro and Salvador, and near midsize cities such as Campinas (SP) and Juiz de Fora (MG). Rivers with water of bad or dire quality include the Tietê, which crosses the São Paulo state capital; the Iguaçu, which forms the famous Iguaçu Falls; and the Guandu-Mirim, in Rio de Janeiro. The last two lie within conservation units, the Iguaçu National Park and the Rio Guandu Environmental Protection Area (APA), respectively.

Between 2008 and 2009, the share of the country’s water rated dire held steady at 2%; bad rose from 6% to 7%; fair went from 12% to 16%; and good rose from 70% to 71%. Over the same period, the number of monitoring points fell from 1,812 to 1,747. Ney Maranhão, the agency’s superintendent of Water Resources Planning, said he was pleased with the study’s results.

“We have 90.6% of rivers in a satisfactory state of quality and availability (quantity of water). Only 2% show an unsatisfactory result,” said Maranhão, who coordinated the study.

Water stress and agriculture – Maranhão stressed that public policies have been directed at the basins in critical condition, whether for low availability or low quality of water. Most of the rivers and basins with supply problems are in the Northeast.

Environment Minister Izabella Teixeira said that, in the future, water stress (the lack of water in some regions of the country) will take a toll on agriculture. In all, 69% of the water consumed by the population goes to irrigation. Teixeira used the occasion to send a message to Congress, where a reform of the Forest Code is under debate.

“When we discuss the Forest Code, we are not talking only about land use. We are talking about water resources and quality of life. The report very aptly ties water stress to the loss of riparian forest (native vegetation along riverbanks). Wherever riparian forest is cleared, water resources are compromised,” the minister said.

ANA’s survey also took into account climate change, which was responsible for extreme natural events at different times last year: the drought in the Amazon; the floods in Alagoas, Pernambuco and Minas Gerais; the high waters in Rio de Janeiro, São Paulo and Rio Grande do Sul. One example of how the situation has worsened: in 2006, 135 states of emergency or public calamity were declared because of heavy rains. In 2010, that number rose to 601. In all, nearly 10% of Brazilian municipalities (563) declared a state of emergency because of floods, inundations and flash flooding.

In the case of droughts, the trend reversed: 2010 recorded fewer emergencies (583) than 2006 (914). Between 2009 and 2010, the level of the reservoirs built in the Northeast to fight drought fell by 20.8%.
(O Globo)

La Niña’s Exit Leaves Climate Forecasts in Limbo (NASA)

06.29.11

The latest satellite data of Pacific Ocean sea surface heights from the NASA/European Ocean Surface Topography Mission/Jason-2 satellite show near-normal conditions in the equatorial Pacific. The image is based on the average of 10 days of data centered on June 18, 2011. Higher (warmer) than normal sea surface heights are indicated by yellows and reds, while lower (cooler) than normal sea surface heights are depicted in blues and purples. Green indicates near-normal conditions. Image credit: NASA/JPL Ocean Surface Topography Team

It’s what Bill Patzert, a climatologist and oceanographer at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., likes to call a “La Nada” – that puzzling period between cycles of the El Niño-Southern Oscillation climate pattern in the Pacific Ocean when sea surface heights in the equatorial Pacific are near average.

The comings and goings of El Niño and La Niña are part of a long-term, evolving state of global climate, for which measurements of sea surface height are a key indicator. For the past three months, since last year’s strong La Niña event dissipated, data collected by the U.S.-French Ocean Surface Topography Mission (OSTM)/Jason-2 oceanography satellite have shown that the equatorial Pacific sea surface heights have been stable and near average. Elsewhere, however, the northeastern Pacific Ocean remains quite cool, with sea levels much lower than normal. The presence of cool ocean waters off the U.S. West Coast has also been a factor in this year’s cool and foggy spring there.

The current state of the Pacific is shown in this OSTM/Jason-2 image, based on the average of 10 days of data centered on June 18, 2011. The image depicts places where Pacific sea surface height is higher (warmer) than normal as yellow and red, while places where the sea surface is lower (cooler) than normal are shown in blue and purple. Green indicates near-normal conditions. Sea surface height is an indicator of how much of the sun’s heat is stored in the upper ocean.

For oceanographers and climate scientists like Patzert, “La Nada” conditions can bring with them a high degree of uncertainty. While some forecasters (targeting the next couple of seasons) have suggested La Nada will bring about “normal” weather conditions, Patzert cautions that previous protracted La Nadas have often delivered unruly jet stream patterns and wild weather swings.

In addition, some climatologists are pondering whether a warm El Niño pattern (which often follows La Niña) may be lurking over the horizon. Patzert says that would be perfectly fine for the United States.

“For the United States, there would be some positives to the appearance of El Niño this summer,” Patzert said. “The parched and fire-ravaged southern tier of the country would certainly benefit from a good El Niño soaking. Looking ahead to late August and September, El Niño would also tend to dampen the 2011 hurricane season in the United States. We’ve had enough wild and punishing weather this year. Relief from the drought across the southern United States and a mild hurricane season would be very welcome.”

Jason-2 scientists will continue to monitor Pacific Ocean sea surface heights for signs of El Niño, La Niña or prolonged neutral conditions.

JPL manages the U.S. portion of the OSTM/Jason-2 mission for NASA’s Science Mission Directorate, Washington, D.C.

For more information on NASA’s ocean surface topography missions, visit: http://sealevel.jpl.nasa.gov/missions/.

To view the latest Jason-1 and OSTM/Jason-2 data, visit: http://sealevel.jpl.nasa.gov/science/elninopdo/latestdata/.

Alan Buis 818-354-0474
Jet Propulsion Laboratory, Pasadena, Calif.
Alan.buis@jpl.nasa.gov

2011-199

Some Catholics seek to counter Galileo (Chicago Tribune)

Splinter group says the Earth, not the sun, is, indeed, at the center of the universe

By Manya A. Brachear, Tribune reporter
July 4, 2011

Some people believe the world literally revolves around them. It’s a belief born not of selfishness but of faith.

A small group of conservative Roman Catholics is pointing to a dozen biblical verses and the Church’s original teaching as proof that the Earth is the center of the universe, the view that prompted Galileo Galilei’s clash with the Church four centuries ago.

The relatively obscure movement has gained a following among a few Chicago-area Catholics who find comfort in knowing there are still staunch defenders of original Church doctrine.

“This subject is, as far as I can see, an embarrassment to the modern church because the world more or less looks upon geocentrism or someone who believes it in the same boat as the flat Earth,” said James Phillips, of Cicero.

Phillips attends Our Lady Immaculate Catholic Church in Oak Park, a parish run by the Society of St. Pius X, a group that rejects most of the modernizing reforms the Vatican II council made from 1962 to 1965.

But by challenging modern science, the proponents of a geocentric universe are challenging the very church they seek to serve and protect.

“I have no idea who these people are. Are they sincere, or is this a clever bit of theater?” said Brother Guy Consolmagno, the curator of meteorites and spokesman for the Vatican Observatory.

Indeed, those promoting geocentrism argue that heliocentrism, or the centuries-old consensus among scientists that the Earth revolves around the sun, is nothing more than a conspiracy theory to squelch the church’s influence.

“Heliocentrism becomes ‘dangerous’ if it is being propped up as the true system when, in fact, it is a false system,” said Robert Sungenis, leader of a budding movement to get scientists to reconsider. “False information leads to false ideas, and false ideas lead to illicit and immoral actions — thus the state of the world today. … Prior to Galileo, the church was in full command of the world; and governments and academia were subservient to her.”

Sungenis is no lone Don Quixote, as illustrated by the hundreds of curiosity seekers, skeptics and supporters at a conference last fall titled “Galileo Was Wrong. The Church Was Right” just off the University of Notre Dame campus in South Bend, Ind.

Astrophysicists at Notre Dame didn’t appreciate the group hitching its wagon to the prestige of America’s flagship Catholic university and resurrecting a concept that’s extinct for a reason.

“It’s an idea whose time has come and gone,” astrophysics professor Peter Garnavich said. “There are some people who want to move the world back to the 1950s when it seemed like a better time. These are people who want to move the world back to the 1250s. I don’t really understand it at all.”

Garnavich said the theory of geocentrism violates what he believes should be a strict separation of church and science. One answers why, the other answers how, and never the twain should meet, he said.

But supporters of the theory contend that there is scientific evidence to support geocentrism, just as there is evidence to support the six-day story of creation in Genesis.

There is proof in Scripture that the Earth is the center of the universe, Sungenis said. Among many verses, he cites Joshua 10:12-14 as definitive proof: “And the sun stood still, and the moon stayed, while the nation took vengeance on its foe. … The sun halted in the middle of the sky; not for a whole day did it resume its swift course.”

But Ken Ham, founder of the Creation Museum in Petersburg, Ky., said the Bible is silent on geocentrism.

“There’s a big difference between looking at the origin of the planets, the solar system and the universe and looking at presently how they move and how they are interrelated,” Ham said. “The Bible is neither geocentric nor heliocentric. It does not give any specific information about the structure of the solar system.”

Just as Ham challenges the foundation of natural history museums, Sungenis challenges planetariums, most notably the Vatican Observatory.

Consolmagno said the very premise of going after Galileo illustrates the theory’s lack of scientific credibility.

“Of course, we understand the universe in a far more nuanced way than Galileo did 400 years ago,” he said. “And I would hope that the next 400 years would see just as much development.”

But Sungenis said the renewed interest in geocentrism is due, in part, to the efforts of Christians entering the scientific domain previously dominated by secularists. These Christian scientists, he said, showed modern science is without scientific foundation or even good evidence.

The issue has even sparked a debate between Art and Pat Jones, of Lyons. Pat Jones, a conservative Catholic who often attends Mass at Phillips’ parish, said heliocentrism is part of a conspiracy.

“Because of our fallen nature in Christian terms, we take the line of least resistance — go with the flow,” said Pat Jones. “But the means of grace have to be intact.”

Her husband, Art, a self-described skeptical Protestant, says he is still a “doubting Thomas” but wouldn’t put it past the orthodox science community to cook up a conspiracy. He accompanied his wife to the South Bend conference to learn more and “keep peace in the family.”

Meanwhile, the theory has brought others like Phillips closer to God.

“I dropped my practice of faith,” Phillips said. “When I came back, it was a big wake-up call for me. … The world has its own dogmas.”

mbrachear@tribune.com

The future of the present in the past tense (Fapesp)

HUMANITIES | LITERATURE
Brazilian science fiction and the country’s relationship with science and technology

Carlos Haag
Print Edition 184 – June 2011

“The Presidency of the Republic of the United States of Brazil was entrusted to a woman. The country was stronger, more beautiful and richer. Peoples from every corner of the Earth converged here. The Amazon is urbanized, illiteracy has been abolished and, in the fields, workers sing passages from the last opera they attended or recite the loveliest poems by heart.” A warning: this is not some unhinged piece of official publicity from the current government. The author, Adalzira Bittencourt (1904-1976), set down this “prediction” in 1929 in Sua Excia. a presidente da República (Her Excellency, the President of the Republic). But this science fiction “paradise” has a catch: all of it was achieved thanks to the rise of women in politics, who implement a rigid program of eugenics and social hygiene. By an irony of fate the president, Dr. Mariangela de Albuquerque, falls in love with the painter Jorge, whom she knows only through love letters. Tired of waiting for her lover, the head of state orders him brought before her in handcuffs. “His face was beautiful, but he stood no more than 90 centimeters tall and carried an enormous hump on his back.” Implacable, the eugenicist president orders prophylactic euthanasia for her beloved. “She was a woman,” the novella concludes, in a tone of victory.

The novella’s “ideological” tone has run through Brazilian science fiction ever since, a body of work that is unfortunately little studied and generally seen as a “second-rate product,” unworthy of the literary canon. “Since the 19th century the genre has proved to be an ideal vehicle for registering tensions in the definition of national identity and in the process of modernization. Those tensions are exacerbated in Latin America, which is why the fiction produced in countries like Brazil, Argentina and Mexico, the genre’s great representatives on the continent, is much more politicized than that written in the countries of the North. In Brazil the genre helped reflect a more concrete political agenda, and writers, then and now, are more intimately involved with the future direction of their country; they used the nascent genre not only to circulate their ideas in the public arena but also to show their compatriots their opinions about present reality and their visions of a better, more modern future,” explains historian Rachel Haywood Ferreira of Iowa State University, author of The emergence of Latin American science fiction, just published in the U.S. by Wesleyan University Press. “Brazilian science fiction lets us trace the identity crisis that accompanied modernization, together with the sense of loss that haunts it, which is part of Brazil’s entry into the postmodern condition. The national fiction in part exemplifies the erosion of the Latin American narrative of national identity, because it becomes ever more influenced by the cultural exchange inherent in the globalization that began in the 1990s,” agrees literature professor Mary Ginway of the University of Florida, author of Ficção científica brasileira: mitos culturais e nacionalidade no país do futuro (Devir Livraria). Even so, the genre is still regarded as “minor.” “That is a pity, because displacing the tradition of the genre into the context of a developing country allows us to reveal certain assumptions about how that development takes place and to determine the genre’s function in that kind of society. Science fiction provides a barometer for measuring attitudes toward technology, while at the same time reflecting the social implications of the modernization of Brazilian society,” Ginway says. “There is even a gradual shift from a climate of optimism to one of pessimism: science no longer seems to be the guarantee of truth, as was once thought, and the impact of technology may not always be positive, which makes it harder to realize the nation’s potential. All of this can be seen in Latin American science fiction: the definition of national identity; the tensions between science and religion and between countryside and city; pseudoscience,” Ferreira notes.

For Ferreira, speculative literature is important in countries like Brazil, where “science and technology play a key role in intellectual life, since technology is seen as the possible solution for the country to overcome the historical lag in its economic development, with the hope of creating a better, more utopian society.” Unfortunately, it was precisely this tie to the national question that brought Brazilian science fiction both its glory and its disdain, even though Brazil kept pace fairly quickly with the genre’s expansion in Europe. The first Brazilian science fiction dates from 1868 (it ran in the newspaper O Jequitinhonha until 1872): Páginas da história do Brasil, escrita no ano 2000 (Pages from the history of Brazil, written in the year 2000), by Joaquim Felício dos Santos, a satirical work about the monarchy that takes Dom Pedro II on a journey through time to the future, where he discovers how pernicious his regime was for the country. “Works like these, which reach into the 20th century up to the 1920s, show that Brazilians were interested in developing utopian narratives, moralizing fantasies and even the scientific romance, a body of speculative fiction that could have sustained a larger output in the following decades. Unfortunately, as would happen again in the 1970s, these national exercises did not withstand foreign pressure, the pressure of critics, who never created a niche for the genre in Brazil, and the relative indifference of the reading public,” says Roberto de Sousa Causo, author of Ficção científica, fantasia e horror no Brasil: 1875-1950 (Editora da UFMG). “The rigid separation between sanctioned and unsanctioned literature resulted in the near-total absence of a pulp era in the Brazilian context. Speculative fiction lost that space of unruly inventiveness, of opening up new possibilities, of building a more enterprising tradition,” he says.

As Antonio Candido observes in his Formação da literatura brasileira, there is an entrenched position in Brazil of regarding literature as a practice constitutive of nationality, a pragmatism that to this day implies a belittling of imagination, given the interest in using letters politically as a way of representing social and human experience. Within that movement, Causo argues, the use of literature as an instrument of nation-building favored realist and naturalist documentation oriented by progress and determinism. “The Brazilian version suffers doubly, because of its associations with ‘low art,’ the fruit of a national authoritarian tradition that abhors mass culture and popular art, and because it is an imaginative genre in a country that places a high value on literary realism,” agrees the Brazilianist Mary Ginway. In a short story by Jorge Calife, one of the best-known contemporary authors of science fiction, Brasil, país do futuro (Brazil, country of the future), a young man in 1969, during the dictatorship, is given a homework assignment to write an essay about the Brazil of the year 2000. He does in fact manage to travel through time and see the Rio of the future, a painful disappointment: he discovers that nothing has changed and that Brazilians’ lives remain miserable. Back in his room, he writes a text describing an imaginary city under a dome, afraid the teacher will fail him if he tells the truth. “This story is a reminder that, despite global modernization, Brazil may face a long wait before it receives the benefits of technology,” the American researcher says.

Science fiction’s beginnings lie in the so-called scientific romance, developed between 1875 and 1939, which took as its European models the books of Jules Verne and H. G. Wells. “Although Latin American scientific contributions in this period were small compared with the rest of the world, the scientists of these countries were in tune with what was being done in Europe, and the adoption of eugenics is a sign of the widespread approval of science as proof of cultural modernity. The texts created in this spirit turn out not to be imitations of imperialist literary models depicting imaginary societies based on unworkable technologies, but works that described the present with the authority of scientific discourse and aspired to the brilliant future that was sure to come. They are utopian texts set in remote places or distant times, describing nonexistent societies in detail,” Ferreira says. The eugenics of these works, however, comes wrapped in a softer version, an alternative branch of Lamarck’s notions of heredity in which there was room to reform human deformities, something that thrilled Brazilians, since it offered viable scientific solutions to the nation’s “problems.” “It was a neo-Lamarckism tinged with optimistic colors, in which reforms of the social environment could yield permanent improvements and progress, even in the tropics, was possible. Later, social Darwinism would be added to the brew that produced the fiction,” the researcher says. A good example is the genre’s pioneering novel, Dr. Benignus (1875), by Augusto Zaluar, a scientific expedition into the Brazilian interior, complete with beings from the Sun, much conversation and little adventure. For Benignus, science would serve to give worth to the important citizen or to redeem the “barbarous,” abandoned nation.

Another characteristic theme appears in O presidente negro ou O choque das raças (The black president, or The clash of the races, 1926), by Monteiro Lobato, which shows how the splitting of the white electorate in 2228 allows a black president to be elected in the U.S., prompting whites to unite again to put blacks “under control.” For Lobato, miscegenation was precisely the factor responsible for Brazil’s economic and cultural backwardness. The solution was to seduce the black population with a hair straightener, the “Omega rays,” which sterilized the user. Less aggressively, the eugenicist tone shows through in the works of journalist Berilo Neves, author of the collection A costela de Adão (Adam’s rib, 1930) and of O século XXI (The 21st century, 1934), satirical stories set in the future whose favorite targets were feminism and feminine frivolities. His misogynistic narratives generally involve the creation of human-reproduction machines that render women obsolete, or a future world in which the genders have been swapped. In A liga dos planetas (The league of the planets, 1923), by Albino José Coutinho, the first Brazilian novel to depict a space voyage, the narrator builds his “aeroplane” and plants the Brazilian flag on the Moon. But it does not escape the thinking of its day: the space mission was justified by a presidential request that the hero find, on other worlds, people of quality, because none were to be found here.

But there were honorable exceptions to social Darwinism, such as A Amazônia misteriosa (1925), by Gastão Cruls, inspired by Wells’s The Island of Doctor Moreau, with a national solution: the protagonist, lost in the Amazon, comes upon a German scientist, Professor Hartmann, who performs experiments on male children cast off by the Amazons. As if that were not enough, the doctor, after taking a hallucinogenic drug, encounters Atahualpa, who describes to him the abuses committed by the Europeans. The protagonist sees that these have been carried on by the German scientist and rejects both colonialist exploitation and the abuse of science. In A república 3.000 ou A filha do inca (1930), the modernist Menotti Del Picchia describes an expedition that comes upon a highly advanced civilization in the middle of Central Brazil, isolated under an invisible dome. The protagonists reject positivist postulates, flee with the Inca princess, and everything ends with an elegy to the simple life. Jerônymo Monteiro, the future author of the character Dick Peter, uses his novel Três meses no século 81 (1947) to show his protagonist, Campos, confronting Wells himself about time travel, by means of a “transmigration of the soul” brought about by mediums. “Monteiro’s hero does not merely travel in time; he leads a rebellion of humanists against the massifying elite of the future Earth, allying himself with the Martians with whom our planet is at war,” says Causo. “On one hand, our science fiction becomes imbued with the tragic reality of underdevelopment and illuminates the reader’s understanding of the particular situation in which he lives, which set us apart from First World science fiction. At the same time, recognizing this led us to reject imported concepts such as social Darwinism. There was no longer any reason for that discourse to coexist with a context of neocolonialism, as seen in Brazilian science fiction of the late nineteenth and early twentieth centuries, except as part of an elitist posture within the country,” the researcher argues.

Meanwhile, in the United States, a technophile science fiction was flourishing in the popular pulp magazines, little concerned with style or characterization and more interested in engaging the reader through action, adventure and extravagant ideas: pulp fiction. Despite the pulp efforts of Berilo Neves and, in particular, of Jerônymo Monteiro (considered the “father of Brazilian science fiction”), this popular form never took hold in the country. “Brazil lost out by not having access to this material, or by not creating its own version of an era of popular magazines, in which inventiveness was present and the public responded, creating a strong bond between producers and consumers of science fiction,” Causo recalls. Alongside that Anglo-American golden age, Brazilian fiction, partly in response to the postwar period, came to display a basic distrust of science and technology in human hands, wary of the power of reason in the face of the excesses of emotion. “Because of the sharp class divisions of Brazilian society, with income heavily concentrated in the hands of the elite, technology is seen as a divisive element rather than a unifying one. For Brazilians, technology is above all a political and economic problem, not a way of solving one,” analyzes Mary Ginway. Even so, the 1960s saw an explosion of the genre thanks to the efforts of the Bahian editor Gumercindo Rocha Dorea, creator of Edições GRD, which christened and sheltered a new generation of writers, including mainstream authors invited to write science fiction, such as Dinah Silveira de Queiroz, Rachel de Queiroz and Fausto Cunha, among others.

Between the United States and Brazil, fictional mismatches begin to appear. “While American science fiction embraces technology and change but fears rebellions or invasions by robots and aliens, Brazilian fiction tends to reject technology but embraces robots, and treats aliens as indifferent or exotic yet hardly threatening, when they are not bearers of a message of peace to the world,” says Mary. Nor did American visions of megalopolises full of futuristic machinery appeal to Brazilians. “Brazilian society, because of its rural and patriarchal past, values personalism in relationships, placing great worth on human contact. This rejection can thus be read as the refusal of a new order based on uniformity and blind obedience to an organizational culture,” the researcher continues. Brazilian science fiction begins to put its own stamp on these raptures over the future. “Technology can only be a solution, in these works, when it is scaled down and humanized. Aliens, likened to foreigners, are described as indifferent to humans and their fates, taking resources and abandoning people to their own devices. The Amazon, for example, becomes a target for these invaders, who land there. Robots, by contrast, are viewed with great sympathy, perhaps because of the slaveholding past, in which servants and masters lived in close proximity. Thus the icons of the genre are transformed by traditional Brazilian social relations and their possibilities as agents of social change, while utopian possibilities are generally denied.” Brazilian authors appropriate a First World genre that deals with science and technology and, by transforming its paradigms, make it anti-technological and national, according to the researcher, an understandable gesture of resistance in the face of the fear that modernization would destroy Brazil’s culture and humanist traditions, as would become clear with the 1964 coup.

The dictatorship period marks the beginning of dystopian science fiction, that is, of taking familiar elements and making them strange in order to discuss ideas and level denunciations. “By using an imaginary futuristic world, dystopias concentrate on political themes and satirize present tendencies in society. Hence the national dystopias are all allegorical representations of a Brazil under military rule, with allusions to censorship, torture, control and so on. The plots are always about rebellions against a perverse and arbitrary technocracy,” Mary notes. It is a Brazilian take on the new wave tendency in international science fiction, under the auspices of Ray Bradbury, in which technology appears as the villain, robbing Brazilians of their identity (a recurring question since the nineteenth century), especially when in the hands of an authoritarian government. “On the opposite side stands the myth of identity, seen as natural and immutable, taking the form of nature, woman, sexuality, the land,” Causo notes. With the end of the dictatorship, science fiction returned to form in more sophisticated guises such as cyberpunk, hard SF and alternate histories, many of them written by women.

In 1988, Ivan Carlos Regina launched the anthropophagic manifesto of Brazilian science fiction, which, like Oswald de Andrade’s manifesto, proposes a “cannibalization” of the genre by Brazilian writers. “We need to swallow, after Bishop Sardinha, the laser ray gun, the mad scientist, the kindly alien, the invincible hero, the space warp, the damsel with perfect legs and a walnut-sized brain, and the flying saucer, all of them as distant from Brazilian reality as the most far-off of the stars.” “By combining high and low forms of literature, by joining myth, the media and modern technology, and by addressing questions of race and gender, post-dictatorship Brazilian fiction deconstructs the notion of Brazil as an exotic tropical nation full of happy people, offering a postmodern mosaic of a Brazil in conflict, wrestling with its own history and with growing globalization,” Mary notes. At this point there are even those who advocate the genre as fertile ground for mainstream writers. “The heroes of Brazilian literary fiction are tired. Their routine has not changed in at least 20 years,” warns the writer Nelson de Oliveira, author of Os transgressores, in his “Convite ao mainstream”. “Our good fortune is that Brazilian literature has other currents besides the main one. The most vigorous, brutal and vulgar of them is science fiction. It is like the barbarians who brought Rome down. Barbarians are the solution for a decadent civilization. The themes of science fiction are the seed of these warriors who, by fertilizing the tired, decadent prose of the mainstream, will help engender more consistent and less artificial stories and novels.”

IPCC studies geoengineering to minimize warming (Carbono Brasil)

JC e-mail 4286, June 24, 2011

Perhaps motivated by the slow pace of the climate negotiations, the body suggests that scientists evaluate options for reflecting the sun’s rays back into space and even for depositing iron in the oceans to stimulate the growth of CO2-absorbing algae.

The British newspaper The Guardian gained access to documents from the UN Intergovernmental Panel on Climate Change (IPCC) intended for the scientists who make up the body’s working group on geoengineering, and revealed that using this option to deal with climate change is being considered seriously.

The group of scientists meets next week in Lima, Peru, with the main objective of advising governments on which geoengineering technologies would be the most efficient and safe.

Among the proposals the IPCC wants evaluated are: dispersing sulfur aerosols in the stratosphere to reflect part of the sun’s rays back into space; depositing large quantities of iron in the oceans to promote the growth of CO2-absorbing algae; bioengineering agricultural crops so that their color reflects sunlight; and suppressing the formation of cirrus clouds, which act to intensify the greenhouse effect.

According to The Guardian, other measures that may be studied include spraying seawater particles into clouds so that they reflect sunlight, painting roads and roofs white around the world, and various ways of capturing and storing greenhouse gases.

Although the ideas sound like science fiction, some of them have already been taken off the drawing board. In early 2009, a German research vessel loaded with 20 tonnes of iron sulfate set out for the Antarctic with the aim of releasing the material into the ocean. The operation was suspended at the last moment by the German government, which bowed to appeals from the international community.

Geoengineering projects have always been highly controversial, so much so that in 2010 the Convention on Biological Diversity (CBD) approved a moratorium on initiatives of this kind. The moratorium does, however, allow small-scale studies to continue under controlled circumstances.

Even the American Meteorological Society (AMS), which supports the use of geoengineering, warns that many more studies are needed before any large-scale alteration is made to Earth’s systems.

“The potential to help society, as well as the risks of unexpected consequences, demand more research, regulation and transparency in these initiatives,” the institution stresses.

Opposed even to the continuation of studies on the subject, 125 environmental and human rights groups from 40 countries, including Friends of the Earth International and Via Campesina, delivered a letter this week to the IPCC’s chairman, Rajendra Pachauri, warning that the body lacks the competence to evaluate the geoengineering option.

“Asking a group of scientists who work on geoengineering whether more research is needed on the subject is like asking a bear whether it wants honey,” the letter says. According to the environmentalists, this is not merely a scientific question; it is a political one.

Geoengineering regained momentum after it was reported that emissions in 2010 set a new historical record, despite all the promises made by the world’s governments. According to the International Energy Agency, 30.6 gigatonnes of carbon dioxide were emitted last year.

Moreover, the pace of international negotiations is very slow, making it practically impossible for a global climate agreement to be concluded in the coming months.

The executive secretary of the UN Framework Convention on Climate Change (UNFCCC), Christiana Figueres, has herself said that more radical technologies may have to be adopted to hold warming to no more than 2°C and avoid the worst consequences of climate change.

“We are putting ourselves in a situation where we will need to use more drastic methods to remove emissions from the atmosphere,” Figueres concluded.

How the ‘ecosystem’ myth has been used for sinister means (The Guardian)

When, in the 1920s, a botanist and a field marshal dreamed up rival theories of nature and society, no one could have guessed their ideas would influence the worldview of 70s hippies and 21st-century protest movements. But their faith in self-regulating systems has a sinister history

Adam Curtis
The Observer, Sunday 29 May 2011

A small greenhouse at Biosphere 2 in Arizona in 1988. The attempt to create an enclosed ecological system ended in failure. Photograph: Roger Ressmeyer/Corbis

At the end of March this year there was a wonderful moment of television interviewing on Newsnight. It was just after student protesters had invaded Fortnums and other shops in Oxford Street during the TUC march against the cuts. Emily Maitlis asked Lucy Annson from UK Uncut whether, as a spokesperson for the direct-action group, she condemned the violence.

Annson swiftly opened the door that leads to the nightmare interview, saying: “We are a network of people who self-organise. We don’t have a position on things. It’s about empowering the individual to go out there and be creative.”

“But is it wrong for individuals to attack buildings?” asked Maitlis.

“You’d have to ask that particular individual,” replied Annson.

“But you are a spokesperson for UK Uncut,” insisted Maitlis. And Annson came out with a wonderful line: “No. I’m a spokesperson for myself.”

What you were seeing in that interchange was the expression of a very powerful ideology of our time. It is the idea of the “self-organising network”. It says that human beings can organise themselves into systems where they are linked, but where there is no hierarchy, no leaders and no control. It is not the old form of collective action that the left once believed in, where people subsumed themselves into the greater force of the movement. Instead all the individuals in the self-organising network can do whatever they want as creative, autonomous, self-expressive entities, yet somehow, through feedback between all the individuals in the system, a kind of order emerges.

At its heart it says that you can organise human beings without the exercise of power by leaders.

As a political position it is obviously very irritating for TV interviewers, which may or may not be a good thing. And it doesn’t necessarily mean it isn’t a valid way for organising protests – and possibly even human society. But I thought I would tell the brief and rather peculiar history of the rise of the idea of the “self-organising network”.

Of course some of the ideas come out of anarchist thought. But the idea is also deeply rooted in a strange fantasy vision of nature that emerged in the 1920s and 30s as the British Empire began to decline. It was a vision of nature and – ultimately – the whole world as a giant system that could stabilise itself. And it rose up to grip the imagination of those in power – and is still central in our culture.

But we have long forgotten where it came from. To discover this you have to go back to a ferocious battle between two driven men in the 1920s. One was a botanist and Fabian socialist called Arthur Tansley. The other was one of the most powerful and ruthless rulers of the British Empire, Field Marshal Jan Smuts.

It all started with a dream. One night Tansley had an unsettling nightmare that involved him shooting his wife. So he did the natural thing and started reading the works of Sigmund Freud, and even went to be analysed by Freud himself. Then Tansley came up with an extraordinary theory. He took Freud’s idea that the human brain is like an electrical machine – a network around which energy flowed – and argued that the same thing was true in nature. That underneath the bewildering complexity of the natural world were interconnected systems around which energy also flowed. He coined a name for them. He called them ecosystems.

But Tansley went further. He said that the world was composed at every level of systems, and what’s more, all these systems had a natural desire to stabilise themselves. He grandly called it “the great universal law of equilibrium”. Everything, he wrote, from the human mind to nature to even human societies – all are tending towards a natural state of equilibrium.

Tansley admitted he had no real evidence for this. And what he was really doing was taking an engineering concept of systems and networks and projecting it on to the natural world, turning nature into a machine. But the idea, and the term “ecosystem”, stuck.

But then Field Marshal Smuts came up with an even grander idea of nature. And Tansley hated it.

Field Marshal Smuts was one of the most powerful men in the British empire. He ruled South Africa for the British empire and he exercised power ruthlessly. When the Hottentots refused to pay their dog licences Smuts sent in planes to bomb them. As a result the black people hated him. But Smuts also saw himself as a philosopher – and he had a habit of walking up to the tops of mountains, taking off all his clothes, and dreaming up new theories about how nature and the world worked.

This culminated in 1926 when Smuts created his own philosophy. He called it Holism. It said that the world was composed of lots of “wholes” – the small wholes all evolving and fitting together into larger wholes until they all came together into one big whole – a giant natural system that would find its own stability if all the wholes were in the right places. Einstein liked the theory, and it became one of the big ideas that lots of right-thinking intellectuals wrote about in the 1930s. Even the King became fascinated by it.

But Tansley attacked. He publicly accused Smuts of what he called “the abuse of vegetational concepts” – which at the time was considered very rude. He said that Smuts had created a mystical philosophy of nature and its self-organisation in order to oppress black people. Or what Tansley maliciously called the “less exalted wholes”.

And Tansley wasn’t alone. Others, including HG Wells, pointed out that really what Smuts was doing was using a scientific theory about order in nature to justify a particular order in society – in this case the British empire. Because it was clear that the global self-regulating system that Smuts described looked exactly like the empire. And at the same time Smuts made a notorious speech saying that blacks should be segregated from whites in South Africa. The implication was clear: that blacks should stay in their natural “whole” and not disturb the system. It clearly prefigured the arguments for apartheid.

And this was the central problem with the concept of the self-regulating system, one that was going to haunt it throughout the 20th century. It can be easily manipulated by those in power to enforce their view of the world, and then be used to justify holding that power stable.

Because, although Tansley and Smuts and their argument about power would be forgotten, hybrid combinations of their ideas were going to re-emerge later in the century – strange fusions of systems engineering and mystical visions of organic wholes.

Thirty years later, thousands of young Americans who were disenchanted with politics went off instead to set up their own experimental communities – the commune movement. And they turned to Arthur Tansley’s idea of the ecosystem as a model for how to create a human system of order within the communes.

But they also fused it with cybernetic ideas drawn from computer theory, and out of this came a vision of strong, independent humans linked, just like in nature, in a network that was held together through feedback. The commune dwellers mimicked the ecosystem idea in their house meetings where they all had to say exactly what was on their minds at that moment – so information flowed freely round the system. And through that the communes were supposed to stabilise themselves.

But they didn’t. In many communes across America in the late 1960s house meetings became vicious bullying sessions where the strong preyed mercilessly on the weak, and nobody was allowed to voice any objections. The rules of the self-organising system said that no coalitions or alliances were allowed because that was politics – and politics was bad. If you talk today to ex-commune members they tell horrific stories of coercion, violent intimidation and sexual oppression within these utopian communities, while the other commune members stood mutely watching, unable under the rules of the system to do anything to stop it.

Again, the central weakness of the self-organising system was dramatically demonstrated. Whether it was used for conservative or radical ends, it could not cope with power, which is one of the central dynamic forces in human society.

But at the very same time a new generation of ecologists began to question the very basis of Arthur Tansley’s idea of the self-regulating ecosystem. Out of this came a bloody battle within the science of ecology, with the new generation showing powerfully that wherever they looked in nature they found not stability, but constant, dynamic change; that Tansley’s idea of an underlying pattern of stability in nature was really a fantasy, not a scientific truth.

But in an age that was increasingly disillusioned with politics, the ghosts not just of Tansley but also of Smuts now began to re-emerge in epic form. In the late 70s an idea rose up that we – and everything else on the planet – are connected together in complex webs and networks. Out of it came epic visions of connectivity such as the Gaia theory and utopian ideas about the world wide web. And human beings believed that their duty was not to try to control the system, but to help it maintain its natural self-organising balance.

At the end of 1991 a giant experiment began in the Arizona desert. Its aim was to create from scratch a model for a whole self-organising world.

Biosphere 2 was a giant sealed world. Eight humans were locked in with a mass of flora and other fauna, and a balanced ecosystem was supposed to naturally emerge. But from the start it was completely unbalanced. The CO2 levels started soaring, so the experimenters desperately planted more green plants, but the CO2 continued to rise, then dissolved in the “ocean” and ate their precious coral reef. Millions of tiny mites attacked the vegetables and there was less and less food to eat. The men lost 18% of their body weight. Then millions of cockroaches took over. The moment the lights were turned out in the kitchen, hordes of roaches covered every surface. And it got worse – the oxygen in the world started to disappear and no one knew where it was going. The “bionauts” began to suffocate. And they began to hate one another – furious rows erupted that often ended with them spitting in one another’s faces. A psychiatrist was brought in to see if they had gone insane, but concluded simply that it was a struggle for power.

Then millions of ants appeared from nowhere and waged war on the cockroaches. In 1993 the experiment collapsed in chaos and hatred.

The idea of nature that underpinned all these visions of self-organisation was a fantasy. A fantasy that was born at a time when those who ran the British empire were desperately trying to cling on to power as the dynamic forces of history whirled around them. So they turned to science to create a vision of a static world where everything is stable and your moral duty is to make sure that nothing ever changes.

The other problem with the self-organising system is that it cannot deal with power. Although it sees human beings all linked together in a system, its fundamental rule is that they must remain separate individuals. Alliances and coalitions would compromise the precious autonomy of the individual, and destabilise the system.

And in a Newsnight studio on a March evening this year, this is what you could hear. Lucy Annson insisted again and again to Emily Maitlis that she was only a spokesperson for herself, and under the rules of the network no one could stand back and judge the system. Emily said: “You’re not a completely peaceful organisation.” Lucy came back with the killer line: “I don’t think anyone can make an assessment of that, other than the people involved in the actions themselves.”

What the anti-cuts movement has done without realising is adopt an idea of how to order the world without hierarchies, a machine theory that leads to a static managerialism. It may be very good for organising creative and self-expressive demonstrations, but it will never change the world.

At the end of Biosphere 2 the ants destroyed the cockroaches. They then proceeded to eat through the silicone seal that enclosed the world. Through collective action the ants worked together and effectively destroyed the existing system. They then marched off into the Arizona desert. Who knows what they got up to there.

Academic Adaptation and “The New Communications Climate” (Open the Echo Chamber blog)

Posted by Edward R. Carr (31 May 2011)

[Original post here].

Andrew Revkin has a post up on Dot Earth that suggests some ways of rethinking scientific engagement with the press and the public.  The post is something of a distillation of a more detailed piece in the WMO Bulletin.  Revkin was kind enough to solicit my comments on the piece, as I have appeared in Dot Earth before in an effort to deal with this issue as it applies to the IPCC, and this post is something of a distillation of my initial rapid response.

First, I liked the message of these two pieces a lot, especially the push for a more holistic engagement with the public through different forms of media, including the press.  As Revkin rightly states, we need to “recognize that the old model of drafting a press release and waiting for the phone to ring is not the path to efficacy and impact.” Someone please tell my university communications office.

A lot of the problem stems from our lack of engagement with professionals in the messaging and marketing world.  As I said to the very gracious Rajendra Pachauri in an email exchange back when we had the whole “don’t talk to the media” controversy:

I am in no way denigrating your [PR] efforts. I am merely suggesting that there are people out there who spend their lives thinking about how to get messages out there, and control that message once it is out there. Just as we employ experts in our research and in these assessment reports precisely because they bring skills and training to the table that we lack, so too we must consider bringing in those with expertise in marketing and outreach.

I assume that a decent PR team would be thinking about multiple platforms of engagement, much as Revkin is suggesting.  However, despite the release of a new IPCC communications strategy, I’m not convinced that the IPCC (or much of the global change community more broadly) yet understands how desperately we need to engage with professionals on this front.  In some ways, there are probably good reasons for the lack of engagement with pros, or with the “new media.” For example, I’m not sure Twitter will help with managing climate change rumors/misinformation as it is released, if only because we are now too far behind the curve – things are so politicized that it is too late for “rapid response” to misinformation. I wish we’d been on this twenty years ago, though . . .

But this “behind the curve” mentality does not explain our lack of engagement. Instead, I think there are a few other things lurking here. For example, there is the issue of institutional politics. I love the idea of using new media/information and communication technologies for development (ICT4D) to gather and communicate information, but perhaps not in the ways Revkin suggests. I have a section later in Delivering Development that outlines how, using existing mobile tech in the developing world, we could both get better information about what is happening to the global poor (the point of my book is that, as I think I demonstrate in great detail, we actually have a very weak handle on what is going on in most parts of the developing world) and could empower the poor to take charge of efforts to address the various challenges, environmental, economic, political and social, that they face every day. It seems to me, though, that the latter outcome is a terrifying prospect for some in development organizations, as this would create a much more even playing field of information that might force these organizations to negotiate with and take seriously the demands of the people with whom they are working. Thus, I think we get a sort of ambiguity about ICT4D in development practice, where we seem thrilled by its potential, yet continue to ignore it in our actual programming. This is not a technical problem – after all, we have the tech, and if we want to do this, we can – it is a problem of institutional politics. I did not wade into a detailed description of the network I envision in the book because I meant to present it as a political challenge to a continued reticence on the part of many development organizations and practitioners to really engage the global poor (as opposed to tell them what they need and dump it on them). But my colleagues and I have a detailed proposal for just such a network . . . and I think we will make it real one day.

Another, perhaps more significant barrier to major institutional shifts with regard to outreach is a chicken-and-egg situation of limited budgets and a dominant academic culture that does not understand media/public engagement or politics very well and sees no incentive for engagement. Revkin nicely hits on the funding problem as he moves past simply beating up on old-school models of public engagement:

As the IPCC prepares its Fifth Assessment Report, it does so with what, to my eye, appears to be an utterly inadequate budget for communicating its findings and responding in an agile way to nonstop public scrutiny facilitated by the Internet.

However, as much as I agree with this point (and I really, really agree), the problem here is not funding unto itself – it is the way in which a lack of funding erases an opportunity for cultural change that could have a positive feedback effect on the IPCC, global assessments, and academia more generally that radically alters all three. The bulk of climate science, as well as social impact studies, come from academia – which has a very particular culture of rewards.  Virtually nobody in academia is trained to understand that they can get rewarded for being a public intellectual, for making one’s work accessible to a wide community – and if I am really honest, there are many places that actively discourage this engagement.  But there is a culture change afoot in academia, at least among some of us, that could be leveraged right now – and this is where funding could trigger a positive feedback loop.

Funding matters because once you get a real outreach program going, productive public engagement would result in significant personal, intellectual and financial benefits for the participants that I believe could result in very rapid culture change.  My twitter account has done more for the readership of my blog, and for my awareness of the concerns and conversations of the non-academic development world, than anything I have ever done before – this has been a remarkable personal and intellectual benefit of public engagement for me.  As universities continue to retrench, faculty find themselves ever-more vulnerable to downsizing, temporary appointments, and a staggering increase in administrative workload (lots of tasks distributed among fewer and fewer full-time faculty).  I fully expect that without some sort of serious reversal soon, I will retire thirty-odd years hence as an interesting and very rare historical artifact – a professor with tenure.  Given these pressures, I have been arguing to my colleagues that we must engage with the public and with the media to build constituencies for what we do beyond our academic communities.  My book and my blog are efforts to do just this – to become known beyond the academy such that I, as a public intellectual, have leverage over my university, and not the other way around.  And I say this as someone who has been very successful in the traditional academic model.  I recognize that my life will need to be lived on two tracks now – public and academic – if I really want to help create some of the changes in the world that I see as necessary.

But this is a path I started down on my own, for my own idiosyncratic reasons – to trigger a wider change, we cannot assume that my academic colleagues will easily shed the value systems in which they were intellectually raised, and to which they have been held for many, many years.  Without funding to get outreach going, and demonstrate to this community that changing our model is not only worthwhile, but enormously valuable, I fear that such change will come far more slowly than the financial bulldozers knocking on the doors of universities and colleges across the country.  If the IPCC could get such an effort going, demonstrate how public outreach improved the reach of its results, enhanced the visibility and engagement of its participants, and created a path toward the progressive politics necessary to address the challenge of climate change, it would be a powerful example for other assessments.  Further, the participants in these assessments would return to their campuses with evidence for the efficacy and importance of such engagement . . . and many of these participants are senior members of their faculties, in a position to midwife major cultural changes in their institutions.

All this said, this culture change will not be birthed without significant pains.  Some faculty and members of these assessments will want nothing to do with the murky world of politics, and prefer to continue operating under the illusion that they just produce data and have no responsibility for how it is used.  And certainly the assessments will fear “politicization” . . . to which I respond “too late.”  The question is not if the findings of an assessment will be politicized, but whether or not those who best understand those findings will engage in these very consequential debates and argue for what they feel is the most rigorous interpretation of the data at hand.  Failure to do so strikes me as dereliction of duty.  On the other hand, just as faculty might come to see why public engagement is important for their careers and the work they do, universities will be gripped with contradictory impulses – a publicly-engaged faculty will serve as a great justification for faculty salaries, increased state appropriations, new facilities, etc.  Then again, nobody likes to empower the labor, as it were . . .

In short, in thinking about public engagement and the IPCC, Revkin is dredging up a major issue related to all global assessments, and indeed the practices of academia.  I think there is opportunity here – and I feel like we must seize this opportunity.  We can either guide a process of change to a productive end, or ride change driven by others wherever it might take us.  I prefer the former.

Scientists Cry Foul Over Report Criticizing National Science Foundation (msnbc.com)

By Stephanie Pappas, LiveScience Senior Writer

http://www.msnbc.msn.com/id/43187678

A report released by the office of Sen. Tom Coburn (R-Okla.) distorts the goals and purposes of National Science Foundation-funded (NSF) research in an effort to paint the agency as wasteful, scientists say.

Coburn released “The National Science Foundation: Under the Microscope” May 26, raising “serious questions regarding the agency’s management and priorities,” according to Coburn’s office. But scientists whose research is targeted in the report say Coburn has oversimplified or otherwise misrepresented their work. [Infographic: Science Spending in the Federal Budget]

“Good Lord!” Texas A&M psychologist Gerianne Alexander, whose work on hormones and infant development appears in the report, wrote in an email to LiveScience. “The summary of the funded research is very inaccurate.”

This isn’t the first time politicians have taken aim at the NSF in the name of deficit reduction. In December 2010, Rep. Adrian Smith (R-Neb.) called for citizens to review NSF grants and highlighted a few projects he viewed as wasteful, including research meant to evaluate productivity.

NSF’s entire budget of approximately $7 billion represents about one-half of 1 percent of the projected 2011 federal deficit.

Funding and review

The new report acknowledges that NSF has funded research leading to innovations ranging from the Internet to bar codes. NSF also runs a rigorous evaluation process when choosing to fund grants. Each year, the agency receives more than 45,000 competitive proposals, NSF spokesperson Maria Zacharias told LiveScience in December. NSF funds about 11,500 of those, Zacharias said.

However, according to a review by Coburn’s staff, the senator is unconvinced that NSF is making the right decisions.

“It is not the intent of this report to suggest that there is no utility associated with these research efforts,” the report reads. “The overarching question to ask, however, is simple. Are these projects the best possible use of our tax dollars, particularly in our current fiscal crisis?”

Science out of context

Scientists say Coburn’s office fails to put their research into context, often choosing silly-sounding projects to characterize entire research programs.

Alexander’s work, for example, is characterized as a $480,000 experiment meant to discover “if boys like trucks and girls like dolls.” According to the report, scientists could have saved their time by “talking to any new parent.”

In fact, Alexander said, the research project is more complicated.

“The grant supports research asking whether the postnatal surge in testosterone levels in early infancy contributes to the development of human behavior,” she said. “This is not a trivial issue.” [Read: The Truth About Genderless Babies]

That’s because some preliminary evidence suggests that disruptions in hormones like testosterone can alter behavior, Alexander said, potentially contributing to the development of disorders such as attention deficit hyperactivity disorder (ADHD) and autism.

Toy choice is a way to measure sex differences in behavior, because babies tend to choose stereotyped boy-girl toys early on, Alexander said. She and her team measure infant hormone levels and look for effects on behavior, activity levels, temperament and verbal development.

Likewise, a much-ballyhooed project that put shrimp on a treadmill was part of research intended to find out how marine animals will cope with increased environmental stress.

Robot laundry?

Coburn focused much of the report on social science research. But the report also questions several robotics projects, including a robot that can fold laundry. The report mocks the research, noting that it takes the robot 25 minutes to fold a single towel.

In fact, the $1.5 million NSF grant went not to teach robots how to do slow-motion laundry, but to learn how to make robots that can interact with complex objects, said lead researcher Pieter Abbeel of UC Berkeley. The towel-folding, which came six months into a four-year project, was an ideal challenge, Abbeel said, because folding a soft, deformable towel is very different from the pick-up-this-bolt, screw-in-this-screw tasks that current robots can perform.

“Towel-folding is just a first, small step toward a new generation of robotic devices that could, for example, significantly increase the independence of elderly and sick people, protect our soldiers in combat, improve the delivery of government services and a host of other applications that would revolutionize our day-to-day lives,” Abbeel wrote in an email to LiveScience.

Overseeing basic science

“It’s legitimate to ask what kind of scientific research is important and what isn’t,” said John Hibbing, a professor of political science whose research on the genetics of political leanings appeared in Coburn’s report. However, Hibbing expressed doubt that Coburn’s nonscientific review process could meet that goal.

“I sympathize with the desire to identify things that are silly and not useful,” Hibbing told LiveScience. “But I’m not sure he’s identified a really practical strategy to distinguish between the two.”

Genes, germs and the origins of politics (New Scientist)

NS 2813: Genes, germs and the origins of politics

* 18 May 2011 by Jim Giles

A controversial new theory claims fear of infection makes the difference between democracy and dictatorship

COMPARE these histories. In Britain, democracy evolved steadily over hundreds of years. During the same period, people living in what is now Somalia had many rulers, but almost all deprived them of the chance to vote. It’s easy to find other stark contrasts. Citizens of the United States can trace their right to vote back to the end of the 18th century. In Syria, many citizens cannot trace their democratic rights anywhere – they are still waiting for the chance to take part in a meaningful election.

Conventional explanations for the existence of such contrasting political regimes involve factors such as history, geography, and the economic circumstances and culture of the people concerned, to name just a few. But the evolutionary biologist Randy Thornhill has a different idea. He says that the nature of the political system that holds sway in a particular country – whether it is a repressive dictatorship or a liberal democracy – may be determined in large part by a single factor: the prevalence of infectious disease.

It’s an idea that many people will find outrageously simplistic. How can something as complex as political culture be explained by just one environmental factor? Yet nobody has managed to debunk it, and its proponents are coming up with a steady flow of evidence in its favour. “It’s rather astonishing, and it could be true,” says Carlos Navarrete, a psychologist at Michigan State University in East Lansing.

Thornhill is no stranger to controversy, having previously co-authored A Natural History of Rape, a book proposing an evolutionary basis for rape. His iconoclastic theory linking disease to politics was inspired in part by observations of how an animal’s development and behaviour can respond rapidly to physical dangers in a region, often in unexpected ways. Creatures at high risk of being eaten by predators, for example, often reach sexual maturity at a younger age than genetically similar creatures in a safer environment, and are more likely to breed earlier in their lives. Thornhill wondered whether threats to human lives might have similarly influential consequences to our psychology.

The result is a hypothesis known as the parasite-stress model, which Thornhill developed at the University of New Mexico, Albuquerque, with his colleague Corey Fincher.

Xenophobic instincts

The starting point for Thornhill and Fincher’s thinking is a basic human survival instinct: the desire to avoid illness. In a region where disease is rife, they argue, fear of contagion may cause people to avoid outsiders, who may be carrying a strain of infection to which they have no immunity. Such a mindset would tend to make a community as a whole xenophobic, and might also discourage interaction between the various groups within a society – the social classes, for instance – to prevent unnecessary contact that might spread disease. What is more, Thornhill and Fincher argue, it could encourage people to conform to social norms and to respect authority, since adventurous behaviour may flout rules of conduct set in place to prevent contamination.

Taken together, these attitudes would discourage the rich and influential from sharing their wealth and power with those around them, and inhibit the rest of the population from going against the status quo and questioning the authority of those above them. This is clearly not a situation conducive to democracy. When the threat of disease eases, however, these influences no longer hold sway, allowing forces that favour a more democratic social order to come to the fore.

That’s the idea, anyway. But where is the evidence?

The team had some initial support from earlier studies that had explored how a fear of disease affects individual attitudes. In 2006, for example, Navarrete found that when people are prompted to think about disgusting objects, such as spoilt food, they become more likely to express nationalistic values and show a greater distrust of foreigners (Evolution and Human Behavior, vol 27, p 270). More recently, a team from Arizona State University in Tempe found that reading about contagious illnesses made people less adventurous and open to new experiences, suggesting that they have become more inward looking and conformist (Psychological Science, vol 21, p 440).

Temporarily shifting individual opinions is one thing, but Thornhill and Fincher needed to show that these same biases could change the social outlook of a whole society. Their starting point for doing so was a description of cultural attitudes called the “collectivist-individualist” scale. At one end of this scale lies the collectivist outlook, in which people place the overall good of society ahead of the freedom of action of the individuals within it. Collectivist societies are often, though not exclusively, characterised by a greater respect for authority – if it’s seen as being necessary for the greater good. They also tend to be xenophobic and conformist. At the other end there is the individualist viewpoint, which has more emphasis on openness and freedom for the individual.

Pathogen peril

In 2008, the duo teamed up with Damian Murray and Mark Schaller of the University of British Columbia in Vancouver, Canada, to test the idea that societies with more pathogens would be more collectivist. They rated people in 98 different nations and regions, from Estonia to Ecuador, on the collectivist-individualist scale, using data from questionnaires and studies of linguistic cues that can betray a social outlook. Sure enough, they saw a correlation: the greater the threat of disease in a region, the more collectivist people’s attitudes were (Proceedings of the Royal Society B, vol 275, p 1279). The correlation remained even when they controlled for potential confounding factors, such as wealth and urbanisation.

A study soon followed showing similar patterns when comparing US states. In another paper, Murray and Schaller examined a different set of data and showed that cultural differences in one collectivist trait – conformity – correlate strongly with disease prevalence (Personality and Social Psychology Bulletin, vol 37, p 318).

Thornhill and Fincher’s next challenge was to find evidence linking disease prevalence not just with cultural attitudes but with the political systems they expected would go with them. To do so, they used a 66-point scale of pathogen prevalence, based on data assembled by the Global Infectious Diseases and Epidemiology Online Network. They then compared their data set with indicators that assess the politics of a country. Democracy is a tough concept to quantify, so the team looked at a few different measures, including the Freedom House Survey, which is based on the subjective judgements of a team of political scientists working for an independent American think tank, and the Index of Democratization, which is based on estimates of voter participation (measured by how much of a population cast their votes and the number of referendums offered to a population) and the amount of competition between political parties.

The team’s results, published in 2009, showed that each measure varied strongly with pathogen prevalence, just as their model predicted (Biological Reviews, vol 84, p 113). For example, when considering disease prevalence, Somalia is 22nd on the list of nations, while the UK comes in 177th. The two countries come out at opposite ends of the democratic scale (see “An infectious idea”).
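
The kind of check described in the last two paragraphs, a correlation between pathogen prevalence and a democracy or collectivism score that survives the removal of confounders such as wealth, amounts to a partial correlation. The snippet below is only a sketch of that logic with invented numbers and variable names; it is not the authors' data or code.

```python
# Minimal sketch (synthetic data, not the study's): partial correlation of
# pathogen prevalence vs. a democracy score, controlling for national wealth.
import numpy as np

rng = np.random.default_rng(0)
n = 98                                            # number of nations/regions, as in the 2008 study

wealth = rng.normal(size=n)                       # hypothetical confounder (e.g., log GDP per capita)
pathogens = -0.6 * wealth + rng.normal(size=n)    # invented pathogen-prevalence index
democracy = 0.5 * wealth - 0.4 * pathogens + rng.normal(size=n)

def residuals(y, x):
    """Residuals of y after regressing out x (ordinary least squares)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Zero-order correlation vs. partial correlation with wealth removed.
r_raw = np.corrcoef(pathogens, democracy)[0, 1]
r_partial = np.corrcoef(residuals(pathogens, wealth),
                        residuals(democracy, wealth))[0, 1]
print(f"raw r = {r_raw:.2f}, partial r (wealth removed) = {r_partial:.2f}")
```

Correlating the residuals after regressing the confounder out of both variables is the standard way such partial correlations are computed; published analyses of this kind typically control for several covariates at once, such as urbanisation, in the same manner.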

Importantly, the relationship still holds when you look at historical records of pathogen prevalence. This, together with those early psychological studies of immediate reactions to disease, suggests it is a nation’s health driving its political landscape, and not the other way around, according to the team.

Last year, they published a second paper that used more detailed data of the diseases prevalent in each region. They again found that measures of collectivism and democracy correlate with the presence of diseases that are passed from human to human – though not with the prevalence of diseases transmitted directly from animals to humans, like rabies (Evolutionary Psychology, vol 8, p 151). Since collectivist behaviours would be less important for preventing such infections, this finding fits with Thornhill and Fincher’s hypothesis.

“Thornhill’s work strikes me as interesting and promising,” says Ronald Inglehart, a political scientist at the University of Michigan in Ann Arbor who was unaware of it before we contacted him. He notes that it is consistent with his own finding that a society needs to have a degree of economic security before democracy can develop. Perhaps this goes hand in hand with a reduction in disease prevalence to signal the move away from collectivism, he suggests.

Inglehart’s comments nevertheless highlight a weakness in the evidence so far assembled in support of the parasite-stress model. An association between disease prevalence and democracy does not necessarily mean that one drives the other. Some other factor may drive both the prevalence of disease in an area and its political system. In their 2009 paper, Thornhill and Fincher managed to eliminate some of the possible “confounders”. For example, they showed that neither a country’s overall wealth nor the way it is distributed can adequately explain the link between the prevalence of disease there and how democratic it is.

But many other possibilities remain. For example, pathogens tend to be more prevalent in the tropics, so perhaps warmer climates encourage collectivism. Also, many of the nations that score high for disease and low for democracy are in sub-Saharan Africa, and have a history of having been colonised, and of frequent conflict and foreign exploitation since independence. Might the near-constant threat of war better explain that region’s autocratic governments? There’s also the possibility that education and literacy would have an impact, since better educated people may be more likely to question authority and demand their rights to a democracy. Epidemiologist Valerie Curtis of the London School of Hygiene and Tropical Medicine thinks such factors might be the ones that count, and says the evidence so far does not make the parasite-stress theory any more persuasive than these explanations.

Furthermore, some nations buck the trend altogether. Take the US and Syria, for example: they have sharply contrasting political systems but an almost identical prevalence of disease. Though even the harshest critic of the theory would not expect a perfect correlation, such anomalies require some good explanations.

Also lacking so far in their analysis is a coherent account of how historical changes in the state of public health are linked to political change. If Thornhill’s theory is correct, improvements in a nation’s health should lead to noticeable changes in social outlook. Evidence consistent with this idea comes from the social revolution of the 1960s in much of western Europe and North America, which involved a shift from collectivist towards individualist thinking. This was preceded by improvements in public health in the years following the second world war – notably the introduction of penicillin, mass vaccination and better malaria control.

There are counter-examples, too. It is not clear whether the opening up of European society during the 18th century was preceded by any major improvements in people’s health, for example. Nor is there yet any clear evidence linking the current pro-democracy movements in the Middle East and north Africa to changes in disease prevalence. The theory also predicts that episodes such as the recent worldwide swine-flu epidemic should cause a shift away from democracy and towards authoritarian, collectivist attitudes. Yet as Holly Arrow, a psychologist at the University of Oregon in Eugene, points out, no effect has been recorded.

Mysterious mechanisms

To make the theory stick, Thornhill and his collaborators will also need to provide a mechanism for their proposed link between pathogens and politics. If cultural changes are responsible, young people might learn to avoid disease – and outsiders – from the behaviour of those around them. Alternatively, the reaction could be genetically hard-wired. So far, it has not proved possible to eliminate any of the possible mechanisms. “It’s an enormous set of unanswered questions. I expect it will take many years to explore,” Schaller says.

One possible genetic explanation involves 5-HTTLPR, a gene that regulates levels of the neurotransmitter serotonin. People carrying the short form of the gene are more likely to be anxious and to be fearful of health risks, relative to those with the long version. These behaviours could be a life-saver if they help people avoid situations that would put them at risk of infection, so it might be expected that the short version of the gene is favoured in parts of the world where disease risk is high. People with the longer version of 5-HTTLPR, on the other hand, tend to have higher levels of serotonin and are therefore more extrovert and more prone to risk-taking. This could bring advantages such as an increased capacity to innovate, so the long form of the gene should be more common in regions relatively free from illness.

That pattern is exactly what neuroscientists Joan Chiao and Katherine Blizinsky at Northwestern University in Evanston, Illinois, have reported in a paper published last year. Significantly, nations where the short version of the gene is more common also tend to have more collectivist attitudes (Proceedings of the Royal Society B, vol 277, p 529).

It is only tentative evidence, and some doubt that Chiao and Blizinsky’s findings are robust enough to support their conclusions (Proceedings of the Royal Society B, vol 278, p 329). But if the result pans out with further research, it suggests the behaviours involved in the parasite-stress model may be deeply ingrained in our genetic make-up, providing a hurdle to more rapid political change in certain areas. While no one is saying that groups with a higher proportion of short versions of the gene will never develop a democracy, the possibility that some societies are more genetically predisposed to it than others is nevertheless an uncomfortable idea to contemplate.

Should the biases turn out to be more temporary – if flexible psychological reactions to threat, or cultural learning, are the more important mechanisms – the debate might turn to potential implications of the theory. Projects aiming to improve medical care in poor countries might also lead a move to more democratic and open governments, for example, giving western governments another incentive to fund these schemes. “The way to develop a region is to emancipate it from parasites,” says Thornhill.

Remarks like that seem certain to attract flak. Curtis, for instance, bristled a little when New Scientist put the idea to her, pointing out that the immediate threat to human life is a pressing enough reason to be concerned about infectious disease.

Thornhill still has a huge amount of work ahead of him if he is to provide a convincing case that will assuage all of these doubts. In the meantime, his experience following publication of A Natural History of Rape has left him prepared for a hostile reception. “I had threats by email and phone,” he recalls. “You’re sometimes going to hurt people’s feelings. I consider it all in a day’s work.”

Jim Giles is a New Scientist correspondent based in San Francisco

The Controversy about Hypothesis Testing

From an interesting call for papers:

“Scientists spend a lot of time testing a hypothesis, and classifying experimental results as (in)significant evidence. But even after a century of hot debate, there is no consensus on what this concept of significance implies, how the results of hypothesis tests should be interpreted, and which practical pitfalls have to be avoided. Take the fierce criticisms of significance testing in economics, take the endless debate about statistical reform in psychology, take the foundational disagreement between frequentists and Bayesians about what constitutes statistical evidence.”

(Link to the conference here).
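To make the disagreement concrete, here is a minimal sketch (not part of the call for papers) that summarizes one simulated data set in both idioms: a frequentist p-value from a two-sample t-test, and a crude, BIC-based approximation to a Bayes factor. The simulated effect size, the sample sizes and the BIC shortcut are all illustrative assumptions; the point is only that the two summaries invite different readings of the same evidence.

```python
# Illustrative only: the same simulated data summarized by a p-value and by
# an approximate (BIC-based) Bayes factor. All parameters are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, 50)
treated = rng.normal(0.3, 1.0, 50)        # simulated effect of 0.3 sd

# Frequentist summary: two-sample t-test and its p-value.
t_stat, p_value = stats.ttest_ind(treated, control)

# Rough Bayesian counterpart: BIC approximation to the Bayes factor for
# "different means" (H1) against "one common mean" (H0).
def neg2_loglik(data, mean):
    """-2 * Gaussian log-likelihood, evaluated at the MLE variance."""
    resid = data - mean
    sigma2 = resid.var()
    return resid.size * (np.log(2 * np.pi * sigma2) + 1)

n = control.size + treated.size
pooled = np.concatenate([control, treated])
bic_h0 = neg2_loglik(pooled, pooled.mean()) + 2 * np.log(n)            # mean, variance
bic_h1 = (neg2_loglik(control, control.mean())
          + neg2_loglik(treated, treated.mean()) + 4 * np.log(n))      # 2 means, 2 variances
bf10 = np.exp((bic_h0 - bic_h1) / 2)       # > 1 favours "different means"

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, approximate BF10 = {bf10:.2f}")
```

A borderline p-value and a modest Bayes factor can point in different rhetorical directions, which is exactly the kind of interpretive gap the debate is about.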

Order in Chaos (FAPESP)

31/05/2011

By Elton Alisson

Researchers develop a theoretical model to explain, and to determine the conditions for, the occurrence of isochronal synchronization in chaotic systems. The study could lead to improvements in systems such as telecommunications.

Agência FAPESP – In nature, swarms of fireflies send light signals to one another. At first each insect flashes autonomously and independently, but under certain circumstances this can give rise to a robust collective phenomenon called synchronization. As a result, thousands of fireflies blink in unison, rhythmically, emitting light signals in sync with the rest of the swarm.

A little over 20 years ago it was discovered that synchronization also occurs in chaotic systems, complex systems with unpredictable behaviour found in the most varied fields, such as economics, climate and agriculture. A more recent discovery was that synchronization withstands delays in the propagation of the emitted signals.

In these situations, under certain circumstances, synchronization can emerge in its isochronal form, that is, with zero lag. This means that devices such as oscillators are perfectly synchronized in time even though each one receives delayed signals from the others. Until now, however, the theoretical models developed to explain the phenomenon had not taken this fact into account.

New research carried out by scientists at the Instituto Tecnológico de Aeronáutica (ITA) and the Instituto Nacional de Pesquisas Espaciais (Inpe) has produced a theoretical model that demonstrates how synchronization occurs when there is a delay in the transmission and reception of information between chaotic oscillators.

The results of the study, which can be used to improve technological systems, were published in April in the Journal of Physics A: Mathematical and Theoretical.

In the study, the researchers sought to explain synchronization when there is a delay in the exchange of information between chaotic oscillators. The goal is to determine the conditions under which the phenomenon occurs in real systems.

“Using Lyapunov-Krasovskii stability theory, which deals with the stability problem in dynamical systems, we established stability criteria that, from parameters such as the delay in the exchange of information between the oscillators, make it possible to determine whether the oscillators will reach a state of isochronal synchronization,” José Mario Vicensi Grzybowski, one of the authors of the paper, told Agência FAPESP.

“It was the first fully analytical demonstration of the stability of isochronal synchronization. There is nothing similar in the literature,” said Grzybowski, who is pursuing a doctorate in electronic and computer engineering at ITA with a FAPESP scholarship.
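The paper's contribution is analytical, but the phenomenon itself is easy to probe numerically. The sketch below is not the authors' derivation: it assumes two Lorenz oscillators coupled through the delayed difference of their x components, a form under which the zero-lag (isochronal) state is at least a solution, and simply measures whether the synchronization error shrinks for one illustrative choice of coupling strength K and delay tau. Whether it does shrink depends on exactly the kind of condition the Lyapunov-Krasovskii criteria are designed to settle.

```python
# Numerical probe of zero-lag synchronization between two delay-coupled
# chaotic oscillators. NOT the paper's analysis: the Lorenz system, the
# coupling K*(x_other(t - tau) - x_self(t - tau)) and all parameter values
# are illustrative assumptions; the script only reports the sync error.
import numpy as np

def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, tau, K, steps = 0.001, 0.1, 8.0, 200_000
d = int(tau / dt)                         # delay measured in integration steps

rng = np.random.default_rng(1)
x1 = np.zeros((steps, 3)); x2 = np.zeros((steps, 3))
x1[:d + 1] = rng.normal(0, 1, 3)          # constant, different initial histories
x2[:d + 1] = rng.normal(0, 1, 3)

for t in range(d, steps - 1):
    # Each oscillator is driven by the *delayed* difference of x components,
    # so the identically synchronized state makes the coupling vanish.
    c12 = K * (x2[t - d, 0] - x1[t - d, 0])
    c21 = K * (x1[t - d, 0] - x2[t - d, 0])
    x1[t + 1] = x1[t] + dt * (lorenz(x1[t]) + np.array([c12, 0.0, 0.0]))
    x2[t + 1] = x2[t] + dt * (lorenz(x2[t]) + np.array([c21, 0.0, 0.0]))

err = np.linalg.norm(x1 - x2, axis=1)     # zero-lag synchronization error
print(f"mean |x1 - x2|, early segment: {err[d:steps // 10].mean():.3f}")
print(f"mean |x1 - x2|, final segment: {err[-steps // 10:].mean():.3g}")
```

Sweeping K and tau in such a script maps out, by brute force, the same stability boundary that the analytical criteria characterize in closed form.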

The study's findings could make it possible to improve technological systems based on synchronization, especially chaos-based telecommunication systems.

Other possible applications include satellites flying in formation, where each one must maintain an appropriate relative distance from the others and, at the same time, establish a common reference (synchronization) that allows information to be exchanged and images collected by the different satellites in the formation to be electronically combined.

“In that case, the reference can be established through a phenomenon that emerges naturally as long as the appropriate conditions are provided, reducing or even eliminating the need for algorithms,” he said.

Natural complex networks

Unmanned aerial vehicles that explore a given region together, as well as robots and distributed control systems that also need to work in a coordinated way across a network, could make use of the research results.

The authors also intend to make synchronization occur in technological systems without the need for a leader that dictates how the other oscillating agents should behave.

“We intend to eliminate the figure of the leader and make synchronization arise from the interaction between the agents, as happens with a species of firefly in Asia that synchronizes without any individual taking the lead,” said Elbert Einstein Macau, a researcher at Inpe and another author of the study, which also involved Takashi Yoneyama of ITA.
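That kind of leaderless, firefly-style synchronization is classically illustrated by the Kuramoto model of coupled phase oscillators. The sketch below is only an analogy (it is not the ITA/Inpe model): every oscillator adjusts to the mean field generated by all the others, with no designated leader, and for sufficiently strong coupling the collective coherence r typically grows from near zero toward 1.

```python
# Leaderless synchronization through mutual interaction only, illustrated
# with the classic Kuramoto model. An analogy for the firefly behaviour
# described above, not the model used by the ITA/Inpe group.
import numpy as np

rng = np.random.default_rng(2)
N, K, dt, steps = 200, 2.0, 0.01, 5000
omega = rng.normal(0.0, 0.5, N)           # each "firefly" has its own natural frequency
theta = rng.uniform(0, 2 * np.pi, N)      # random initial phases

def order_parameter(phases):
    """r = 1 means all phases coincide; r near 0 means no collective rhythm."""
    return abs(np.exp(1j * phases).mean())

print(f"initial coherence r = {order_parameter(theta):.2f}")
for _ in range(steps):
    # Every oscillator adjusts to the mean field of all the others; no leader.
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta = theta + dt * (omega + coupling)
print(f"final coherence   r = {order_parameter(theta):.2f}")
```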

According to the researchers, this study analysed synchronization with a time delay in the transmission of information between two oscillators. In the work they are now developing, the results will be extended to a network of oscillators, scaling up both the problem and its solution.

In this way, they say, it will be possible to model phenomena based on isochronal synchronization at the network scale and to address natural phenomena whose complexity is many times greater.

“In principle, any real phenomenon based on isochronal synchronization can be treated with these theoretical elements, which can be used to design technological networks or to analyse and understand emergent behaviour in natural networks, even those we have no way of influencing directly,” said Grzybowski.

The article Stability of isochronal chaos synchronization (doi:10.1088/1751-8113/44/17/175103) can be read at http://iopscience.iop.org/1751-8121/44/17/175103/pdf/1751-8121_44_17_175103.pdf