Category archive: public opinion

>On Birth Certificates, Climate Risk and an Inconvenient Mind (N.Y. Times, Dot Earth Blog)

April 28, 2011, 9:23 AM
By ANDREW C. REVKIN

As Donald Trump tries to milk a last bit of publicity out of the failed “birther” challenge to President Obama, it’s worth reading a fresh take by an Australian psychologist on the deep roots of denial in people with fundamentalist passions of whatever stripe. Here’s an excerpt:

[I]deology trumps facts.
And it doesn’t matter what the ideology is, whether socialism, any brand of fundamentalist religion, or free-market extremism. The psychological literature shows quite consistently that a threat to one’s worldview is more than likely met by a dismissal of facts, however strong the evidence. Indeed, the stronger the evidence, the greater the threat — and hence the greater the denial.
In its own bizarre way, then, the rising noise level of climate denial provides further evidence that global warming resulting from human CO2 emissions is indeed a fact, however inconvenient it may be. Read the rest.
The piece, published today on the Australian news blog The Drum, is by Stephan Lewandowsky of the School of Psychology at the University of Western Australia.
Of course, just being aware that ideology can deeply skew how people filter facts and respond to risks raises the question of how to make progress in the face of the wide societal divisions this pattern creates.
It’s easy to forget that there’s been plenty of climate denial to go around. It took a decade for those seeking a rising price on carbon dioxide emissions as a means to transform American and global energy norms to realize that a price sufficient to drive the change was a political impossibility.
As a new paper in the Proceedings of the National Academy of Sciences found, even when greenhouse-gas emissions caps were put in place, trade with unregulated countries simply shifted the brunt of the emissions elsewhere.
When he was Britain’s prime minister, Tony Blair put it this way in 2005: “The blunt truth about the politics of climate change is that no country will want to sacrifice its economy in order to meet this challenge.”
My choice, of course, is to attack the two-pronged energy challenge the world faces with a sustained energy quest, nudged and nurtured from the top but mainly fostered from the ground up.
And I’m aware I still suffer from a hint of “scientism,” even “rational optimism,” in expecting that this argument can catch on, but so be it.
10:11 a.m. | Updated For much more on the behavioral factors that shape the human struggle over climate policy, I encourage you to explore “Living in Denial: Climate Change, Emotions, and Everyday Life,” a new book by Kari Marie Norgaard, a sociologist who has just moved from Whitman College to the University of Oregon.
Robert Brulle of Drexel University brought the book to my attention several months ago, and I invited him to do a Dot Earth “Book Report,” to kick off a discussion of Norgaard’s insights, which emerge from years of research she conducted on climate attitudes in a rural community in western Norway. (I’d first heard of Norgaard’s research while reporting my 2007 article on behavior and climate risk.)
(I also encourage you to read the review in the journal Nature Climate Change by Mike Hulme, a professor of climate at the University of East Anglia and the author of “Why We Disagree about Climate Change.”)
Here’s Brulle’s reaction to Norgaard’s book:
As a sociologist and longtime student of human responses to environmental problems, I’ve seen reams of analysis come and go on why we get some things right and some very wrong. A new book by Kari Norgaard has done the best job yet of cutting to the core on our seeming inability to grasp and meaningfully respond to human-driven climate change.
As the science of climate change has become stronger and its findings more dire, media coverage, public opinion, and government action on the issue have declined. At the same time, climate denial positions have become increasingly accepted, despite a lack of scientific evidence. Even among the public that accepts the science of global climate change, the dire circumstances we now face are consistently downplayed, and the logical implications of the scientific analysis, namely the necessity of swift and aggressive measures to combat climate change, are not followed through either intellectually or politically.
Instead, at best, a series of half measures have been proposed, which, though they may be comforting, are essentially symbolic and allow the status quo to continue unchanged; they will not adequately address global climate change. Attempts to address climate change have thus encountered significant cultural, political, and economic barriers that have not been overcome. While there have been several attempts to explain the lack of meaningful action regarding climate change, these models have not developed into an integrated and empirically supported approach. Additionally, many of these models are based in an individualistic perspective, and thus engage in a form of psychological reductionism. Finally, none of these models can coherently explain the interrelated phenomena regarding climate change that are occurring at the individual, small-group, institutional, and societal levels.
To move beyond the limitations of these approaches, Dr. Norgaard develops a sociological model that views the response to global climate change as a social process. One of the fundamental insights of sociology is that individuals are part of a larger structure of cultural and social interactions. Thus, through socialization processes, we construct certain ways of life and understandings of the world that guide our everyday interactions. Individuals become the carriers of the orientations and practices that constitute our social order. A disjuncture between our taken-for-granted way of living and new demands upon it, such as the behaviors necessitated by climate change, is experienced at the individual level as an identity threat, at the institutional level as a challenge to social cohesion, and at the societal level as a legitimation threat. When this occurs, powerful processes work at the psychological, institutional, and societal levels to maintain current orientations and ensure social stability. Taken together, these social processes create cultural and social stability. From the view of climate change, they also create a form of social inertia that inhibits rapid social change.
From this sociological perspective, Dr. Norgaard takes on the apparent paradox of climate change and public awareness: as our knowledge about the nature and seriousness of climate change has increased, our political and social engagement with the issue has declined. Why? Dr. Norgaard’s answer (crudely put) is that our personality structures and social norms are so thoroughly enmeshed with a growth economy based on fossil fuels that any consideration of the need to change our way of life to deal with climate change evokes powerful emotions of anxiety and desires to avoid the issue. This avoidance behavior is socially reinforced by collective group norms, as well as by the messages we receive from the mass media and the political elite. She develops this thesis through an impressive array of sociological theory, including the sociology of emotions, cultural sociology, and political economy. Additionally, she draws on specific theoretical approaches regarding the social denial of catastrophic risk, skillfully repurposing the literature on nuclear war and collective denial for the issue of climate change. This is a unique and insightful use of that literature, and her theoretical contribution is substantial and original. She then illustrates the process through a thick qualitative analysis based on participant observation in Norway, showing how the collective denial of climate change is accomplished in everyday conversation. This provides powerful ground-truth evidence for her theoretical framework.
This is an extremely important intellectual contribution. Research on climate change and culture has been primarily focused on individual attitudinal change. This work brings a sociological perspective to our understanding of individual and collective responses to climate change information, and opens up a new research area. It also has important practical implications. Most climate change communication efforts are based on conveying information to individuals. The assumption is that individuals will take in this information and then act rationally in their own interests. Dr. Norgaard’s analysis charts a different course. As she demonstrates, it is not a lack of information that inhibits action on climate change. Rather, the knowledge brings about unpleasant emotions and anxiety. Individuals and communities seek to restore a sense of equilibrium and stability, and thus engage in a form of denial in which, although the basic facts of climate change are acknowledged, the logical conclusions and actions that follow from the information are minimized and not acted upon. This perspective calls for a much different approach to climate change communications, and defines a new agenda for the field.

[Note: people interested in this line of argument should follow the work done by researchers at the Center for Research on Environmental Decisions (CRED), at Columbia University, @ http://cred.columbia.edu.] 


>Climategate: What Really Happened? (Mother Jones)


>The Science of Why We Don’t Believe Science (Mother Jones)


Illustration: Jonathon Rosen
How our brains fool us on climate, creationism, and the vaccine-autism link.

— By Chris Mooney
Mon Apr. 18, 2011 3:00 AM PDT

“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger, in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie on the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

“We apply fight-or-flight reflexes not only to predators, but to data itself.”

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers. Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.

“Scientific evidence is highly susceptible to misinterpretation. Giving ideologues scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.”

Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment, pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes, and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research, individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study, hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

“Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.”

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views, and with it their sense of the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society. Egalitarian communitarians, by contrast, tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan.

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

“One study showed that not even Bush’s own words could change the minds of Bush voters who believed there was an Iraq-Al Qaeda link.”

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those? 

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org, a team at Ohio State presented subjects with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to correct the myth that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

“A predictor of whether you accept the science of global warming? Whether you’re a Republican or a Democrat.”

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey, for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz, director of the Yale Project on Climate Change Communication. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

“Is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism.”

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr.) and numerous Hollywood celebrities (most notably Jenny McCarthy and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin, author of the new book The Panic Virus, notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper—which originated the current vaccine scare—was retracted and he subsequently lost his license (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study, he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—“Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.

[Original link with access to mentioned studies here.]

>The Cost of Belo Monte (JC, O Globo)

>
JC e-mail 4240, April 18, 2011.

Article by Felício Pontes Jr.* in the newspaper O Globo this Monday (18).

Solar power technology has always been presented as high-cost, far more expensive than other energy sources. Because of this, a country like Brazil, blessed with high solar incidence across its territory, declined to invest in solar technology in favor of other sources, chiefly hydropower, which today generates more than 70% of the country's electricity. That argument, the one about high costs, no longer holds.

In the United States, two solar-thermal projects under development in California, Ivanpah and Blythe, which use mirrors to concentrate heat, show that the costs of this technology are already considerably lower. The Ivanpah project, by the company BrightSource, will double the country's solar power output, with a projected 370 MW of firm generation; its three plants will cost a total of R$ 3.4 billion. The Blythe project, by Chevron and Solar Millennium, intends to produce 960 MW at a cost of R$ 9.6 billion.

If we multiplied the cost of generating one megawatt in these two solar projects by 4,000 average megawatts (an optimistic estimate of the power the Belo Monte hydroelectric project would generate), we would arrive at a total of R$ 38 billion in the case of Ivanpah, and R$ 36.7 billion using the figures for Blythe.

In the first lawsuit against Belo Monte, filed in 2001, the government said the dam would cost R$ 10.4 billion. When applying for a BNDES loan in 2011, the consortium of companies building Belo Monte requested R$ 25 billion, which would represent about 80% of the cost; the official cost would therefore be R$ 31.2 billion. That figure does not include the cost of deforestation, which may reach 5,300 km² of forest (according to the consortium itself); the 100 km stretch of the Xingu riverbed that will be left practically dry; compensation for the indigenous peoples and riverside communities along that stretch; or the neighborhoods of Altamira that lie below the 100-meter elevation line and will therefore be flooded, to cite just a few examples.

The final costs of Belo Monte remain uncertain, thanks to violations of environmental licensing law at several points. As the risk-analysis report prepared by specialists and titled "Megaprojeto, Megarriscos" (Megaproject, Mega-risks) pointed out, Belo Monte carries high risks tied to uncertainty about the project's construction costs, stemming from geological and topographical factors, engineering factors, and instability in market prices. It carries high financial risks related to its power-generating capacity, which is far below its installed capacity. And it carries risks tied to the developer's ability to meet its legal obligations to invest in mitigating and compensating for the project's social and environmental impacts.

Thus, once we count all the socio-environmental costs normally left out of the budgets of Amazonian hydroelectric dams (see Tucuruí, Jirau, Santo Antônio, and Balbina), plus the uncertain costs of the construction itself (such as excavation), we can say that solar power is already cost-competitive with Belo Monte. If it were not, some of the world's largest companies would not be entering the field. The EBX Group is investing in the country's first commercial solar plant, MPX Solar, in Ceará, with 4,400 photovoltaic panels and the capacity to supply 1,500 homes. And Google is investing US$ 168 million in the Ivanpah project.

Yet while temperate countries with much smaller territories, such as Germany and Spain, produce more solar power than Brazil, here the government insists on an outdated model, one that no longer even has the advantage of being cheaper.

At Belo Monte, dear investors, rest assured that all these socio-environmental costs will be charged if the dam is built.

*Federal prosecutor in Pará.
(O Globo)

>Middle Class Will Set the Election Agenda, Analysts Say (OESP)

>
For specialists, the question FHC raised in an article on how to win over the emerging class will be decisive for politicians' futures

OESP, April 16, 2011
By Gabriel Manzano

The "new middle class," brought to the center of political debate last week by former president Fernando Henrique Cardoso, and courted by the PT, which sees in President Dilma Rousseff the figure best suited to win it over, has arrived to change the country's electoral landscape, analysts, campaign strategists, and scholars concede.

The subject appeared in FHC's article "O Papel da Oposição" (The Role of the Opposition) and reinforced the group's standing as the political world's object of desire. It is a vast universe of 29 million people: poor Brazilians who, over the past six years, climbed from class D to class C and carry with them new behaviors and expectations. Analysts, party leaders, communication specialists, and campaign strategists are already working to understand how this segment will behave in the future; by moving up in life, it made the middle class the country's largest social group, with 94 million people (51% of the population).

"These are not people with nothing who will accept anything. They worked hard, moved up, know what they want, are better informed, and are becoming more demanding," sums up Marcia Cavallari, executive director of Ibope. "That calls for a new discourse, and FHC was right to tell the opposition to go after them," she said.

Not by chance, the economist Marcelo Néri of Fundação Getúlio Vargas, the first to detect the phenomenon, in a 2010 study, considers Fernando Henrique's initiative "the opposition's second most intelligent idea in years, after the stabilization plan of 1994-2002." This Brazilian, he says, "wants to dream, not merely to shrink his nightmares."

The impact of this scenario is already being felt in the political world, which is still trying to make sense of the enormous vote for candidate Marina Silva (PV) in the 2010 presidential election. "But it is a waste of time to try to guess whether this group is left-wing or right-wing," observes Antonio Prado, managing partner of Análise, Pesquisa e Planejamento de Mercado (APPM), in São Paulo.

Opportunities. A large share of these emerging voters, Prado says, "are citizens who took initiative, sought credit, and became micro-entrepreneurs." Their children are entering university through ProUni. "As workers, they do not want a state that patronizes them, but one that gives them opportunities to grow." And as citizens, the analyst continues, they expect "order in society, so that no swindler can cheat them"; after all, they worked too hard to get where they are. From politicians, this voter expects "consistency and dedication to the common good."

For anyone who imagines that all this has a somewhat right-wing flavor, Prado warns that "this Brazilian has been poor and understands that a priority task of the state is to attack inequality." In other words, the new class favors social programs.

Solange Ferreira Luz, a saleswoman living on the outskirts of São Paulo, is a typical example of this better-informed, more demanding new voter. "My biggest concern is my two children's schooling," she says. So much so that she saved up to buy a computer and prefers that they study at state technical schools, which strike her as better than the municipal ones.

For this voter, rhetoric about "the 500-year-old elite" or about neoliberalism carries less weight. What interests him more, Marcia Cavallari notes, "is that there are jobs and he lacks the training to apply for many of them. So the quality of education becomes a decisive factor in his life, so that he can learn and move up. And he wants his children to reach university and have a better life than his own. That makes a serious debate about the level of education in Brazil inevitable in the coming elections."

This new outlook extends to other sectors. "For these socially emerging voters, everything is new. They are already traveling by plane, and the airports are in the state they are in. A child at university will be better able to judge the quality of education," compares Renato Meirelles, director of the Datapopular institute, which studies Brazil's mass-consumer market. He cites, on this point, surveys showing that in class C, 68% of children have more schooling than their parents; in class A, the figure is only 10%.

The rabble and the strivers. The limits of this scenario cannot be ignored, however. First, because the "new" arrivals are joining an enormous existing middle class and may, of course, absorb its projects and values in everyday life. The term "new middle class," warns Leôncio Martins Rodrigues, "designates sectors that expanded their capacity to consume" but "does not define a genuinely new social segment."

The sociologist Jessé Souza goes so far as to deny that a new middle class exists: what exist are those he calls "batalhadores" (strivers), a multitude that may either be "co-opted by individualist discourse and practice" or "take on a protagonist's role and help the 'ralé'," the unassisted masses. Fernando Henrique himself also notes that a class implies a lifestyle and values, and prefers to speak of "new social categories."

Marcelo Néri also stresses that "neither politically nor economically has anything been won with this public, by the PT or by the opposition." Moreover, "everyone stands to lose from inflation, if it returns, and from unemployment."

>Seminar Marks World Meteorological Day

>
JC e-mail 4222, March 22, 2011.

Talks address climate change, technological advances in meteorology, and disaster risk management.

The Instituto Nacional de Meteorologia (Inmet) will hold the seminar "Climate for You" tomorrow (23) at its headquarters in Brasília; the theme was set at the 61st Session of the Executive Council of the World Meteorological Organization (WMO) for this year's World Meteorological Day celebrations. Five talks will address subjects of great current relevance, such as climate change, technological advances in meteorology, and natural-disaster risk management.

The date commemorates the founding of the WMO, in 1950, as a specialized United Nations (UN) agency dealing with three elements fundamental to humanity: climate, weather, and water.

In a message addressed to the Organization's 189 member countries for World Meteorological Day, WMO Secretary-General Michel Jarraud said that the WMO's climate-related activities are now seen as fundamental to human safety and well-being and to securing economic benefits for all nations.

Antonio Divino Moura, director of Inmet and third vice-president of the WMO, says that "climate and humankind are deeply interconnected. There is a reciprocal relationship between them: each affects the other. The way of life of people in the tropics and in the middle and polar latitudes is directly tied to climate. Conversely, humans can alter the climate through actions that change elements of nature, such as the burning of fossil fuels; the climate, in turn, reacts. Today, with a degree of technological mastery on a planetary scale, humankind is altering its own climate and suffering the consequences."
(Ascom Inmet)

>Palocci Will Try to Improve the Climate (JC, O Globo)

>
Minister takes command of climate-change policy to end infighting.

JC e-mail 4218, March 16, 2011.

The head of the Casa Civil (the presidential chief of staff's office), Antonio Palocci, has taken command of the agenda gaining the most prominence in the government's environmental area: climate-change policy, which since the 2009 Copenhagen summit has earned Brazil international prestige. The change of course unsettled the two main ministries responsible for the subject: Environment (MMA) and Science and Technology (MCT). Yesterday brought the first casualty at the Environment Ministry, with the departure of the national secretary for Climate Change, Branca Americano, to be replaced by the Embrapa researcher Eduardo Assad.

According to Environment Minister Izabella Teixeira, who spoke yesterday to business leaders and state-government technical staff, the idea is to end the constant disagreements the climate agenda was causing among ministries and to achieve alignment. Conflicting views across the Esplanada (the federal ministries) have already produced fights and embarrassments.

- With Rio+20 coming, we have to treat environmental issues differently from the way we have been treating them. We need a different governance structure, and climate change is the flagship of that discussion. We are working out the best format with the MCT and the Casa Civil under a new model of governance for the climate agenda, at the request of the Casa Civil and Minister Palocci. The Casa Civil will be the conductor, with the MCT and the MMA as the other two legs. We want to do away with the silos, so that there is convergence with the national agenda - said Izabella.

Department will consolidate portfolios

Within the MMA, the secretariat will become a super-department, expected to be called the Climate Secretariat, taking on new portfolios such as anti-deforestation policy, biodiversity conservation, and the management of forests and water resources.

Under Environment Minister Carlos Minc, in 2009, the MMA clashed with the Itamaraty (the Foreign Ministry) and the MCT. Minc pushed for Brazil to adopt emissions-reduction targets, while the other two ministries took a more conservative position. In the areas to be absorbed by the new secretariat, staff complain that Izabella did not consult those most affected about the new direction their work is to take.

At Science and Technology, staff working on climate change are resisting the restructuring set in motion by the appointment of the researcher Carlos Nobre to head the Secretariat for Research and Development Policies and Programs. For the conference in Bangkok in April, one of the preparatory meetings for the UN's annual summit, the MCT still has no team to send. One change Nobre has already announced touches the ministry's crown jewel in this area: the Clean Development Mechanism (CDM), which accredits emissions-reduction projects to earn credits tradable on the carbon market.

The shift answers one of the main criticisms from developers of emissions-reduction projects: that the approval process for such proposals is excessively bureaucratic.

- We will take a fresh look at the CDM. We will relax the rules and make it more agile. Brazil is in a position to lead the transition to a low-carbon economy, alongside Scandinavia and Germany - the secretary said.

For Adalberto Veríssimo, a researcher at the Instituto do Homem e Meio Ambiente da Amazônia (Imazon), the government's decision to put the Casa Civil in charge of coordinating climate policy is good news. He argues that global warming is a problem that cuts across policy areas and therefore belongs at the top of the executive hierarchy.

- Putting Palocci in charge of climate change is the right call. It is a cross-cutting issue that matters to several areas, such as Mines and Energy, Transport, and Agriculture. It is a task that will require balance. Global warming is one of the pillars of this decade's debate. Placing it in the Casa Civil is a sign that Brazil wants to keep advancing in this area - he said.

On the ministry changes, Veríssimo said Nobre's arrival "oxygenates" the debate inside the MCT, which, in his view, had been staffed by retrograde cadres.

- I heard the MCT and the MMA speaking the same language. It was the first time I ever saw that happen - said Veríssimo.

Marina criticizes licensing plans

Former senator Marina Silva (PV) yesterday criticized the federal government's idea of loosening environmental licensing to speed up infrastructure projects. She spoke before learning that Palocci would take over the climate agenda.

- I view this talk of changing the environmental licensing process with concern. Any change of that nature, in the direction of loosening, will only aggravate the problems we are living through. Licensing plays an important role in reducing and minimizing a project's environmental impact - said the defeated presidential candidate, after taking part in a keynote lecture for the graduate program of the Instituto Nacional de Pesquisas Espaciais (Inpe), in São José dos Campos.

The government's proposals will be implemented through decrees regulating the licensing of highways, ports, power transmission lines, waterways, and pre-salt oil exploration projects.
(O Globo)

>A New Hurricane in Brazil (JC, O Globo)

>
Inmet, the Navy, and Inpe disagree over the storm Arani, which is hitting the coast.

JC e-mail 4218, March 16, 2011.

A weather phenomenon that has brought intense rain from northern Rio de Janeiro state to southern Bahia has divided the country's main meteorological agencies. The Instituto Nacional de Meteorologia (Inmet) calls Arani, as it has been named, a hurricane; in a special alert, it reported winds of up to 120 km/h over the Atlantic Ocean.

That diagnosis is shared neither by the Brazilian Navy, which classifies the same phenomenon as a subtropical storm, one rung lower on the severity scale, nor by the Instituto Nacional de Pesquisas Espaciais (Inpe), which says it is a tropical depression, another rung down in dangerousness.

Phenomenon moves away from the Brazilian coast

Arani ("furious weather" in Tupi) formed from the combination of warm water and warm air in an area of strong instability near the coast of Espírito Santo. The system produced cyclonic wind circulation and heavy rainfall in that state. The danger was limited because the formation sits over the open sea and, over the next two days, should head southeast, moving even farther from the Brazilian coast.

According to Inmet, Arani gained strength as it moved away from the coast, taking on the characteristics of a hybrid hurricane. It is a different kind of formation from those that routinely devastate the Caribbean and the North Atlantic: rather than an independent system fed by the warming of ocean waters, it is associated with a cyclone spawned by a cold front.

The hurricane is 110 kilometers off the Brazilian coast and threatens only vessels and aircraft crossing the region of Cabo de São Tomé, on the Rio coast, which lies on its path out to sea. In the coming days Arani should reach international waters, where monitoring will fall to South Africa.

Inmet classified the phenomenon with help from American hurricane-monitoring agencies. According to meteorologist Morgana Almeida of the institute's team, there is no risk that the system's current track will reverse and cause damage on the continent. The institute alerted Navy authorities, who took steps to keep traffic out of the area hit by the strong winds.

But the Navy's own Meteorological Service classifies Arani differently. It measured gusts of at most 80 km/h. There is heavy precipitation over the open sea, but the waves it produces, of 3 to 4 meters, are no larger than those generated by a cold front.

- Formations like this are not common, but they can occur in summer - notes meteorologist Caroline Vidal Ferreira da Guia, of Inpe. - Arani is strong enough to cause disruption for the population but, according to our measurements, it does not amount to a hurricane.
(O Globo)

>Living in Denial: Climate Change, Emotions, and Everyday Life (MIT Press)

>
Book release (April 2011, MIT Press):

Kari Marie Norgaard

Global warming is the most significant environmental issue of our time, yet public response in Western nations has been meager. Why have so few taken any action? In Living in Denial, sociologist Kari Norgaard searches for answers to this question, drawing on interviews and ethnographic data from her study of “Bygdaby,” the fictional name of an actual rural community in western Norway, during the unusually warm winter of 2001-2002.

In 2001-2002 the first snowfall came to Bygdaby two months later than usual; ice fishing was impossible; and the ski industry had to invest substantially in artificial snow-making. Stories in local and national newspapers linked the warm winter explicitly to global warming. Yet residents did not write letters to the editor, pressure politicians, or cut down on use of fossil fuels. Norgaard attributes this lack of response to the phenomenon of socially organized denial, by which information about climate science is known in the abstract but disconnected from political, social, and private life, and sees this as emblematic of how citizens of industrialized countries are responding to global warming.

Norgaard finds that for the highly educated and politically savvy residents of Bygdaby, global warming was both common knowledge and unimaginable. Norgaard traces this denial through multiple levels, from emotions to cultural norms to political economy. Her report from Bygdaby, supplemented by comparisons throughout the book to the United States, tells a larger story behind our paralysis in the face of today’s alarming predictions from climate scientists.

About the Author

Kari Marie Norgaard is Assistant Professor of Sociology at the University of Oregon.

>Can a group of scientists in California end the war on climate change? (Guardian)

>
The Berkeley Earth project say they are about to reveal the definitive truth about global warming

Ian Sample
guardian.co.uk
Sunday 27 February 2011 20.29 GMT

Richard Muller of the Berkeley Earth project is convinced his approach will lead to a better assessment of how much the world is warming. Photograph: Dan Tuffs for the Guardian

In 1964, Richard Muller, a 20-year-old graduate student with neat-cropped hair, walked into Sproul Hall at the University of California, Berkeley, and joined a mass protest of unprecedented scale. The activists, a few thousand strong, demanded that the university lift a ban on free speech and ease restrictions on academic freedom, while outside on the steps a young folk-singer called Joan Baez led supporters in a chorus of We Shall Overcome. The sit-in ended two days later when police stormed the building in the early hours and arrested hundreds of students. Muller was thrown into Oakland jail. The heavy-handedness sparked further unrest and, a month later, the university administration backed down. The protest was a pivotal moment for the civil liberties movement and marked Berkeley as a haven of free thinking and fierce independence.

Today, Muller is still on the Berkeley campus, probably the only member of the free speech movement arrested that night to end up with a faculty position there – as a professor of physics. His list of publications is testament to the free rein of tenure: he worked on the first light from the big bang, proposed a new theory of ice ages, and found evidence for an upturn in impact craters on the moon. His expertise is highly sought after. For more than 30 years, he was a member of the independent Jason group that advises the US government on defence; his college lecture series, Physics for Future Presidents, was voted best class on campus, went stratospheric on YouTube and, in 2009, was turned into a bestseller.

For the past year, Muller has kept a low profile, working quietly on a new project with a team of academics hand-picked for their skills. They meet on campus regularly, to check progress, thrash out problems and hunt for oversights that might undermine their work. And for good reason. When Muller and his team go public with their findings in a few weeks, they will be muscling in on the ugliest and most hard-fought debate of modern times.

Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking is easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet’s temperature.

Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller’s dream seem so ambitious, or perhaps, so naive.

“We are bringing the spirit of science back to a subject that has become too argumentative and too contentious,” Muller says, over a cup of tea. “We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find.” Why does Muller feel compelled to shake up the world of climate change? “We are doing this because it is the most important project in the world today. Nothing else comes close,” he says.

Muller is moving into crowded territory with sharp elbows. There are already three heavyweight groups that could be considered the official keepers of the world’s climate data. Each publishes its own figures that feed into the UN’s Intergovernmental Panel on Climate Change. Nasa’s Goddard Institute for Space Studies in New York City produces a rolling estimate of the world’s warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth’s mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75C.

You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.

Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia’s Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller’s nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. “With CRU’s credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics,” says Muller.

This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. “Scientists will jump to the defence of alarmists because they don’t recognise that the alarmists are exaggerating,” Muller says.

The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. “You can think of statisticians as the keepers of the scientific method,” Brillinger told me. “Can scientists and doctors reasonably draw the conclusions they are setting down? That’s what we’re here for.”

For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious “dark energy” that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.

Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde’s achievement to Hercules’s enormous task of cleaning the Augean stables.
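
The compile-and-de-duplicate step Rohde’s software performs can be sketched in miniature. This is a purely illustrative toy, not Berkeley Earth’s actual code; the station IDs, data layout and first-source-wins rule are all assumptions made for the example.

```python
# Toy sketch of merging overlapping temperature datasets into one
# de-duplicated record. Station IDs and the merge rule are hypothetical.

def merge_records(datasets):
    """Merge sources of (station_id, year, month, temp_c) tuples,
    keeping one reading per station-month (first source wins)."""
    merged = {}
    for source in datasets:
        for station_id, year, month, temp_c in source:
            key = (station_id, year, month)
            if key not in merged:  # duplicate station-months are dropped
                merged[key] = temp_c
    return merged

# Two overlapping sources that both report ST001 for January 1900
noaa_like = [("ST001", 1900, 1, -2.3), ("ST001", 1900, 2, -1.1)]
cru_like = [("ST001", 1900, 1, -2.3), ("ST002", 1900, 1, 4.5)]

combined = merge_records([noaa_like, cru_like])
print(len(combined))  # 3 unique station-months; the duplicate is removed
```

The real task is vastly harder (16 datasets, 14 formats, stations that change ID over time), but the core idea is the same: collapse overlapping sources onto a single key per station and month.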

The wealth of data Rohde has collected so far – and some of it dates back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, Muller claims, be a priceless resource in itself for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.

Publishing an extensive set of temperature records is the first goal of Muller’s project. The second is to turn this vast haul of data into an assessment of global warming. Here, the Berkeley team is going its own way again. The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperature records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller’s team will take temperature records from individual stations and weight them according to how reliable they are.
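
The gridded approach used by the big three groups can be sketched as follows. The five-degree cell size and the sample readings are assumptions for illustration only; the real methods also weight cells by area and work with anomalies rather than raw temperatures.

```python
import math

# Illustrative sketch of grid-cell averaging: bin one month's station
# readings into lat/lon boxes and average each box. Cell size and data
# are hypothetical.

def grid_average(records, cell_deg=5.0):
    """records: list of (lat, lon, temp_c) for one month.
    Returns {grid cell: mean temperature} for each occupied cell."""
    cells = {}
    for lat, lon, temp in records:
        cell = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells.setdefault(cell, []).append(temp)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}

# Three English stations fall in one 5-degree cell; a Madrid-like
# station falls in another
month = [(51.5, -0.1, 6.2), (53.4, -2.2, 5.8), (52.9, -1.1, 6.0),
         (40.4, -3.7, 9.5)]
means = grid_average(month)
print(len(means))  # 2 occupied cells
```

Berkeley’s alternative, as described above, skips the grid and instead weights each station’s record individually by its estimated reliability.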

This is where the Berkeley group faces its toughest task by far, and it is on this that the project will be judged. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.

Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in celsius instead of fahrenheit. The times at which measurements are taken vary, from say 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station from wind and sun more in one year than the next. Each of these interferes with a station’s temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.

This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn’t rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.
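
One flavour of automatic correction the article describes is homogenisation: detecting an abrupt step in a station’s record (say, after an instrument swap) and removing the offset without human intervention. The sketch below is a deliberately simple stand-in, with an invented window size and threshold; the algorithms Berkeley Earth actually uses are far more sophisticated.

```python
# Toy homogenisation sketch: find the largest jump between the means of
# two adjacent windows and, if it is big enough, subtract it from the
# later segment. Window and threshold values are illustrative guesses.

def correct_step(series, window=5, threshold=1.0):
    """Return a copy of series with the single largest step change
    removed, if that step exceeds threshold (degrees C)."""
    best_i, best_jump = None, 0.0
    for i in range(window, len(series) - window + 1):
        before = sum(series[i - window:i]) / window
        after = sum(series[i:i + window]) / window
        if abs(after - before) > abs(best_jump):
            best_i, best_jump = i, after - before
    if best_i is not None and abs(best_jump) > threshold:
        return series[:best_i] + [x - best_jump for x in series[best_i:]]
    return list(series)

# A station that suddenly reads about 2 degrees warmer after year 5
raw = [10.0, 10.1, 9.9, 10.0, 10.1, 12.0, 12.1, 11.9, 12.0, 12.1]
fixed = correct_step(raw)
```

The appeal of an automated rule like this, in Muller’s terms, is that the same correction is applied everywhere without a human deciding case by case; the risk, and the focus of the coming scrutiny, is whether the rule removes real climate signal along with the bias.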

Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. “I’ve told the team I don’t know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else,” says Muller. “Science has its weaknesses and it doesn’t have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have.”

He will find out soon enough if his hopes to forge a true consensus on climate change are misplaced. It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn’t followed the project either, but said “anything that [Muller] does will be well done”. Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn’t comment.

Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley’s global warming assessment and those from the other groups. “We have enough trouble communicating with the public already,” Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.

Peter Thorne, who left the Met Office’s Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller’s claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. “Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn’t give you much more bang for your buck,” he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.

Despite his reservations, Thorne says climate science stands to benefit from Muller’s project. “We need groups like Berkeley stepping up to the plate and taking this challenge on, because it’s the only way we’re going to move forwards. I wish there were 10 other groups doing this,” he says.

For the time being, Muller’s project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them “without advocacy or agenda”. Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy’s Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a “kingpin of climate science denial”. On this point, Muller says the project has taken money from right and left alike.

No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. “As new kids on the block, I think they will be given a favourable view by people, but I don’t think it will fundamentally change people’s minds,” says Thorne. Brillinger has reservations too. “There are people you are never going to change. They have their beliefs and they’re not going to back away from them.”

Walking across the Berkeley campus, Muller stops outside Sproul Hall, where he was arrested more than 40 years ago. Today, the adjoining plaza is a designated protest spot, where student activists gather to wave banners, set up tables and make speeches on any cause they choose. Does Muller think his latest project will make any difference? “Maybe we’ll find out that what the other groups do is absolutely right, but we’re doing this in a new way. If the only thing we do is allow a consensus to be reached as to what is going on with global warming, a true consensus, not one based on politics, then it will be an enormously valuable achievement.”

>Can Geoengineering Save the World from Global Warming? (Scientific American)

>
Ask the Experts | Energy & Sustainability
Scientific American

Is manipulating Earth’s environment to combat climate change a good idea–and where, exactly, did the idea come from?

By David Biello | February 25, 2011

STARFISH PRIME: This nighttime atmospheric nuclear weapons test generated an aurora (pictured) in Earth’s magnetic field, along with an electromagnetic pulse that blew out streetlights in Honolulu. It is seen as an early instance of geoengineering by science historian James Fleming. Image: Courtesy of US Govt. Defense Threat Reduction Agency

As efforts to combat climate change falter despite ever-rising concentrations of heat-trapping gases in the atmosphere, some scientists and other experts have begun to consider the possibility of using so-called geoengineering to fix the problem. Such “deliberate, large-scale manipulation of the planetary environment”, as the Royal Society of London puts it, is fraught with peril, of course.

For example, one of the first scientists to predict global warming as a result of increasing concentrations of greenhouse gases in the atmosphere—Swedish chemist Svante Arrhenius—thought this might be a good way to ameliorate the winters of his native land and increase its growing season. Whereas that may come true for the human inhabitants of Scandinavia, polar plants and animals are suffering as sea ice dwindles and temperatures warm even faster than climatologists predicted.

Scientific American corresponded with science historian James Fleming of Colby College in Maine, author of Fixing the Sky: The Checkered History of Weather and Climate Control, about the history of geoengineering—ranging from filling the air with the artificial aftermath of a volcanic eruption to seeding the oceans with iron in order to promote plankton growth—and whether it might save humanity from the ill effects of climate change.

[An edited transcript of the interview follows.]

What is geoengineering in your view?
Geoengineering is planetary-scale intervention [in]—or tinkering with—planetary processes. Period.

As I write in my book, Fixing the Sky: The Checkered History of Weather and Climate Control, “the term ‘geoengineering’ remains largely undefined,” but is loosely, “the intentional large-scale manipulation of the global environment; planetary tinkering; a subset of terraforming or planetary engineering.”

As of June 2010 the term has a draft entry in the Oxford English Dictionary—the modification of the global environment or the climate in order to counter or ameliorate climate change. A 2009 report issued by the Royal Society of London defines geoengineering as “the deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change.”

But there are significant problems with both definitions. First of all, an engineering practice defined by its scale (geo) need not be constrained by its stated purpose (environmental improvement), by any of its currently proposed techniques (stratospheric aerosols, space mirrors, etcetera) or by one of perhaps many stated goals (to ameliorate or counteract climate change). Nuclear engineers, for example, are capable of building both power plants and bombs; mechanical engineers can design components for both ambulances and tanks. So to constrain the essence of something by its stated purpose, techniques or goals is misleading at best.

Geo-scale engineering projects were conducted by both the U.S. and the Soviet Union between 1958 and 1962 that had nothing to do with countering or ameliorating climate change. Starting with the [U.S.’s] 1958 Argus A-bomb explosions in space and ending with the 1962 Starfish Prime H-bomb test, the militaries of both nations sought to modify the global environment for military purposes.

Project Argus was a top-secret military test aimed at detonating atomic bombs in space to generate an artificial radiation belt, disrupt the near-space environment, and possibly intercept enemy missiles. It, and the later tests conducted by both the U.S. and the Soviet Union, peaked with H-bomb detonations in space in 1962 that created an artificial [electro]magnetic [radiation] belt that persisted for 10 years. This is geoengineering.

This idea of detonating bombs in near-space was proposed in 1957 by Nicholas Christofilos, a physicist at Lawrence Berkeley National Laboratory. His hypothesis, which was pursued by the [U.S.] Department of Defense’s Advanced Research Projects Agency [subsequently known as DARPA] and tested in Project Argus and other nuclear shots, held that the debris from a nuclear explosion, mainly highly energetic electrons, would be contained within lines of force in Earth’s magnetic field and would travel almost instantly as a giant current spanning up to half a hemisphere. Thus, if a detonation occurred above a point in the South Atlantic, immense currents would flow along the magnetic lines to a point far to the north, such as Greenland, where they would severely disrupt radio communications. A shot in the Indian Ocean might, then, generate a huge electromagnetic pulse over Moscow. In addition to providing a planetary “energy ray,” Christofilos thought nuclear shots in space might also disrupt military communications, destroy satellites and the electronic guidance systems of enemy [intercontinental ballistic missiles], and possibly kill any military cosmonauts participating in an attack launched from space. He proposed thousands of them to make a space shield.

So nuclear explosions in space by the U.S. and the Soviet Union constituted some of the earliest attempts at geoengineering, or intentional human intervention in planetary-scale processes.

The neologism “geoengineer” refers to one who contrives, designs or invents at the largest planetary scale possible for either military or civilian purposes. Today, geoengineering, as an unpracticed art, may be considered “geoscientific speculation”. Geoengineering is a subset of terraformation, which also does not exist outside of the fantasies of some engineers.

I have recently written to the Oxford English Dictionary asking them to correct their draft definition.

Can geoengineering save the world from climate change?
In short, I think it may be infinitely more dangerous than climate change, largely due to the suspicion and social disruption it would trigger by changing humanity’s relationship to nature.

To take just one example from my book, on page 194: “Sarnoff Predicts Weather Control” read the headline on the front page of The New York Times on October 1, 1946. The previous evening, at his testimonial dinner at the Waldorf Astoria, RCA president Brig. Gen. David Sarnoff had speculated on worthy peaceful projects for the postwar era. Among them were “transformations of deserts into gardens through diversion of ocean currents,” a technique that could also be reversed in time of war to turn fertile lands into deserts, and ordering “rain or sunshine by pressing radio buttons,” an accomplishment that, Sarnoff declared, would require a “World Weather Bureau” in charge of global forecasting and control (much like the “Weather Distributing Administration” proposed in 1938). A commentator in The New Yorker intuited the problems with such control: “Who” in this civil service outfit, he asked, “would decide whether a day was to be sunny, rainy, overcast…or enriched by a stimulating blizzard?” It would be “some befuddled functionary,” probably bedeviled by special interests such as the raincoat and galoshes manufacturers, the beachwear and sunburn lotion industries, and resort owners and farmers. Or if a storm was to be diverted—”Detour it where? Out to sea, to hit some ship with no influence in Washington?”

How old is the idea of geoengineering? What other names has it had?
I can trace geoengineering’s direct modern legacy to 1945, and have prepared a table of such proposals and efforts for the [Government Accountability Office]. Nuclear weapons, digital computers and satellites seem to be the modern technologies of choice. Geoengineering has also been called terraformation and, more restrictively, climate engineering, climate intervention or climate modification. Many have proposed abandoning the term geoengineering in favor of solar radiation management and carbon (or carbon dioxide) capture and storage. Of course, the idea of control of nature is ancient—for example, Phaeton or Archimedes.

Phaeton, the son of Helios, received permission from his father [the Greek sun god] to drive the sun chariot, but failed to control it, putting the Earth in danger of burning up. He was killed by a thunderbolt from Zeus to prevent further disaster. Recently, a prominent meteorologist has written about climate control and urged us to “take up Phaeton’s reins,” which is not a good idea.

Archimedes is known as an engineer who said: “Give me a lever long enough and a place to stand, and I will move the Earth.” Some geoengineers think that this is now possible and that science and technology have given us an Archimedean set of levers with which to move the planet. But I ask: “Where will it roll if you tip it?”

How are weather control and climate control related?
Weather and climate are intimately related: Weather is the state of the atmosphere at a given place and time, while climate is the aggregate of weather conditions over time. A vast body of scientific literature addresses these interactions. In addition, historians are revisiting the ancient but elusive term klima, seeking to recover its multiple social connotations. Weather, climate and the climate of opinion matter in complex ways that invite—some might say require or demand—the attention of both scientists and historians.

Yet some may wonder how weather and climate are interrelated rather than distinct. Both, for example, are at the center of the debate over greenhouse warming and hurricane intensity. A few may claim that rainmaking, for example, has nothing to do with climate engineering, but any intervention in the Earth’s radiation or heat budget (such as managing solar radiation) would affect the general circulation and thus the location of upper-level patterns, including the jet stream and storm tracks. Thus, the weather itself would be changed by such manipulation. Conversely, intervening in severe storms by changing their intensity or their tracks or modifying weather on a scale as large as a region, a continent or the Pacific Basin would obviously affect cloudiness, temperature and precipitation patterns with major consequences for monsoonal flows, and ultimately the general circulation. If repeated systematically, such interventions would influence the overall heat budget and the climate.

Both weather and climate control have long and checkered histories: My book explains [meteorologist] James Espy’s proposal in the 1830s to set fire to the crest of the Appalachian Mountains every Sunday evening to generate heated updrafts that would stimulate rain and clear the air for cities of the east coast. It also examines efforts to fire cannons at the clouds in the arid Southwest in the hope of generating rain by concussion.

In the 1920s airplanes loaded with electrified sand were piloted by military aviators who “attacked” the clouds in futile attempts to both make rain and clear fog. Many others have proposed either a world weather control agency or creating a global thermostat, either by burning vast quantities of fossil fuels if an ice age threatened or sucking the CO2 out of the air if the world overheated.

After 1945 three technologies—nuclear weapons, digital computers and satellites—dominated discussions about ultimate weather and climate control, but with very little acknowledgement that unintended consequences and social disruption may be more damaging than any presumed benefit.

What would be the ideal role for geoengineering in addressing climate change?
That it generates interest in and awareness of the impossibility of heavy-handed intervention in the climate system, since there could be no predictable outcome of such intervention, physically, politically or socially.

Why do scientists continue to pursue this then, after 200 or so years of failure?
Science fantasy is informed by science fiction and driven by hubris. One of the dictionary definitions of hubris cites Edward Teller (the godfather of modern geoengineering).

Teller’s hubris knew no bounds. He was the [self-proclaimed] father of the H-bomb and promoted all things atomic, even talking about using nuclear weapons to create canals and harbors. He was also an advocate of urban sprawl to survive nuclear attack, the Star Wars [missile] defense system, and a planetary sunscreen to reduce global warming. He wanted to control nature and improve it using technology.

Throughout history rainmakers and climate engineers have typically fallen into two categories: commercial charlatans using technical language and proprietary techniques to cash in on a gullible public, and sincere but deluded scientific practitioners exhibiting a modicum of chemical and physical knowledge, a bare minimum of atmospheric insight, and an abundance of hubris. We should base our decision-making not on what we think we can do “now” and in the near future. Rather, our knowledge is shaped by what we have and have not done in the past. Such are the grounds for making informed decisions and avoiding the pitfalls of rushing forward, claiming we know how to “fix the sky.”

>What we have and haven’t learned from ‘Climategate’

>
DON’T KNOW MUCH AGNOTOLOGY

Grist.org
BY David Roberts
28 FEB 2011 1:29 PM

I wrote about the “Climategate” controversy (over emails stolen from the University of East Anglia’s Climatic Research Unit) once, which is about what it warranted.

My silent protest had no effect whatsoever, of course, and the story followed a depressingly familiar trajectory: hyped relentlessly by right-wing media, bullied into the mainstream press as he-said she-said, and later, long after the damage was done, revealed as utterly bereft of substance. It’s a familiar script for climate faux controversies, though this one played out on a slightly grander scale.

Investigations galore

Consider that there have now been five, count ‘em five, inquiries into the matter. Penn State established an independent inquiry into the accusations against scientist Michael Mann and found “no credible evidence” [PDF] of improper research conduct. A British government investigation run by the House of Commons’ Science and Technology Committee found that while the CRU scientists could have been more transparent and responsive to freedom-of-information requests, there was no evidence of scientific misconduct. The U.K.’s Royal Society (its equivalent of the National Academies) ran an investigation that found “no evidence of any deliberate scientific malpractice.” The University of East Anglia appointed respected civil servant Sir Muir Russell to run an exhaustive, six-month independent inquiry; he concluded that “the honesty and rigour of CRU as scientists are not in doubt … We have not found any evidence of behaviour that might undermine the conclusions of the IPCC assessments.”

All those results are suggestive, but let’s face it, they’re mostly … British. Sen. James Inhofe (R-Okla.) wanted an American investigation of all the American scientists involved in these purported dirty deeds. So he asked the Department of Commerce’s inspector general to get to the bottom of it. On Feb. 18, the results of that investigation were released. “In our review of the CRU emails,” the IG’s office said in its letter to Inhofe [PDF], “we did not find any evidence that NOAA inappropriately manipulated data … or failed to adhere to appropriate peer review procedures.” (Oddly, you’ll find no mention of this central result in Inhofe’s tortured public response.)

Whatever legitimate issues there may be about the responsiveness or transparency of this particular group of scientists, there was nothing in this controversy — nothing — that cast even the slightest doubt on the basic findings of climate science. Yet it became a kind of stain on the public image of climate scientists. How did that happen?

Smooth criminals

You don’t hear about it much in the news coverage, but recall, the story began with a crime. Hackers broke into the East Anglia email system and stole emails and documents, an illegal invasion of privacy. Yet according to The Wall Street Journal’s Kim Strassel, the emails “found their way to the internet.” In ABC science correspondent Ned Potter’s telling, the emails “became public.” The New York Times’ Andy Revkin says they were “extracted from computers.”

None of those phrasings are wrong, per se, but all pass rather lightly over the fact that some actual person or persons put them on the internet, made them public, extracted them from the computers. Someone hacked in, collected emails, sifted through and selected those that could be most damning, organized them, and timed the release for maximum impact, just before the Copenhagen climate talks. Said person or persons remain uncaught, uncharged, and unprosecuted. There have since been attempted break-ins at other climate research institutions.

If step one was crime, step two was character assassination. When the emails were released, they were combed over by skeptic blogs and right-wing media, who collected sentences, phrases, even individual terms that, when stripped of all context, create the worst possible impression. Altogether the whole thing was as carefully staged as any modern-day political attack ad.

Yet when the “scandal” broke, rather than being about criminal theft and character assassination, it was instantly “Climategate.” It was instantly about climate scientists, not the illegal and dishonest tactics of their attackers. The scientists, not the ideologues and ratf*ckers, had to defend themselves.

Burden of proof

It’s a numbingly familiar pattern in media coverage. The conservative movement that’s been attacking climate science for 20 years has a storied history of demonstrable fabrications, distortions, personal attacks, and nothingburger faux-scandals — not only on climate science, but going back to asbestos, ozone, leaded gasoline, tobacco, you name it. They don’t follow the rigorous standards of professional science; they follow no intellectual or ethical standards whatsoever. Yet no matter how long their record of viciousness and farce, every time the skeptic blogosphere coughs up a new “ZOMG!” it’s as though we start from zero again, like no one has a memory longer than five minutes.

Here’s the basic question: At this point, given their respective accomplishments and standards, wouldn’t it make sense to give scientists the strong benefit of the doubt when they are attacked by ideologues with a history of dishonesty and error? Shouldn’t the threshold for what counts as a “scandal” have been nudged a bit higher?

Agnotological inquiry

The lesson we’ve learned from climategate is simple. It’s the same lesson taught by death panels, socialist government takeover, Sharia law, and Obama’s birth certificate. To understand it we must turn to agnotology, the study of culturally induced ignorance or doubt. (Hat tip to an excellent recent post on this by John Quiggin.)

Beck, Palin, and the rest of Fox News and talk radio operate on the pretense that they are giving consumers access to a hidden “universe of reality,” to use Limbaugh’s term. It’s a reality being actively obscured by the “lamestream media,” academics, scientists, and government officials. Affirming the tenets of that secret reality has become an act of tribal reinforcement, the equivalent of a secret handshake.

The modern right has created a closed epistemic loop containing millions of people. Within that loop, the implausibility or extremity of a claim itself counts as evidence. The more liberal elites reject it, the more it entrenches itself. Standards of evidence have nothing to do with it.

The notion that there is a global conspiracy by professional scientists to falsify results in order to get more research money is, to borrow Quiggin’s words about birtherism, “a shibboleth, that is, an affirmation that marks the speaker as a member of their community or tribe.” Once you have accepted that shibboleth, anything offered to you as evidence of its truth, no matter how ludicrous, will serve as affirmation. (Even a few context-free lines cherry-picked from thousands of private emails.)

Living with the loop

There’s one thing we haven’t learned from climategate (or death panels or birtherism). U.S. politics now contains a large, well-funded, tightly networked, and highly amplified tribe that defines itself through rejection of “lamestream” truth claims and standards of evidence. How should our political culture relate to that tribe?

We haven’t figured it out. Politicians and the political press have tried to accommodate the shibboleths of the right as legitimate positions for debate. The press in particular has practically sworn off plain judgments of accuracy or fact. But all that’s done is confuse and mislead the broader public, while the tribe pushes ever further into extremity. The tribe does not want to be accommodated. It is fueled by elite rejection.

At this point mainstream institutions like the press are in a bind: either accept the tribe’s assertions as legitimate or be deemed “biased.” Until there is a way out of that trap, there will be more and more Climategates.

>Fact-Free Science (N.Y. Times)

THE WAY WE LIVE NOW

By JUDITH WARNER
Published: February 25, 2011

Photo: Camille Seaman.

President Obama has made scientific innovation the cornerstone of his plans for “winning the future,” requesting in his recent budget proposal large financing increases for scientific research and education and, in particular, sustained attention to developing alternative energy sources and technologies. “This is our generation’s Sputnik moment,” he declared in his State of the Union address last month.

It would be easier to believe in this great moment of scientific reawakening, of course, if more than half of the Republicans in the House and three-quarters of Republican senators did not now say that the threat of global warming, as a man-made and highly threatening phenomenon, is at best an exaggeration and at worst an utter “hoax,” as James Inhofe of Oklahoma, the ranking Republican on the Senate Environment and Public Works Committee, once put it. These grim numbers, compiled by the Center for American Progress, describe a troubling new reality: the rise of the Tea Party and its anti-intellectual, anti-establishment, anti-elite worldview has brought both a mainstreaming and a radicalization of antiscientific thought.

The politicization of science isn’t particularly new; the Bush administration was famous for pressuring government agencies to bring their vision of reality in line with White House imperatives. In response to this, and with a renewed culture war over the very nature of scientific reality clearly brewing, the Obama administration tried to initiate a pre-emptive strike earlier this winter, issuing a set of “scientific integrity” guidelines aimed at keeping the work of government scientists free from ideological pollution. But since taking over the House of Representatives, the Republicans have packed science-related committees with lawmakers who reject such basic findings as the reality of global warming and the threats of climate change. Fred Upton, the head of the House Energy and Commerce Committee, has said outright that he does not believe that global warming is man-made. John Shimkus of Illinois, who also sits on the committee — as well as on the Subcommittee on Energy and Environment — has said that the government doesn’t need to make a priority of regulating greenhouse-gas emissions, because as he put it late last year, “God said the earth would not be destroyed by a flood.”

Source: Gallup

Whoever emerges as the Republican presidential candidate in 2012 will very likely have to embrace climate-change denial. Mitt Romney, Tim Pawlenty and Mike Huckabee, all of whom once expressed some support for action on global warming, have notably distanced themselves from these views. Saying no to mainstream climate science, notes Daniel J. Weiss, a senior fellow and director of climate strategy for the Center for American Progress, is now a required practice for Republicans eager to play to an emboldened conservative base. “Opposing the belief that global warming is human-caused has become systematic, like opposition to abortion,” he says. “It’s seen as another way for government to control people’s lives. It’s become a cultural issue.”

That taking on the scientific establishment has become a favored activity of the right is quite a turnabout. After all, questioning accepted fact, revealing the myths and politics behind established certainties, is a tactic straight out of the left-wing playbook. In the 1960s and 1970s, the pushback against scientific authority brought us the patients’ rights movement and was a key component of women’s rights activism. That questioning of authority veered in a more radical direction in the academy in the late 1980s and early 1990s, when left-wing scholars doing “science studies” increasingly began taking on the very idea of scientific truth.

This was the era of the culture wars, the years when the conservative University of Chicago philosopher Allan Bloom warned in his book “The Closing of the American Mind” of the dangers of liberal know-nothing relativism. But somehow, in the passage from Bush I to Bush II and beyond, the politics changed. By the mid-1990s, even some progressives said that the assault on truth, particularly scientific truth, had gone too far, a point made most famously in 1996 by the progressive New York University physicist Alan Sokal, who managed to trick the left-wing academic journal Social Text into printing a tongue-in-cheek article, written in an overblown parody of dense academic jargon, that argued that physical reality, as we know it, may not exist.


Illustration: Nomoco

Following the Sokal hoax, many on the academic left experienced some real embarrassment. But the genie was out of the bottle. And as the political zeitgeist shifted, attacking science became a sport of the radical right. “Some standard left arguments, combined with the left-populist distrust of ‘experts’ and ‘professionals’ and assorted high-and-mighty muckety-mucks who think they’re the boss of us, were fashioned by the right into a powerful device for delegitimating scientific research,” Michael Bérubé, a literature professor at Pennsylvania State University, said of this evolution recently in the journal Democracy. He quoted the disillusioned French theorist Bruno Latour, a pioneer of science studies who was horrified by the climate-change-denying machinations of the right: “Entire Ph.D. programs are still running to make sure that good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth . . . while dangerous extremists are using the very same argument of social construction to destroy hard-won evidence that could save our lives.”

Some conservatives argue that the Republican war on science is bad politics and that catering to the “climate-denier sect” in the party is a dangerous strategy, as David Jenkins, a member of Republicans for Environmental Protection, wrote recently on the FrumForum blog. Public opinion, after all, has not kept pace with Republican rhetoric on the topic of climate change. A USA Today/Gallup poll conducted in January found that 83 percent of Americans want Congress to pass legislation promoting alternative energy, and a recent poll by the Opinion Research Corporation found that almost two-thirds want the Environmental Protection Agency to be more aggressive.

For those who have staked out extreme positions, backtracking may not be easy: “It is very difficult to get a man to understand something when his tribal identity depends on his not understanding it,” Bérubé notes. Maybe it’s time for some new identity politics.

Judith Warner is the author, most recently, of “We’ve Got Issues: Children and Parents in the Age of Medication.”

>The Drama of Climate Change (More Intelligent Life)

>

Climate science is a tricky subject for the stage, as two new plays in London make plain. Robert Butler puts his finger on the problem …

Special to MORE INTELLIGENT LIFE (Winter 2010)

In the last fortnight two plays about climate change have opened in London that have provoked polar reactions: “Greenland” (pictured top), at the National Theatre, got panned, and “The Heretic” (pictured below), at the Royal Court, got raves. If you go and see both, you could come away fairly confused about climate change.

Sea levels in the Maldives are rising in “Greenland”; sea levels in the Maldives are not rising in “The Heretic”. The Hockey Stick Graph, which links the rise in global temperature to human activity, has been broadly accepted by scientists in “Greenland”; the Hockey Stick Graph is an embarrassment to scientists in “The Heretic”. The UN climate change negotiations in Copenhagen might be the last chance for mankind to save itself in “Greenland”; these negotiations don’t get a mention in “The Heretic”. A life-size polar bear comes out in “Greenland”, a cuddly toy polar bear appears in “The Heretic”: one tries to create a sense of wonder, the other is a joke.

Highly divisive issues have generated some important plays: think of McCarthyism and Arthur Miller’s “The Crucible”, or AIDS and Tony Kushner’s “Angels in America”, or political correctness and David Mamet’s “Oleanna”. Yet 20 years after the first report from the Intergovernmental Panel on Climate Change, and five years after “An Inconvenient Truth”, no major playwright has written a play about the subject. Many reasons have been suggested for this. The science is complex. The links between cause and effect (on which plays depend) are hard to show. Arts organisations get sponsorship from Big Oil. And everyone has made up their mind anyway.

The National Theatre tried to get round this last point by commissioning four youngish playwrights, none of whom (as they cheerfully admitted in a pre-show discussion) had any special knowledge in this area. As the playwrights put it, they were “on a journey”. Since the National has taken active and informed steps to reduce its own carbon emissions, it sounds as if there were administrators in the building who initially knew more about climate change than the writers.

The title had to be chosen before anything was written. Many of the scenes would revolve around Copenhagen, but that name was out as the National had already staged a superb play about science called “Copenhagen”. In six months the writers industriously interviewed everyone from the British Government’s chief scientific adviser and Greenpeace activists to prominent sceptics and the chairman of Shell. They went on to develop multiple storylines to reflect this complexity. The idea—according to the play’s dramaturg Ben Power—was “to find a new way of talking about this subject”.

The production also comes accompanied by a series of talks (from scientists, sceptics and activists) and panel discussions. There is even something called a ‘talkaoke’, a roundtable discussion that takes place in the foyer immediately after the show, where—amazingly for theatre—people can take the microphone and criticise what they’ve just seen. What’s the result of all this careful, reasonable, fact-checked, inclusive sincerity? “Greenland” was slammed: “crushingly dull” (London Standard), “shamelessly partisan” (Daily Telegraph), “rotten theatre” (Sunday Times).

“The Heretic” took another route. The playwright, Richard Bean, a former stand-up comic, has a sharp eye for modern pieties. His recent play “England People Very Nice” tackled the subject of immigration. Bean particularly admires Joe Orton, a playwright from the 1960s, as someone who would “go around, find the open wound and pour salt in it.” In “The Heretic”, he delivers a comedy that fictionalises (and skews) many of the current controversies and “-gates” with plenty of verve and attack.

Bean’s play takes an earth scientist at York University (played with crisp disdain by Juliet Stevenson) whose research into sea-level rises isn’t going to help her faculty’s chances of getting a major grant. Her appearance on the BBC’s “Newsnight” leads to her sacking, and she ends up having a regular column in the Daily Telegraph. (This time round the Telegraph’s theatre critic could see nothing “shamelessly partisan” about the play, and said it was “an absolute corker”.)

It’s probably a mistake to worry about the ways in which the science is misrepresented when the rest of the plot is not very plausible either. For the first-night audience, neither failing seemed to matter because “The Heretic” has two much more important things going for it: likeable characters and very funny jokes. It even has a happy ending. The Royal Court, hotbed of radical left-wing plays for so many decades, has produced the most right-wing play in London. It’d be interesting to know how much this cheers its staff.

Only one play so far, Steve Waters’s “The Contingency Plan”, has managed to be authoritative and funny on the subject of climate change. (That play was staged at a tiny theatre, The Bush, and richly deserves to be revived.) This last fortnight has now produced one painfully authoritative play, which tells us the great majority of climate scientists are right, and also one painfully funny play, which tells us the great majority of climate scientists are wrong. The reason why one’s a flop and the other is a hit says more about theatre than it does about climate change.

Jim Thompson, author of “The Grifters”, once wrote there are 32 ways to write a story (and he had used every one of them), but there is only one plot: “Things are not what they seem.” The problem with climate change is that the scientific consensus is a bit of a bore. It just doesn’t catch our imagination. As an audience, we are naturally drawn to deception and mystery, the half-hidden and the shadows. The theatre may be the one place where we hope our trust in authority figures will prove to be misplaced.

The authors of “Greenland” might have had more fun if they had concentrated on the smooth and powerful authority figures who deny the science, rather than the earnest folks who fret over it. That’s where the action is. As David Mamet told an interviewer, “Drama is basically about lies, somebody lying to somebody.” It’s the impulse audiences had in Ancient Greece, when characters were portrayed with masks. Whichever way you tell the story, we want to see the mask slip.

“The Heretic” is at the Royal Court Theatre through March 19th; “Greenland” is at the National Theatre through April 2nd

Robert Butler, a former theatre critic, blogs on the arts and the environment at the Ashden Directory, which he edits. His last article was about the lasting power of “Heart of Darkness”

>‘Rapid Response Team’ Pairs Scientists and Media (The Yale Forum on Climate Change & The Media)

>
By Lisa Palmer | February 16, 2011
The Yale Forum on Climate Change & The Media


Think of it as the climate scientists/journalists version of “eHarmony.” A volunteer website launched by scientists serves as a matchmaking venue for media outlets and government officials looking for input on climate science topics.

It’s a Friday morning and Scott Mandia is scanning the Climate Science Rapid Response Team e-mail inbox he shares with two other climate science match-makers.

Today, on Mandia’s watch, a message from a journalist arrives at 5:30 a.m. It’s the first of two or three media requests he’ll likely get this day. Mandia’s task now? Request a response from whichever of the 135 scientists in his network is best qualified to answer the question.

Mandia, a professor of physical sciences at Suffolk County Community College, in New York, and his fellow Rapid Response founders, John Abraham, associate professor of thermodynamics at the University of St. Thomas, and Ray Weymann, a California-based retired astronomer and member of the National Academy of Sciences, take shifts. Each is a volunteer custodian of e-mail requests that flow in from their climate change match-making website connecting climate scientists with lawmakers and media outlets.

Launched in November 2010, the website tries to narrow the information gap between scientific understanding of climate change and what the public knows. Scientists involved with the group are screened and selected on an invitation-only basis. The experts come from a range of climate change science specialties, everything from climate modeling researchers and ecologists to economists and policy experts. Most are university faculty members or employees of government laboratories. It’s not a collection that most climate “contrarians” might be comfortable with.

The all-volunteer group promises to respond quickly to media requests to make sure science is portrayed accurately in the day’s news. They say turnaround time for requests is as fast as two hours for media operating on a short deadline.

“The scientists became members of our group because they understand that, as scientists, they have a responsibility to engage the public by engaging the media,” Mandia said in a phone interview. Mandia said he and his colleagues operate the service with no funding, and the website design was donated by Richard Hawkins, director of the Public Interest Research Centre in the United Kingdom.

Early On, a Confusing Mix-Up with the AGU Media Project

Coincidentally, the Climate Science Rapid Response Team website debuted at the same time as the relaunch of the American Geophysical Union’s Climate Q and A service, which has similarities with the Rapid Response Team but strictly limits questions to matters of science. (See Yale Forum related story.) Some confusion ensued when the Los Angeles Times erroneously reported a link between the AGU’s group and the Rapid Response volunteers, and AGU staff quickly initiated a damage-control effort, fearing that some on Capitol Hill would, based on the newspaper’s coverage, see their effort as overly politicized.

“When that (Los Angeles Times) story came out, it sounded like scientists were fighting back against politicians. We are not advocates about policy, but it made us look like we were the 98 pound weaklings getting sand kicked in their face,” said Mandia. But the bad press proved a boon to increase the numbers involved in the Rapid Response force.

“Scientists then realized they were being criticized unfairly and wanted to get involved,” said Mandia. The number of scientists involved with the Rapid Response Team quadrupled.

The AGU’s Q and A Service first formed to support media requests during the United Nations Climate Change Conference in Copenhagen in December 2009. It started again prior to the U.N. talks in Cancun. The Q and A service is open to anyone with a Ph.D. willing to provide scientific expertise on a subject.

“AGU is not a partisan organization. We are here to make our science available so there is good information available to the media,” AGU Executive Director Chris McEntee said in a telephone interview.

About 700 scientists are registered with AGU’s service, which has provided answers to 68 media outlets. “We think it is important that policymakers, media, and the public get unbiased, nonpartisan information when making a decision,” said McEntee. “The service fits with our mission to promote scientific discovery for the benefit of humanity.”

Scientists Step Up

Mandia said scientists involved with his effort are usually tapped once or twice a month for media inquiries. No single person carries the burden of too many repeat requests because the group has selected a range of scientists, vetted for their expertise in various disciplines. The Rapid Response Team also has promised confidentiality of its scientists, who can remain anonymous if they wish. But Mandia said that, despite the offer, “none of them has ever requested anonymity.”

Andrew Dessler, an atmospheric scientist at Texas A & M University, is affiliated with both information services, but is more involved with the Climate Science Rapid Response Team. He was prompted into action because “dealing with climate change misinformation is difficult to do on your own,” Dessler wrote in an e-mail. “Effectively responding to the denial machine absolutely requires coordinated action by the climate science community. In this way, I think the CCRRT [sic] is a model of how scientists can effectively spend their limited resources on outreach.”

Dessler gives the Rapid Response service high marks, especially for institutionalizing the response process from scientists and distributing the communications workload. “You have to realize the asymmetry here. For [some] so-called skeptics, spreading misinformation is their full-time job. Scientists, on the other hand, already have a full-time job: research and teaching. Thus, we need to have mechanisms to level the playing field, and the CCRRT [sic] is one such mechanism,” said Dessler, adding that he encourages scientists to get involved in public outreach. “Because we are mainly funded by tax dollars, I think we have a responsibility to repay this by spreading the results of our research as far and wide as possible.”

A Goal of Precise Pairing

As of early February, more than 100 media organizations — newspaper, magazine, online media, television, and radio — and government officials have used the service to find climate scientists who could comment on a story. Mainstream media users have included The New York Times, The Guardian (UK), CNN International, and American Public Media’s “Marketplace,” among many others. Mandia said many of the media questions in December had to do with severe weather in the United States and in Northern Europe.

The Rapid Response website includes testimonials from such reporters as Ben Webster, of The Times in London: “I asked a difficult question about ice cores and was impressed by the efforts the team made to find the right people to respond. The response was balanced, stating clearly what was known but also the uncertainties.”

Eli Kintisch, a reporter for Science and author of Hack the Planet (Wiley, 2010), called on the service when he was looking for a scientist to serve as a color commentator on a live blog he was producing for Science during a House hearing. Facing time constraints, Kintisch relied on the matchmakers for the legwork of finding someone to fill this role.

“I have my own batch of sources on climate that I have used to comment on stories, and I have used ProfNet in the past occasionally. But I was looking for someone who had some experience with public engagement and would be available for two to four hours,” Kintisch said in a telephone interview. “The hearing was a review of the basics of climate science, and there were some prominent contrarians testifying, so I thought it would be useful to have someone available who knew the basics of climate science.”

While not all climate scientists feel comfortable engaging with the media, they are finding ways to get more involved in communications. Mandia said, “Some scientists are nervous about speaking to the press and worry they will be misquoted, but getting out of the ‘Ivory Tower’ is becoming very important.”

Lisa Palmer is a Maryland-based freelance writer and a regular contributor to The Yale Forum. (E-mail: lisa@yaleclimatemediaforum.org)

>The Role of Trust in Climate Change Adaptation and Resilience: Can ICTs help?

>
February 27, 2011
By Angelica Valeria Ospina
From http://niccd.wordpress.com.

Amidst the magnitude and uncertainty that characterize the climate change field, trust is a topic that is often overlooked, despite being one of the cornerstones of resilience building and adaptive capacity.

Trust is an essential element of effective communication, networking and self-organisation, and is thus indispensable in efforts to withstand and recover from the effects of climate change, be they acute shocks or slow-onset trends. It is an equally important basis for vulnerable communities to adapt, and potentially change, in the face of the largely unknown impacts of climatic events.

Bound up with the beliefs, expectations and perceptions that people hold about one another and about the institutions within which they operate or interact, trust often acts as an underlying driver of action or inaction, making it an important factor in decision-making processes.

With the rapid diffusion of Information and Communication Technologies (ICTs) such as mobile phones and the Internet, the unprecedented speed at which information is produced and shared poses a new set of possibilities, and challenges, for communication management and trust building, both essential to the development of resilience and adaptation to a changing climate.

Adaptation experience suggests that vulnerable communities are more likely to act on information they can ‘trust’, a complex judgment shaped by factors such as the source of the information (and how it is perceived locally), the language used to convey the message, the role and credibility of ‘infomediaries’ or local facilitators who help disseminate the information, and the use of local appropriation mechanisms and community involvement, among others.

Climate change adaptation strategies and National Programmes of Action are increasingly called upon to foster trust-building processes by engaging local actors and gaining a better understanding of local needs and priorities. Trust building in the climate change field thus involves finding new collaborative spaces where the interests of all stakeholders can be heard, and where both scientific and traditional knowledge can be shared and built upon towards more effective adaptive practices and, potentially, transformation.

The widespread diffusion of ICTs, such as mobile phones, Internet access and even community radio, in developing countries could open up new opportunities to use these tools in support of trust-building processes, a necessary step towards change and transformation.

So, how can ICTs help to build trust within climate change resilience and adaptation processes?

Research at the intersection of ICTs, climate change and development suggests several ways in which ICT tools can support trust:

  • Multi-level Communication: ICTs can facilitate communication and trust-building between and across actors at the micro (e.g. community members), meso (e.g. NGOs) and macro levels (e.g. policy makers), fostering participation in the design of adaptation -and mitigation- strategies, as well as accountability and monitoring during their implementation.
  • Network Strengthening: The role of social networks is key within processes of adaptation to climate change and resilience building, and trust is at the core of how networks function. The use of ICTs such as mobile phones can help to enhance communication and the bonds of trust within and among networks, which can in turn strengthen community networks’ support and access to resources.
  • Self-organisation: The ability to self-organize is a key attribute of resilient systems, and involves processes of collaboration that require trust among stakeholders and institutions. By facilitating access to information and resources through both point-to-multipoint and point-to-point exchange, ICTs can be important contributors to self-organisation and to the coordination of both preventive and reactive joint efforts in the face of climatic events. They can help climate change actors to verify or double-check facts if the information source is not entirely trusted, diversifying their potential responses to the occurrence of climatic events. Additionally, ICTs can play a role in trust building by enabling the assessment of options and trade-offs involved in decision-making.
  • Appropriation and Infomediaries: The role of actors that ‘translate’ or ‘mediate’ the technical and scientific information to suit the needs of the local context, is vital for the appropriation of information. Tools such as the Internet, GIS or mobile phones can support and strengthen the role of agricultural extension workers, deepening the relationships of trust that they have established with local producers affected by climate change manifestations by offering them a broader set of options and information, for example, on crop diversification or plague management, including more immediate response to their queries.
  • Transparency and Fluency: Online platforms that give citizens new channels to voice their views and concerns, and that allow interaction with decision makers, illustrate ICTs’ potential for transparency and information fluency, an important factor in local perceptions of, expectations of, and trust in local, regional and national institutions.

While at the onset of extreme events we are quick to recognize the importance of communication, we often fail to acknowledge the pivotal role of trust in adaptation and resilience, as well as the potential of innovative tools such as ICTs to help foster trust and strengthen networks and collaboration.

But discussing the potential of ICTs for trust building in adaptive processes is only half the task; the risks associated with their use deserve equal attention.

Ensuring the quality, accuracy and relevance of the information is key to avoiding maladaptive practices and poor decision-making, which could deepen existing vulnerabilities and inequalities. Issues of power and differential access to information also need to be addressed when considering the potential of these tools for trust building, network strengthening and participatory processes, including those related to climate change.

Ultimately, ICTs could play an important supporting role in building and strengthening trust within vulnerable communities affected by climate change impacts, as well as in National Adaptation Plans and Programmes of Action seeking to build long-term climate change resilience on a multi-stakeholder, participatory base.

>Yale Project on Knowledge of Climate Change Across Global Warming’s Six Americas

>
From Anthony Leiserowitz, Yale Project on Climate Change Communication

“Today we are pleased to announce the release of a new report entitled ‘Knowledge of Climate Change Across Global Warming’s Six Americas.’ The report, available here, draws from a national study we conducted last year on what Americans understand about how the climate system works and about the causes, impacts, and potential solutions to global warming.

Overall, we found that knowledge about climate change varies widely across the Six Americas – 49 percent of the Alarmed received a passing grade (A, B, or C), compared to 33 percent of the Concerned, 16 percent of the Cautious, 17 percent of the Doubtful, 4 percent of the Dismissive, and 5 percent of the Disengaged. In general, the Alarmed and the Concerned better understand how the climate system works and the causes, consequences, and solutions to climate change than the Disengaged, the Doubtful and the Dismissive. For example:

· 87% of the Alarmed and 76% of the Concerned understand that global warming is caused mostly by human activities compared to 37% of the Disengaged, 6% of the Doubtful and 3% of the Dismissive;
· 86% of the Alarmed and 71% of the Concerned understand that emissions from cars and trucks contribute substantially to global warming compared to 18% of the Disengaged, 16% of the Doubtful and 10% of the Dismissive;
· 89% of the Alarmed and 64% of the Concerned understand that a transition to renewable energy sources is an important solution compared to 12% of the Disengaged, 13% of the Doubtful and 7% of the Dismissive.

However, this study also found that occasionally the Doubtful and Dismissive have as good or a better understanding than the Alarmed or Concerned. For example:

· 79% of the Dismissive and 74% of the Doubtful correctly understand that the greenhouse effect refers to gases in the atmosphere that trap heat, compared to 66% of the Alarmed and 64% of the Concerned;
· The Dismissive are less likely to incorrectly say that “the greenhouse effect” refers to the Earth’s protective ozone layer than all other groups, including the Alarmed (13% vs. 24% respectively);
· 50% of the Dismissive and 57% of the Doubtful understand that carbon dioxide traps heat from the Earth’s surface, compared to 59% of the Alarmed, and 45% of the Concerned.

This study also identified numerous gaps between expert and public knowledge about climate change. For example, only:

· 13% of the Alarmed know how much carbon dioxide there is in the atmosphere today (approximately 390 parts per million) compared to 5% of the Concerned, 9% of the Cautious, 4% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 52% of the Alarmed have heard of coral bleaching, vs. 24% of the Concerned, 23% of the Cautious, 5% of the Disengaged, 21% of the Doubtful and 24% of the Dismissive;
· 46% of the Alarmed have heard of ocean acidification, vs. 22% of the Concerned, 25% of the Cautious, 6% of the Disengaged, 23% of the Doubtful and 16% of the Dismissive.

This study also found important misconceptions leading many to misunderstand the causes and therefore the solutions to climate change. For example, many Americans confuse climate change and the hole in the ozone layer. Such misconceptions were particularly apparent for the Alarmed and Concerned segments:

· 63% of the Alarmed and 49% of the Concerned believe that the hole in the ozone layer is a significant contributor to global warming compared to 32% of the Cautious, 12% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 49% of the Alarmed and 36% of the Concerned believe that aerosol spray cans are a significant contributor to global warming compared to 20% of the Cautious, 9% of the Disengaged, 7% of the Doubtful and 5% of the Dismissive;
· 39% of the Alarmed and 23% of the Concerned believe that banning aerosol spray cans would reduce global warming compared to 13% of the Cautious, 3% of the Disengaged, 4% of the Doubtful and 1% of the Dismissive.

Concerned, Cautious and Disengaged Americans also recognize their own limited understanding of the issue. Fewer than 1 in 10 say they are “very well informed” about climate change, and 75 percent or more say they would like to know more. The Alarmed also say they need more information (76%), while the Dismissive say they do not need any more information about global warming (73%).

Overall, these and other results within this report demonstrate that most Americans both need and desire more information about climate change. While information alone is not sufficient to engage the public in the issue, it is often a necessary precursor of effective action.”

>Catastrophe in Rio’s Mountain Region Is Already the Country’s Worst Climate Disaster (Estadão)

>
[Perhaps the worst landslide event, but nowhere near the worst climate disaster. The drought of 1877-1879 killed some 500 THOUSAND people in the Northeast, according to most authors.]



The dead number 785, the same toll as Rio’s 1967 flood. In a UN ranking, it is also the world’s 8th-largest landslide disaster.

January 22, 2011 | 12:00 a.m.
Bruno Tavares – O Estado de S.Paulo
A tragédia da região serrana do Rio se igualou ontem ao maior desastre climático da história do País. Até as 22 horas de ontem, as autoridades contabilizavam 785 mortos, o mesmo número de vítimas da enchente do Rio em 1967, segundo ranking da ONU. O número tende a aumentar, pois o Ministério Público fluminense estima que ainda existam 400 desaparecidos nos seis municípios devastados pelas chuvas do dia 12.
Marcos de Paula/AE
Fotos de desaparecidos em Teresópolis
O desastre também entra para os registros da ONU como o 8.º pior deslizamento da história mundial. O maior evento dessa natureza, segundo o Centro para a Pesquisa da Epidemiologia de Desastres, ocorreu em 1949, na antiga União Soviética, com 12 mil mortes. O segundo maior foi no Peru, em dezembro de 1941, e deixou 5 mil vítimas.
O deslizamento da região serrana já havia superado o número de vítimas registrado em 1967, em Caraguatatuba, quando 436 pessoas morreram. Por suas características devastadoras, o evento ocorrido há mais de quatro décadas na Serra do Mar paulista era considerado emblemático pelos geólogos.
Apesar da grande quantidade de água que desceu dos morros fluminenses e de vários rios terem transbordado, especialistas brasileiros e da própria ONU classificam o evento como deslizamento de terra. Na avaliação dos estudiosos, grande parte da destruição e das mortes foi causada pelas avalanches de terra e detritos – tecnicamente chamadas de corrida de lama.
O fenômeno é raro, pois depende de uma conjunção de fatores para ocorrer. No caso da região serrana do Rio, todos eles estavam presentes. Os morros são íngremes, o que favorece os escorregamentos de terra. Além disso, é preciso um grande volume de chuva concentrado em um curto espaço de tempo. Foi o que aconteceu ali. Segundo dados do Instituto Estadual do Ambiente (Inea), as estações climáticas localizadas no núcleo da tempestade registraram 249 e 297 milímetros de chuva em 24 horas – a partir das 20 horas do dia 11. Na avaliação da presidente do Inea, Marilene Ramos, um temporal dessa intensidade tem probabilidade de acontecer a cada 350 anos.
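A afirmação de um temporal "a cada 350 anos" pode ser traduzida em probabilidade. Um esboço mínimo, meramente ilustrativo (o período de retorno é o citado pelo Inea; a conversão assume, como simplificação, eventos independentes de ano para ano):

```python
# Período de retorno -> probabilidade de ocorrência.
# Premissa simplificadora: eventos independentes de ano para ano.
return_period_years = 350           # estimativa citada pelo Inea
p_annual = 1 / return_period_years  # probabilidade em um ano qualquer

# Probabilidade de ao menos um evento dessa magnitude em 30 anos:
p_30_years = 1 - (1 - p_annual) ** 30

print(f"probabilidade anual: {p_annual:.4f}")
print(f"em 30 anos: {p_30_years:.3f}")
```

Ou seja, mesmo um evento "de 350 anos" tem cerca de 8% de chance de ocorrer ao longo de uma geração.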
Enterro. Por questão “não só humanitária, mas também de saúde pública”, o juiz da 2.ª Vara de Família de Teresópolis, José Ricardo Ferreira de Aguiar, determinou o enterro dos corpos de 25 vítimas das chuvas que estavam em um caminhão e trailers frigoríficos. No Cemitério Carlinda Berlim, o principal dos cinco da cidade, foram 232 enterros desde a semana passada. Pelo Instituto Médico-Legal, até ontem já tinham passado 312 cadáveres. O juiz crê que existam “no mínimo quatro vezes mais soterrados” do que os encontrados.
A maioria dos corpos enterrados ontem – 22 adultos e três crianças – teve a identificação levantada pela equipe de papiloscopistas do IML, da Força Nacional e do Instituto Félix Pacheco. Mas, como os corpos não foram reclamados por parentes, o enterro foi determinado pelo juiz. No caso dos corpos sepultados sem identificação, houve coleta de DNA. Assim, será possível confrontar dados dos parentes que buscarem informações.
A partir de agora, segundo decisão do juiz, os corpos não reconhecidos serão liberados após coleta de material biológico. “Em duas horas o corpo sairá dignamente para ser sepultado.”
No caso dos desaparecidos, o Ministério Público afirma que informações registradas por parentes e amigos têm sido confrontadas com dados de hospitais e do IML. Ontem, fotos foram colocadas na frente de um centro de informações em Teresópolis. / COLABOROU MARCELO AULER

>In Denial – Climate on the Couch (BBC)

>
Thu 10 Feb 2011
BBC Radio 4

http://www.bbc.co.uk/programmes/b00y92mn

Something strange is happening to the climate – the climate of opinion. On the one hand, scientists are forecasting terrible changes to the planet, and to us. On the other, most of us don’t seem that bothered, even though the government keeps telling us we ought to be. Even climate scientists and environmental campaigners find it hard to stop themselves taking holidays in long haul destinations.

So why the gap between what the science says, and what we feel and do? In this programme Jolyon Jenkins investigates the psychology of climate change. Have environmentalists and the government been putting out messages that are actually counterproductive? Might trying to scare people into action actually be causing them to consume more? Are images of polar bears actually damaging to the environmentalists’ case because they alienate people who don’t think of themselves as environmentalists – and make climate change seem like a problem that’s a long way off and doesn’t have much relevance to normal life? Does the message that there are “simple and painless” steps we can take to reduce our carbon footprint (like unplugging your phone charger) unintentionally cause people to think that the problem can’t be that serious if the answers are so trivial?

Jolyon talks to people who are trying to move beyond the counterproductive messages. On the one hand there are projects like Natural Change, run by WWF Scotland, which try to reconnect people with nature using the therapeutic techniques of “ecopsychology” – intense workshops that take place in the wilderness of the west of Scotland, and which seem to convert the uncommitted into serious greens. On the other, there are schemes that try to take the issue out of the green ghetto and engage normal people with climate change. Jolyon visits a project in Stirling which has set itself the ambitious challenge of talking face to face with 35,000 people, through existing social groups like rugby clubs, knitting circles and art groups. It wants to sign up these groups to carbon cutting plans, and make carbon reduction a social norm rather than something that only eco-warriors bother with.

And he attends a “swishing party” in London, which tries to replicate the buzz women get from clothes shopping, but in a carbon neutral way. Can the green movement find substitutes for consumerism that are as fun and status-rich, that will deliver carbon reduction but without making people feel they have signed up to a life of grim austerity? And even if the British and Europeans shift their attitudes, can the Americans ever be reconciled to the climate change message? Producer Jolyon Jenkins.

>Can We Trust Climate Models? Increasingly, the Answer is ‘Yes’

>

18 JAN 2011: ANALYSIS

Yale Environment 360

Forecasting what the Earth’s climate might look like a century from now has long presented a huge challenge to climate scientists. But better understanding of the climate system, improved observations of the current climate, and rapidly improving computing power are slowly leading to more reliable methods.

By Michael D. Lemonick

A chart appears on page 45 of the 2007 Synthesis Report of the Intergovernmental Panel on Climate Change (IPCC), laying out projections for what global temperature and sea level should look like by the end of this century. Both are projected to rise, which will come as no surprise to anyone who’s been paying even the slightest attention to the headlines over the past decade or so. In both cases, however, the projections span a wide range of possibilities. The temperature, for example, is likely to rise anywhere from 1.8 C to 6.4 C (3.2 F to 11.5 F), while sea level could increase by as little as 7 inches or by as much as 23 — or anywhere in between.

It all sounds appallingly vague, and the fact that it’s all based on computer models probably doesn’t reassure the general public all that much. For many people, “model” is just another way of saying “not the real world.” In fairness, the wide range of possibilities in part reflects uncertainty about human behavior: The chart lays out different possible scenarios based on how much CO2 and other greenhouse gases humans might emit over the coming century. Whether the world adopts strict emissions controls or decides to ignore the climate problem entirely will make a huge difference to how much warming is likely to happen.

But even when you factor out the vagaries of politics and economics, and assume future emissions are known perfectly, the projections from climate models still cover a range of temperatures, sea levels, and other manifestations of climate change. And while there’s just one climate, there’s more than one way to simulate it. The IPCC’s numbers come from averaging nearly two dozen individual models produced by institutions including the National Center for Atmospheric Research (NCAR), the Geophysical Fluid Dynamics Laboratory (GFDL), the U.K.’s Met Office, and more. All of these models have features in common, but they’re constructed differently — and all of them leave some potentially important climate processes out entirely. So the question remains: How much can we really trust climate models to tell us about the future?

The answer, says Keith Dixon, a modeler at GFDL, is that it all depends on the questions you’re asking. “If you want to know ‘is climate change something that should be on my radar screen?’” he says, “then you end up with some very solid results. The climate is warming, and we can say why. Looking to the 21st century, all reasonable projections of what humans will be doing suggest that not only will the climate continue to warm, you have a good chance of it accelerating. Those are global-scale issues, and they’re very solid.”

The reason they’re solid is that, right from the emergence of the first crude versions back in the 1960s, models have been at their heart a series of equations that describe airflow, radiation and energy balance as the Sun

The problem is that warming causes changes that act to accelerate or slow the warming.

warms the Earth and the Earth sends some of that warmth back out into space. “It literally comes down to mathematics,” says Peter Gleckler, a research scientist with the Program for Climate Model Diagnosis and Intercomparison at Livermore National Laboratory, and the basic equations are identical from one model to another. “Global climate models,” he says, echoing Dixon, “are designed to deal with large-scale flow of the atmosphere, and they do very well with that.”

The problem is that warming causes all sorts of changes — in the amount of ice in the Arctic, in the kind of vegetation on land, in ocean currents, in permafrost and cloud cover and more — that in turn can either cause more warming, or cool things off. To model the climate accurately, you have to account for all of these factors. Unfortunately, says James Hurrell, who led the NCAR’s most recent effort to upgrade its own climate model, you can’t. “Sometimes you don’t include processes simply because you don’t understand them well enough,” he says. “Sometimes it’s because they haven’t even been discovered yet.”

A good example of the former, says Dixon, is the global carbon cycle — the complex interchange of carbon between oceans, atmosphere, and biosphere. Since atmospheric carbon dioxide is driving climate change, it’s obviously important, but until about 15 years ago, it was too poorly understood to be included in the models. “Now,” says Dixon, “we’re including it — we’re simulating life, not just physics.” Equations representing ocean dynamics and sea ice also have been added to climate models as scientists have understood these crucial processes better.

Other important phenomena, such as changes in clouds, are still too complex to model accurately. “We can’t simulate individual cumulus clouds,” says Dixon, because they’re much smaller than the 200-kilometer grid boxes that make up climate models’ representation of the world. The same applies to aerosols — tiny particles, including natural dust and manmade soot — that float around in the atmosphere and can cool or warm the planet, depending on their size and composition.

But there’s no one right way to model these small-scale phenomena. “We don’t have the observations and don’t have the theory,” says Gleckler. The best they can do on this point is to simulate the net effect of all the clouds or aerosols in a grid box, a process known as “parameterization.” Different modeling centers go about it in different ways, which, unsurprisingly, leads to varying results. “It’s not a science for which everything is known, by definition,” says Gleckler. “Many groups around the world are pursuing their own research pathways to develop improved models.” If the past is any guide, modelers will be able to abandon parameterizations one by one, replacing them with mathematical representations of real physical processes.

Sometimes, modelers don’t understand a process well enough to include it at all, even if they know it could be important. One example is a caveat that appears on that 2007 IPCC chart. The projected range of sea-level rise, it warns, explicitly excludes “future rapid dynamical changes in ice flow.” In other words, if land-based ice in Greenland and Antarctica starts moving more quickly toward the sea than it has in the past — something glaciologists knew was possible, but hadn’t yet been documented — these estimates would be incorrect. And sure enough, satellites have now detected such movements. “The last generation of NCAR models,” says Hurrell, “had no ice sheet dynamics at all. The model we just released last summer does, but the representation is relatively crude. In a year or two, we’ll have a more sophisticated update.”

Sophistication only counts, however, if the models end up doing a reasonable job of representing the real world. It’s not especially useful to wait until 2100 to find out, so modelers do the next best thing: They perform “hindcasts,” which are the inverse of forecasts. “We start the models from the middle of the 1800s,” says Dixon, “and let them run through the present.” If a model reproduces the overall characteristics of the real-world climate record reasonably well, that’s a good sign.

What the models don’t try to do is to match the timing of short-term climate variations we’ve experienced. A model might produce a Dust Bowl like that of the 1930s, but in the model it might happen in the 1950s. It should produce the ups and downs of El Niño and La Niña currents in the Pacific with about the right frequency and intensity, but not necessarily at the same times as they happen in the real Pacific. Models should show slowdowns and accelerations in the overall warming trend, the result of natural fluctuations, at about the rate they happen in the real climate. But they won’t necessarily show the specific flattening of global warming we’ve observed during the past decade — a temporary slowdown that had skeptics declaring the end of climate change.
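The “statistical resemblance” standard described here can be illustrated with a toy calculation (entirely synthetic numbers, not output from any real model): two records built from the same underlying trend and noise disagree year by year, yet their long-term statistics, which is what a hindcast is judged on, match closely.

```python
import random
import statistics

random.seed(0)

def synthetic_record(n_years, trend_per_year):
    # Toy annual temperatures: a slow warming trend plus "weather" noise.
    return [trend_per_year * year + random.gauss(0, 0.1)
            for year in range(n_years)]

def climatology(record):
    # Hindcasts are scored on statistics like these,
    # not on matching individual years.
    half = len(record) // 2
    return {
        "mean": statistics.mean(record),
        "variability": statistics.stdev(record),
        "warming": statistics.mean(record[half:]) - statistics.mean(record[:half]),
    }

observed = synthetic_record(150, 0.008)  # stand-in for the real record since ~1860
hindcast = synthetic_record(150, 0.008)  # model run: same statistics, different noise

# The two records disagree year by year...
worst_year_gap = max(abs(o - h) for o, h in zip(observed, hindcast))
# ...but their climatologies are nearly identical.
print(climatology(observed))
print(climatology(hindcast))
```

The model run never lines up with the observations in any particular year, but its mean, variability and overall warming come out almost the same, which is exactly the sense in which a hindcast can succeed.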

It’s also important to realize that climate represents what modelers call a boundary condition. Blizzards in the Sahara are outside the boundaries of our current climate, and so are stands of palm trees in Greenland next year. But within those boundaries, things can bounce around a great deal from year to year or decade to decade. What modelers aim to produce is a virtual climate that resembles the real one in a statistical sense, with El Niños, say, appearing about as often as they do in reality, or hundred-year storms coming once every hundred years or so.

This is one essential difference between weather forecasting and climate projection. Both use computer models, and in some cases, even the very same models. But weather forecasts start out with the observed state of the atmosphere and oceans at this very moment, then project it forward. It’s not useful for our day-to-day lives to know that September has this average high or that average low; we want to know what the actual temperature will be tomorrow, and the day after, and next week. Because the atmosphere is chaotic, anything less than perfect knowledge of today’s conditions (which is impossible, given that observations are always imperfect) will make the forecast useless after about two weeks.

Since climate projections go out not days or weeks, but decades, modelers don’t even try to make specific forecasts. Instead, they look for changes in averages — in boundary conditions. They want to know if Septembers in 2050 will be generally warmer than Septembers in 2010, or whether extreme weather events — droughts, torrential rains, floods — will become more or less frequent. Indeed, that’s the definition of climate: the average conditions in a particular place.

“Because models are put together by different scientists using different codes, each one has its strengths and weaknesses,” says Dixon. “Sometimes one [modeling] group ends up with too much or too little sea ice but does very well with El Niño and precipitation in the continental U.S., for example,” while another nails the ice but falls down on sea-level rise. When you average many models together, however, the errors tend to cancel.
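Dixon’s point about errors tending to cancel can be sketched with made-up numbers (a toy ensemble, not real model output): give each of two dozen “models” its own systematic bias around a known truth, and the ensemble mean lands closer to that truth than a typical individual model does.

```python
import random
import statistics

random.seed(42)

TRUE_WARMING = 2.0  # hypothetical "truth" the toy models try to project, in deg C

# Two dozen toy models, each with its own systematic bias
# (some run warm, some run cool) plus a little internal noise.
biases = [random.gauss(0, 0.5) for _ in range(24)]
projections = [TRUE_WARMING + b + random.gauss(0, 0.1) for b in biases]

typical_single_error = statistics.mean(abs(p - TRUE_WARMING) for p in projections)
ensemble_mean_error = abs(statistics.mean(projections) - TRUE_WARMING)

print(f"typical single-model error: {typical_single_error:.2f} C")
print(f"ensemble-mean error:        {ensemble_mean_error:.2f} C")
```

Because the biases are independent and scatter on both sides of the truth, averaging shrinks them; this is the statistical intuition behind multi-model means, though it only holds to the extent that real models’ errors are genuinely independent.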

Even when models reproduce the past reasonably well, however, it doesn’t guarantee that they’re equally reliable at projecting the future. That’s in part because some changes in climate are non-linear, which is to say that a small nudge can produce an unexpectedly large result. Again, ice sheets are a good example: If you look at melting alone, it’s pretty straightforward to calculate how much extra water will enter the sea for every degree of temperature rise. But because meltwater can percolate down to lubricate the undersides of glaciers, and because warmer oceans can lift the ends of glaciers up off the sea floor and remove a natural brake, the ice itself can end up getting dumped into the sea, unmelted. A relatively small temperature rise can thus lead to an unexpectedly large increase in sea level. That particular non-linearity was already suspected, if not fully understood, but there could be others lurking in the climate system.

Beyond that, says Dixon, if three-fourths of the models project that the Sahel (the area just south of the Sahara) will get wetter, for example, and a fourth says it will dry out, “there’s a tendency to go with the majority. But we can’t rule out without a whole lot of investigation whether the minority is doing something right. Maybe they have a better representation of rainfall patterns.” Even so, he says, if you have the vast majority coming up with similar results, and you go back to the underlying theory, and it makes physical sense, that tends to give you more confidence they’re right. The best confidence-builder of all, of course, is when a trend projected by models shows up in observations — warmer springs and earlier snowmelt in the Western U.S., for example, which not only makes physical sense in a warming world, but which is clearly happening.


And the models are constantly being improved. Climate scientists are already using modified versions to try to predict the actual timing of El Niños and La Niñas over the next few years. They’re just beginning to wrestle with periods of 10, 20 and even 30 years in the future, the so-called decadal time span where both changing boundary conditions and natural variations within the boundaries have an influence on climate. “We’ve had a modest amount of skill with El Niños,” says Hurrell, “where 15-20 years ago we weren’t so skillful. That’s where we are with decadal predictions right now. It’s going to improve significantly.”

After two decades of evaluating climate models, Gleckler doesn’t want to downplay the shortcomings that remain in existing models. “But we have better observations as of late,” he says, “more people starting to focus on these things, and better funding. I think we have better prospects for making some real progress from now on.” 


>O povo ribeirinho do São Francisco traduz as lutas populares do Brasil (IHU Online)

>

 Instituto Humanitas Unisinos – IHU Online – 2/2/2011
“Existe um sertão com bastante água. A questão é que esta água é colocada majoritariamente a serviço dos interesses do capital e suas oligarquias, a água é apropriada privadamente”, aponta o fotógrafo.


Um fotógrafo operário. Assim se define João Zinclar, que já foi metalúrgico e hoje vive da fotografia. O gaúcho, radicado em Campinas-SP, durante seis anos percorreu as margens do rio São Francisco e registrou a vida do rio e de quem depende dele para viver. Assim nasceu o livro O Rio São Francisco e as Águas no Sertão (Campinas: sem editora, 2010). Em entrevista à IHU On-Line, realizada por e-mail, Zinclar conta como foi esse processo de captação das imagens e de convivência com o povo da região. “Percorremos a extensão do rio, que é de 2.700 quilômetros, várias vezes, perfazendo mais de 15 mil quilômetros nesses seis anos”, descreve.

Nesse tempo, Zinclar acompanhou todo o processo de transposição do rio São Francisco, desde as discussões sobre o projeto até o início das obras. “A natureza vem sendo constantemente privatizada, transformada em mercadoria. Esse processo não é novo, faz parte da natureza do capitalismo em todos os tempos. Hoje, o controle sobre a água indica um novo patamar dessa disputa. A transposição é parte dessa apropriação privada das riquezas comuns. A água é um bem comum, não há vida sem água e hoje uma parte considerável da humanidade não tem acesso a este recurso”, afirmou. Ao longo da entrevista é possível ver algumas das imagens que Zinclar publicou em seu livro.

Confira a entrevista.

IHU On-Line – Como foi a viagem e a produção das imagens para conhecer as águas do sertão brasileiro?

João Zinclar – O trabalho que resultou no livro O Rio São Francisco e as Águas no Sertão, lançado em novembro de 2010 em Campinas-SP, já dura seis anos, desde janeiro de 2005 até os dias de hoje. É motivado pela questão política envolvendo a grande polêmica e os conflitos acerca da equivocada proposta do governo federal de efetivar as obras da transposição das águas do rio São Francisco para o chamado nordeste setentrional.

Por entender a questão da água como valor de luta estratégica para os trabalhadores e o povo (a guerra pela água em Cochabamba, na Bolívia, no início do século é um exemplo disso), considerei que a fotografia poderia contribuir nessa polêmica sobre o futuro das águas do velho Chico: ajudar na divulgação e documentação das lutas populares de resistência ao projeto de transposição e mostrar a grave situação de degradação na vida do rio, para uma compreensão melhor do principal caso real, de relevância nacional, sobre conflitos em torno da defesa, do uso e do controle de águas no Brasil.

Percorremos a extensão do rio, que é de 2.700 quilômetros, várias vezes, perfazendo mais de 15 mil quilômetros nesses seis anos. Nesse tempo, convivi com comunidades tradicionais, quilombolas, indígenas, ribeirinhos, sem terra, pescadores, trabalhadores rurais. Assim, fotografei e documentei suas lutas para defender o rio do veneno capitalista que contamina e usurpa suas águas, com suas mineradoras, barragens e monoculturas agroexportadoras que devastam criminosamente biomas importantes para a formação do São Francisco, como o cerrado e a caatinga.

Além das margens do rio, também percorremos várias regiões por onde estão sendo construídos e por onde passam os canais da transposição. Estivemos no Ceará, no Rio Grande do Norte e na Paraíba, onde procuramos mostrar e abordar o sertão de outra forma, um sertão em tom azul, azul de água, com uma quantidade enorme de água estocada em grandes, médios e pequenos açudes espalhados pelo sertão, construídos ao longo do último século em nome do “combate à seca”. Aí foi possível revelar uma das principais críticas ao projeto de transposição: a de que a obra vai chover no molhado, vai levar água para onde já tem água; água essa que, se distribuída para o povo, seria suficiente para abastecer todos os usos, desfazendo o mito da falta dela no sertão.

Quero destacar que esse processo todo só foi possível com o importante apoio nas mais variadas formas de pessoas amigas, dos movimentos sociais, sindicatos, pastorais sociais, de profissionais jornalistas, tanto de Campinas-SP, como do povo da beira do rio e no sertão. Foi a solidariedade desse povo que me ajudou a compreender realidades distantes de nosso dia a dia e também entender melhor a luta de classes no Brasil. Antes de virar livro, essas fotos percorreram várias cidades da beira do rio, de outros estados e países e serviram para ilustrar reportagens e debates sobre o rio São Francisco.

IHU On-Line – Você é um operário fotógrafo. Que diferenças o seu olhar de “operário” ressalta sobre o povo e a vida do rio São Francisco?

João Zinclar – A fotografia é paixão antiga. Hoje consigo sobreviver dela, como free-lancer, a serviço da luta operária e popular, mas minha profissão primeira é operário metalúrgico. Trabalhei no chão de fábrica durante muitos anos, fui dirigente sindical da categoria, onde forjei minha consciência de classe e visão socialista de mundo.

Não existe neutralidade jornalística nessa história. Portanto, a visão que conduz o livro é a de uma postura classista e anticapitalista, e a de que a luta do povo ribeirinho em defesa de sua sobrevivência, de seu trabalho e da qualidade da água de seu rio contribui, à sua maneira, para o conflito mais geral contra o capital, no campo e na cidade, resistindo à nova fase do avanço predatório do capitalismo no campo brasileiro.

A diversidade das lutas dos povos que habitam o velho Chico, com indígenas e quilombolas enfrentando o poder econômico em disputas para retomar terras, pescadores na defesa da pesca artesanal, sem terra em luta pela reforma agrária e outras manifestações, deveria ter a devida atenção dos trabalhadores urbanos e suas organizações políticas.

IHU On-Line – O Rio São Francisco passa por um momento de conflito em função das obras da transposição. O que você viu sobre as obras? O que você ouviu do povo sobre isso?

João Zinclar – Entre 2005 e 2008, vários movimentos sociais e pessoas se colocaram contrários ao projeto. Pessoas se mobilizaram com as greves de fome de Dom Luiz Cappio, com as ocupações de barragens e dos canteiros das obras da transposição. Vários protestos foram realizados, bem como denúncias de arbitrariedades aos direitos humanos e alertas sobre os impactos ambientais para iniciar a obra. O governo triturou tudo isso e as obras iniciaram e, hoje, estão em andamento.

Algumas lutas recentes (como greves de trabalhadores das empreiteiras por melhorias salariais e reclamações contra as péssimas condições de trabalho) produziram uma redução nos ritmos das obras. Agora, as questões se colocam de outra maneira e nem por isso são menos importantes. A desinformação sobre o projeto e os impactos negativos sobre a vida das comunidades atingidas pela transposição na região receptora das águas do velho Chico é a regra. A máquina propagandística do governo é poderosa e isso tem enorme potencial desmobilizador. Os atingidos pelas obras da transposição têm tido grandes dificuldades em se articular. Reclamam dos valores recebidos e das compensações materiais pagas pelo governo, pois anos de trabalho não se contabilizam facilmente. Muitos deixam sua história de vida e seu trabalho em troca de valores irrisórios em sua tentativa de recomeçar tudo de novo em outras localidades, atingidas pela crescente valorização das terras em torno dos canais da transposição.

Há uma insatisfação grande com o enfraquecimento das economias locais e a destruição das bases de vida de pequenos agricultores. A oferta de emprego não cumpre o prometido: os empregos são temporários e poucos. As obras afetam os povos originários, que têm na terra um referencial cultural, de vida com outros valores, não apenas econômicos: a construção do eixo norte devasta terras dos Truká em Cabrobó-PE e território dos Anacé no Ceará, e o eixo leste ameaça território sagrado dos Pipipã em Pernambuco.

Além disso, o debate em torno da revitalização do rio continua atual, uma vez que as iniciativas do governo pouco realizaram nesse aspecto, mantendo o mesmo padrão de degradação do rio, que afeta duramente a qualidade de vida dos ribeirinhos. A controvérsia e a oposição ao projeto de transposição continuam; essa é uma questão mal resolvida, que terá desdobramentos futuros, sustentada na insatisfação popular, quando perceberem a contradição: além de serem a água mais cara do Brasil, as águas da transposição não são para servir ao povo do sertão, como diz o discurso do governo.

IHU On-Line – Qual a importância de registrar o São Francisco e as águas do sertão nesse momento atual em que vivemos?

João Zinclar – Sempre que pensamos no sertão nordestino vem à nossa cabeça a imagem dramática da seca, da caatinga retorcida, de vida difícil, quase inviável. A imagem cunhada por Euclides da Cunha de que “o sertanejo é, antes de tudo, um forte” ilustra essa ideia. Só um forte é capaz de conviver com isso. No entanto, olhando com outra abordagem também real, existe um sertão com bastante água. A questão é que esta água é colocada majoritariamente a serviço dos interesses do capital e suas oligarquias; a água é apropriada privadamente. O atual modelo de desenvolvimento na região, com o agronegócio à frente, se apropria das riquezas naturais de forma radical, na medida em que novas frentes de negócios vão se abrindo. Essa é a questão central, em minha opinião.

A natureza vem sendo constantemente privatizada, transformada em mercadoria. Esse processo não é novo, faz parte da natureza do capitalismo em todos os tempos. Hoje o controle sobre a água indica um novo patamar dessa disputa. A transposição é parte dessa apropriação privada das riquezas comuns. A água é um bem comum, não há vida sem água e hoje uma parte considerável da humanidade não tem acesso a este recurso. Eu posso escolher se compro um jeans novo ou não, um livro ou um celular, mas não posso optar por não consumir água. A transposição do São Francisco, a nova polêmica em torno da construção da Usina de Belo Monte, assim como a ocupação das margens dos rios e encostas, a proteção das nascentes são temas políticos, não apenas técnico e ambiental. A luta social dos povos atingidos por esse “desenvolvimento” precisa se articular num horizonte político mais amplo, capaz de resgatar o caráter de classe desse debate. Porque são as populações pobres e os trabalhadores que mais sofrem com os efeitos desse processo.

IHU On-Line – O que suas imagens revelam sobre a Alma do Velho Chico?

João Zinclar – As imagens captadas revelam a diversidade de um povo. Expressão de um Brasil contraditório e de luta. O povo ribeirinho traduz as lutas populares no Brasil. Muitas vezes desarticuladas, essas ações estão repletas de vida e inovação. Lutas que incorporam tradições seculares, povos indígenas, a religiosidade, a luta contra a opressão num momento em que elas assumem a vanguarda numa luta pela preservação dos bens comuns, não em oposição ao desenvolvimento, mas propondo pensar as questões: Qual desenvolvimento? E pra quem? Busquei captar essa relação entre um projeto “moderno” que se apropria dos bens coletivos em nome de um único desenvolvimento possível e um mundo que se constrói, pensando na preservação dos valores coletivos sem abrir mão de avançar por melhores condições de vida.

IHU On-Line – O que o São Francisco representa para o povo que vive em seu entorno?

João Zinclar – Representa a vida em todos os sentidos, sem chavão, o São Francisco é a sobrevivência de homens e mulheres que dependem de suas águas, contam com a potencialidade de sua biodiversidade para desenvolverem sua forma de economia independente, sua cultura de vida, amparados na pesca e na agricultura de vazante e familiar.

>Baixo retorno político (Fapesp)

>Especiais

2/2/2011

Por Fábio de Castro

Mais educação não se traduz automaticamente em mais democracia, segundo estudo realizado na USP. Entre 1989 e 2006, diminuiu a diferença entre a participação política dos mais e menos escolarizados (ABr)

Agência FAPESP – Common sense holds that education and political engagement go hand in hand. For the Brazilian elite, according to opinion polls, raising the population's schooling levels produces citizens who participate more in the country's political life and place greater value on democracy. But a new study shows that this view does not match reality.

The doctoral research of Rogério Schlegel, defended at the Department of Political Science of the Faculty of Philosophy, Letters and Human Sciences (FFLCH) of the University of São Paulo (USP), used statistical analysis to interpret data from opinion surveys conducted between 1989 and 2006. The study concluded that Brazilian education is yielding diminishing political returns.

"The study showed that more educated citizens no longer become as participative and democratic as they did two decades ago. Higher schooling still sets citizens apart, but that difference has shrunk considerably over 20 years; that is, the political returns on education have been diminishing in Brazil. On some measures of participation and support for democracy, the difference between the more and the less educated has disappeared altogether," Schlegel told Agência FAPESP.

Schlegel's research was supervised by Professor José Álvaro Moisés of FFLCH-USP and is part of the Thematic Project "Citizens' Distrust of Democratic Institutions," coordinated by Moisés and funded by FAPESP.

According to Schlegel, an opinion survey coordinated in 2000 by Professor Elisa Reis of the Department of Sociology of the Federal University of Rio de Janeiro (UFRJ) already showed that, in the view of the Brazilian elite, low schooling is the greatest obstacle to democracy in the country.

"Behind this idea is the assumption that education affects political behavior only through cognitive capacity building; that is, simply provide more education and people will have more resources to follow politics, debating, reading newspapers, and making demands. In reality, there are alternative paths into political life. The study's results indicate that there is no linear relationship between gaining more access to education and gaining more tools to participate in democracy," he said.

Several mechanisms that could explain the diminishing political returns of schooling were analyzed. The most plausible hypothesis is that the phenomenon was caused by a decline in the quality of Brazilian education.

"By failing to build individuals' cognitive capacities and to transmit knowledge, the Brazilian educational system would be failing to provide the tools that help citizens act in the political sphere. The result is that increased access to schooling, or more schooling overall, whether measured in time spent at school or years of study completed, is not accompanied by the expected gains in political behavior," he said.

The study was based on four opinion surveys conducted by the group linked to the Thematic Project, the first carried out shortly after redemocratization, in 1989, and the most recent, funded by FAPESP, in 2006.

From these data, Schlegel used statistical analysis to control for the various sociodemographic variables available and to isolate the effect of schooling on citizens' behavior over time.

"In economic sociology it is common to use, for example, the concept of the 'economic return on education' to assess the extent to which more schooling translates into higher income or higher tax revenue. Building on that concept, the study works with the idea of the 'political return on education,'" he explained.

Political indifference

Political returns were assessed through several measures, such as participation, support for democratic principles, and trust in institutions. The results showed that the gap between the more and the less educated narrowed markedly with respect to expressed interest in politics, consumption of political news, and the habit of discussing the subject.

"On some measures, schooling level makes practically no difference. In the case of participation in parties, unions, and neighborhood associations, for example, involvement is equally low among the less and the more educated," Schlegel said.

The greatest loss of political return over the 17 years analyzed occurred at the secondary-school level, the schooling bracket that saw the largest enrollment expansion over the past two decades.

"On average, someone who finished secondary school today is indistinguishable from a citizen with incomplete primary education when it comes to preferring democracy as a form of government or rejecting the concentration of power in the hands of a centralizing leader," he said.

In 1993, according to the study, the odds of a university graduate being very interested in politics were 3.6 times those of someone with incomplete primary education. By 2006, that ratio had fallen to 1.6. "The gap between a university graduate and someone with no school diploma was enormous in terms of interest in politics. The difference still exists, but it is much smaller," Schlegel noted.

In 1989, a person who had completed secondary school was 66% more likely to prefer democracy over any other regime than someone without a primary-school diploma. "By 2006, there was no longer any statistical difference between the two groups. On this measure, a gap that once separated the different schooling levels in terms of political behavior had disappeared," he said.
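The comparisons the study reports are odds ratios, a standard way of expressing how much more likely one group is to exhibit a behavior than another. As a rough illustration (the contingency counts below are invented for the example, not the study's actual data), an odds ratio can be computed from a 2x2 table like this:

```python
def odds_ratio(grad_yes, grad_no, base_yes, base_no):
    """Odds ratio from a 2x2 contingency table.

    'grad' = the more educated group, 'base' = the less educated group;
    'yes' = shows the behavior (e.g. strong interest in politics).
    """
    odds_grad = grad_yes / grad_no
    odds_base = base_yes / base_no
    return odds_grad / odds_base

# Hypothetical counts, for illustration only: of 100 university graduates,
# 47 are very interested in politics; of 100 respondents with incomplete
# primary education, 20 are. The resulting ratio is close to the 3.6
# reported for 1993.
print(round(odds_ratio(47, 53, 20, 80), 2))  # prints 3.55
```

An odds ratio of 1.0 would mean schooling makes no difference at all, which is the direction the study found the Brazilian numbers moving.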

Trust in institutions bears a special relationship to schooling. In 1993, the more educated trusted parties more than they did in 2006, but the data do not allow a conclusion as to whether trust in fact rose or fell overall.

"That was a moment when the parties were being rebuilt and there was a widespread sense that they were the path to building democracy. By 2006, that notion had been undone by parties-for-hire, which may have triggered greater distrust among the more educated," he explained.

For Schlegel, by showing that schooling has been yielding diminishing political returns, the study's results advise against betting on education as a panacea capable of promoting superior citizenship and overcoming Brazil's democratic deficits.

"Education matters, but on its own it does not solve the problem. For the beneficial effects of schooling on democratic life to be fully realized, quality teaching for all is needed," he said.

>A sky of probabilities (O Povo)

>
The people of Ceará carry a cultural memory, often unconscious, of the water scarcity that afflicted families in the past. In an interview with Vida&Arte Cultura, UFRJ professor Renzo Taddei discusses how climate is understood and how that understanding shapes the citizen

05.02.2011| 17:00

In Quixadá, the rain prophets make their forecasts every year (DÁRIO GABRIEL, 9/1/2010)

Facing the cycles of nature, prophets foretell the future. Scientists publish the mathematical and physical measurements that set the probability of clouds or sunshine. Mediated by the press, weather forecasts affect citizens and the way they perceive the climate and the city. An adjunct professor at the School of Communication of the Federal University of Rio de Janeiro (UFRJ), Renzo Taddei has an intimate relationship with Ceará's semi-arid region.

A researcher for nearly a decade into popular practices of climate prediction, especially the work of the rain prophets of Quixadá, Renzo comes to Fortaleza from time to time to give talks and take part in meetings and research on the subject. Last Wednesday, the professor received our reporter in one of Funceme's rooms. In the interview below, the researcher draws attention to global alarms such as climate warming. "It has become clear that the alarmist tone of imminent catastrophe rarely produces any positive effect. What it produces is a general sense of powerlessness, as if there were nothing one could do." (Elisa Parente)

O POVO – How can we understand climate beyond its atmospheric effects?
Renzo Taddei – The climate question in Ceará is very interesting because last year, according to Seplag (the State Secretariat of Planning and Management), Ceará's economy grew 8%, which is an incredible number. In 2010, we had the worst rainfall year of the last 30 years. It was the worst drought, and yet a year of strong growth for the state. Clearly, agriculture contributes little to Ceará's economy, and people feel less and less terribly vulnerable to the climate, for several reasons, including the social programs of the Lula administration and the infrastructure improvements of Ceará's recent governments. In other words, life has become a little easier.

OP – How does this affect the organization of the city?
Renzo – There was a statistic showing that more than half the population over 40 or 50 had been born outside the capital, so the presence of a rural imaginary is very strong. That is one element of the psychological weight of drought. Even people who have nothing to do with agriculture rejoice at the sight of rain. The saying "bonito pra chover" ("looking like rain") carries both continuity and rupture, especially among the younger generations born in Fortaleza. I interviewed an agronomist who told me he had the habit of turning off the water while soaping up in the shower, and his son asked him why he turned the water off if the shower wasn't over. He realized that the younger generation has no memory at all of their own state's water scarcity. The last great crisis in Fortaleza was in 1993, when the Canal do Trabalhador was built. Fortaleza is the only capital embedded in the semi-arid region, yet people have no awareness of that. Walk around Fortaleza and you will see that every real-estate launch has to have a water park, not even just a pool: the architectural and recreational use of water. The only comparison is Las Vegas, another place of irresponsible abundance planted in a desert. So people have no experience of water shortage, but they have a cultural heritage that hypervalues it. Fortaleza is proud of its prosperity and spends like the nouveau riche. It is a matter of culture. Meanwhile, the Canal da Integração is bringing in a world of water, and the São Francisco River transposition will bring even more, so that Fortaleza will have no problem for the next 30 years.

OP – But that can also have the opposite effect.
Renzo – It is dangerous because, in the long run, you cannot assume that this level of consumption is sustainable. That is the great debate around the transposition, the construction of the Castanhão reservoir, and the Canal da Integração: the infrastructure that stores water keeps growing instead of the population being educated to consume less. Not that both could be done at the same time, but the fact is that recent governments have preferred to increase the water supply.

OP – One line of your research focuses on the anthropology of uncertainty and of the future. How is that connected to the study of climate?
Renzo – That is perhaps the most challenging part of understanding the role climate plays in our everyday lives, because meteorology quickly realized that the atmosphere is something very complex. The rainy season in Ceará is more complicated still, because the farmer wants to know today whether it will rain in May. No radar can show that, but meteorology uses physics and mathematics to build computer models that simulate how nature works. Those models have limitations. That is why forecasts are expressed in terms of probability: sometimes a small thing can change everything. So meteorology has to live with this complicated relationship with society.
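The probabilistic forecasts Taddei describes typically come from running a model many times under slightly varied conditions and reporting the fraction of runs in which an event occurs. A minimal sketch of that idea, with an invented toy "model" (the numbers and the random draw below are illustrative placeholders, not real meteorology):

```python
import random

def toy_rainfall_run(seed):
    """One simulated seasonal rainfall total (mm). Purely illustrative:
    a real model integrates atmospheric physics, it does not draw a number."""
    rng = random.Random(seed)
    return max(0.0, rng.gauss(600, 150))  # mean and spread are invented

def rain_probability(n_runs, threshold_mm):
    """Fraction of ensemble members exceeding the threshold: the
    'probability of a good rainy season' reported to the public."""
    exceed = sum(1 for s in range(n_runs) if toy_rainfall_run(s) > threshold_mm)
    return exceed / n_runs

print(rain_probability(1000, 500))  # a value between 0 and 1, not a yes/no
```

The output is exactly the kind of number the public struggles with: not "it will rain" or "it will not," but a fraction of possible futures.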

OP – Why is it so hard to deal with uncertainty?
Renzo – Countless studies show that we have tremendous difficulty retaining it. Probabilistic information demands a great deal of cognitive effort; it really is complex. It is as if we were mentally programmed not to operate with probability and to pretend that everything is either right or wrong. We have this tendency to polarize things, and it influences society's relationship with the climate. Look at farmers' strategy, for example. The family farmer pays attention to climate forecasts all the time but uses none of them. He waits for the soil to become moist at a certain depth before planting the seed. But the first rains of the season are weak, the sprout dies, and he has to start all over again.

OP – Uncertainty is part of nature.
Renzo – I have been following the rain prophets since 2002. It is very common for some of them to call themselves observers of nature rather than prophets. Being a prophet carries a heavy religious symbolic charge, a great weight for them to bear. So they make their forecasts and, in the end, say that only God really knows. The interesting thing is that, in terms of content, they say exactly what Funceme says in terms of probability. There is uncertainty involved. So people accept it from them, yet deny science the right to live with its own uncertainties.

OP – How does meteorology figure in this story?
Renzo – Part of this confusion has to do with the history of meteorology in Ceará. It began with a lot of bravado: flying planes and spraying clouds with silver salts to make it rain sooner. But you see that the spray does not produce rain; it only hastens it. So the technology was always very controversial. To the mindset of the sertão, it amounted to saying that the city man believed he had the power to make rain. So much so that Patativa do Assaré wrote the poem "Ao dotô do avião," which brings in several important elements: man adapts to nature's cycle, not the other way around. In Ceará, the climate has always been tied to religion, so it was disrespectful and absurd to think that a man could produce rain.

OP – Have people absorbed the gravity of global warming?
Renzo – I don't know whether we will ever understand global warming. Nature works in cycles: day and night, the seasons of the year. Because those cycles are short, we understand them well. But there are cycles that are very long. It is often said that here in the semi-arid region there are cycles in which two or three decades are drier, followed by others that are rainier. There may be a much longer cycle that we have no idea about. If the future proves us wrong, fine: we did what had to be done.

OP – So there is a positive outlook for the future?
Renzo – Science is made of uncertainties; it only moves forward because it teaches what it does not know. But there is what is called the precautionary principle, which says you have to weigh how much you lose if you are right and do nothing against how much you lose if you are wrong and do a great deal. So imagine there is no global warming at all, yet we take all the necessary measures. What do we lose? There is a loss in terms of economic growth. Now take the other option: there is global warming, it is happening, it is tied to industrial production, but we assume it is not happening and do nothing. What do we lose in the future? Many people say that doing nothing could carry a very high cost. The chance that we are right is high, and even if we are wrong, recovery is possible; on the other side, there may be no way to recover at all. We may in fact go through a long sequence of extreme events. Climate warming is not about the world getting hotter every single day; the point is that extreme events, such as heavy rain and hurricanes, tend to become more frequent. Sea levels are already rising; some nations are already beginning to relocate. And we come back to the story of Fortaleza as the Las Vegas of the semi-arid. There is no talking about cutting carbon emissions without reducing industrial activity, and no talking about global warming without reducing consumption. And how do you get the population of Aldeota to stop consuming so much? In my most pessimistic moments, I think humanity only manages to reprogram itself mentally on a continental scale through a near-death experience, which would mean an immense catastrophe; then everyone stops and rethinks. But I am a teacher, and I have to believe that education has its value.
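Taddei's precautionary argument is, at bottom, a comparison of losses under uncertainty: weigh the cost of acting unnecessarily against the cost of failing to act when action was needed. A minimal sketch of that comparison (all payoff numbers below are invented placeholders, not estimates from any study):

```python
def expected_losses(p_warming, loss_if_act, loss_if_ignore):
    """Expected loss of each policy given a probability that warming is real.

    loss_if_act: cost paid regardless of the truth (forgone growth).
    loss_if_ignore: cost paid only if warming turns out to be real.
    """
    act = loss_if_act                    # paid whether or not warming is real
    ignore = p_warming * loss_if_ignore  # paid only in the warming scenario
    return act, ignore

# Invented illustrative numbers: mitigation costs 2 units of growth;
# unmitigated warming costs 50 units if it is real.
act, ignore = expected_losses(p_warming=0.5, loss_if_act=2, loss_if_ignore=50)
print(act < ignore)  # prints True: acting is the smaller expected loss
```

Under these toy numbers, acting wins even at 50/50 odds; the argument in the interview goes further, noting that some losses on the "ignore" side may be unrecoverable, which no single number fully captures.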

>European study will refine climate-prediction models (FSP, JC)

>
JC e-mail 4187, January 27, 2011

"It is important to note that the study does not shake the finding that Greenland is, in any case, losing more ice than it accumulates"

Marcelo Leite is a journalist. Article published in Folha de S.Paulo:

The paper on Greenland's glaciers in the current issue of Nature is a good example of the complexities involved in climate prediction.

The study is also an example of how hard it is to present incremental research results to the public.

At first glance, the study weakens the idea that global warming is accelerating the Greenland ice sheet's contribution to sea-level rise.

The reasoning was plausible: with a warmer atmosphere, more melting occurs at the surface, and the additional meltwater becomes available to penetrate through crevasses to the base of the glacier and lubricate its sliding.

We now know, thanks to the British and Belgian group, that the phenomenon involves a threshold effect.

Up to a certain point of warming, surface melting causes acceleration. Beyond that point, the water drains more efficiently, without lubricating the glacier's base, and fewer giant blocks break off.

This does not mean the effect on the Greenland ice sheet is nil. Some temperature increase does in fact accelerate its breakup.

What the study calls into question, until new research confirms or refutes it, is the hypothesis that a continuous rise in temperature will produce a linear, ever-growing loss of ice mass.

With this knowledge, the computer models used to predict the climate's behavior will be refined. But it is important to note that the study does not shake the finding that Greenland is, in any case, losing more ice than it accumulates.
(Folha de SP, 1/27)