Tag archive: Previsão

The Battle Over Global Warming Is All in Your Head (Time)

Although more people now acknowledge that climate change represents a significant threat to human well-being, that recognition has yet to translate into meaningful action. Psychologists may have an answer as to why.

Aug. 19, 2013

ANDREY SMIRNOV/AFP/GETTY IMAGES. Climate campaigns, like this one from Greenpeace in Moscow, have failed to galvanize public support for strong climate action

Today the scientific community is in almost total agreement that the earth’s climate is changing as a result of human activity, and that this represents a huge threat to the planet and to us. According to a Pew survey conducted in March, however, public opinion lags behind the scientific conclusion, with only 69% of those surveyed accepting the view that the earth is warming — and only 1 in 4 Americans seeing global warming as a major threat. Still, 69% is a solid majority, which raises the question: Why aren’t we doing anything about it?

This political inertia in the face of unprecedented threat is the most fundamental challenge to tackling climate change. Climate scientists and campaigners have long debated how to better communicate the message to nonexperts so that climate science can be translated into action. According to Christopher Rapley, professor of climate science at University College London, the usual tactic of climate experts to provide the public with information isn’t enough because “it does not address key underlying causes.” We are all bombarded with the evidence of climate change on an almost daily basis, from new studies and data to direct experiences of freakish weather events like last year’s epic drought in the U.S. The information is almost unavoidable.

If it’s not a data deficit that’s preventing people from doing more on global warming, what is it? Blame our brains. Renee Lertzman, an applied researcher who focuses on the psychological dimensions of sustainability, explains that the kind of systemic threat that climate change poses to humans is “unique both psychologically and socially.” We face a minefield of mental barriers and issues that prevent us from confronting the threat.

For some, the answer lies in cognitive science. Daniel Gilbert, a professor of psychology at Harvard, has written that our inability to deal with climate change is due in part to the way our minds are wired. Gilbert describes four key reasons ranging from the fact that global warming doesn’t take a human form — making it difficult for us to think of it as an enemy — to our brains’ failure to accurately perceive gradual change as opposed to rapid shifts. Climate change has occurred slowly enough for our minds to normalize it, which is precisely what makes it a deadly threat, as Gilbert writes, “because it fails to trip the brain’s alarm, leaving us soundly asleep in a burning bed.”

Robert Gifford, a professor of psychology and environmental studies at the University of Victoria in Canada, also picks up on the point about our brains’ difficulty in grasping climate change as a threat. Gifford refers to this and other psychological barriers to mitigating climate change as “dragons of inaction.” Since authoring a paper on the subject in 2011 in which he outlined seven main barriers, or dragons, he has found many more. “We’re up to around 30,” he notes. “Now it’s time to think about how we can slay these dragons.” Gifford lists factors such as limited cognition or ignorance of the problem, ideologies or worldviews that may prevent action, social comparisons with other people and perceived inequity (the “Why should we change if X corporation or Y country won’t?”) and the perceived risks of changing our behavior.

Gifford is reluctant to pick out one barrier as being more powerful or limiting than another. “If I had to name one, I would nominate the lack of perceived behavioral control; ‘I’m only one person, what can I do?’ is certainly a big one.” For many, the first challenge will be in recognizing which dragons they have to deal with before they can overcome them. “If you don’t know what your problem is, you don’t know what the solution is,” says Gifford.

Yet this approach can work only if people are prepared to acknowledge that they have a problem. For those of us who understand that climate change is a problem yet make little effort to cut the number of overseas trips we take or the amount of meat we consume, neither apathy nor denial really explains the dissonance between our actions and beliefs. Lertzman has concluded that the cause is not apathy — a lack of feeling — but the simple fact that we care an overwhelming amount about both the planet and our way of life, and we find that conflict too painful to bear. Our apparent apathy is just a defense mechanism in the face of this psychic pain.

“We’re reluctant to come to terms with the fact that what we love and enjoy and what gives us a sense of who we are is also now bound up with the most unimaginable devastation,” says Lertzman. “When we don’t process the pain of that, that’s when we get stuck and can’t move forward.” Lertzman refers to this inability to mourn as “environmental melancholia,” and points to South Africa’s postapartheid Truth and Reconciliation Commission as an example of how to effectively deal with this collective pain. “I’m not saying there should be one for climate or carbon, but there’s a lot to be said for providing a means for people to talk together about climate change, to make it socially acceptable to talk about it.”

Rosemary Randall, a trained psychotherapist, has organized something close to this. She runs the U.K.-based Carbon Conversations, a program that brings people together to talk in a group setting about ways of halving their personal carbon footprint. Writing in Aeon, an online magazine, Randall suggests that climate change is such a disturbing subject that “like death, it can raise fears and anxieties that people feel have no place in polite conversation.” Randall acknowledges that while psychology and psychoanalysis aren’t the sole solutions to tackling climate change, “they do offer an important way of thinking about the problem.”

Lertzman says the mainstream climate-change community has been slow to register the value of psychology and social analysis in addressing global warming. “I think there’s a spark of some interest, but also a wariness of what this means, what it might look like,” she notes. Gifford says otherwise, however, explaining that he has never collaborated with other disciplines as much as he does now. “I may be a little biased because I’m invested in working in it, but in my view, climate change, and not mental health, is the biggest psychological problem we face today because it affects 100% of the global population.”

Despite the pain, shame, difficulty and minefield of other psychological barriers that we face in fully addressing climate change, both Lertzman and Gifford are still upbeat about our ability to face up to the challenge. “It’s patronizing to say that climate change is too big or abstract an issue for people to deal with,” says Lertzman. “There can’t be something about the human mind that stops us grappling with these issues given that so many people already are — maybe that’s what we should be focusing on instead.”

Read more: http://science.time.com/2013/08/19/in-denial-about-the-climate-the-psychological-battle-over-global-warming/

They Finally Tested The ‘Prisoner’s Dilemma’ On Actual Prisoners — And The Results Were Not What You Would Expect (Business Insider Australia)

21 July 2013


The “prisoner’s dilemma” is a familiar concept to just about anybody who took Econ 101.

The basic version goes like this: two criminals are arrested, but police can’t convict either on the primary charge, so each faces a year in jail on a lesser charge. Each prisoner, unable to communicate with the other, is given the option of testifying against their partner. If one testifies and the partner remains silent, the partner gets three years and the one who testified goes free. If both testify, each gets two years. If both remain silent, each gets one year.

In game theory, betraying your partner, or “defecting,” is always the dominant strategy, as it always yields a slightly higher payoff in a simultaneous game. Mutual defection is the game’s “Nash equilibrium,” named after the Nobel-prizewinning mathematician and A Beautiful Mind subject John Nash.

In sequential games, where players know each other’s previous behaviour and have the opportunity to punish each other, defection is the dominant strategy as well.

However, on a Pareto basis, the best outcome for both players is mutual cooperation.
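The dominance and Pareto logic above can be checked mechanically. Here is a minimal sketch using the payoffs described in the article; the move names and data layout are my own, not taken from the study:

```python
# Payoff matrix from the article, in years of jail (lower is better).
# Keys: (my_move, partner_move) -> my sentence.
SENTENCE = {
    ("silent", "silent"): 1,
    ("silent", "testify"): 3,
    ("testify", "silent"): 0,
    ("testify", "testify"): 2,
}

def best_response(partner_move):
    """Return the move that minimizes my sentence against a fixed partner move."""
    return min(("silent", "testify"), key=lambda m: SENTENCE[(m, partner_move)])

# Defection ("testify") is dominant: it is the best response to either move.
assert best_response("silent") == "testify"
assert best_response("testify") == "testify"

# Yet mutual silence Pareto-dominates mutual defection: 1 year each beats 2.
assert SENTENCE[("silent", "silent")] < SENTENCE[("testify", "testify")]
```

Because testifying is the best response regardless of what the partner does, two purely self-interested players end up at (testify, testify), even though both would be strictly better off at (silent, silent).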

Yet no one had ever actually run the experiment on real prisoners until two University of Hamburg economists tried it out in a recent study comparing the behaviour of inmates and students.

Surprisingly, for the classic version of the game, prisoners were far more cooperative than expected.

Menusch Khadjavi and Andreas Lange put the famous game to the test for the first time, running a group of prisoners in Lower Saxony’s primary women’s prison, as well as students, through both simultaneous and sequential versions of the game. The payoffs obviously weren’t years off sentences, but euros for students and the equivalent value in coffee or cigarettes for prisoners.

Building off game theory and behavioural-economics research showing that humans are more cooperative than economists’ traditional purely rational model predicts, they expected a fair amount of first-mover cooperation, even in the simultaneous version, where there’s no way to react to the other player’s decision.

And they expected that even in the sequential game, where you get a higher payoff for betraying a cooperative first mover, a fair number would still reciprocate.

As for the difference between student and prisoner behaviour, you’d expect that a prison population might be more jaded and distrustful, and therefore more likely to defect.

The results went exactly the other way: in the simultaneous game, only 37% of students cooperated, while inmates cooperated 56% of the time.

On a pair basis, only 13% of student pairs achieved the best mutual outcome by both cooperating, whereas 30% of prisoner pairs did.

In the sequential game, far more students (63%) cooperated, so the mutual-cooperation rate skyrocketed to 39%. For prisoners, it remained about the same.
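As a rough consistency check (my own arithmetic, not from the paper): if the two partners in a pair chose independently, the pair-level mutual-cooperation rate would simply be the individual rate squared, and the reported figures track that prediction closely. Independence is only a simplification, especially in the sequential game, where the second mover reacts to the first:

```python
# Individual cooperation rates and pair-level mutual-cooperation rates
# as reported in the article.
reported = {
    # group: (individual cooperation rate, pair mutual-cooperation rate)
    "students, simultaneous": (0.37, 0.13),
    "prisoners, simultaneous": (0.56, 0.30),
    "students, sequential": (0.63, 0.39),
}

# Under the (simplifying) independence assumption, the predicted pair rate
# is the individual rate squared.
for group, (indiv, pair) in reported.items():
    predicted = indiv ** 2
    print(f"{group}: predicted {predicted:.1%}, reported {pair:.0%}")
```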

What’s interesting is that the simultaneous game requires far more blind trust from both parties, since you don’t have a chance to retaliate or make up for being betrayed later. Yet prisoners were still significantly more cooperative in that scenario.

Obviously the payoffs weren’t as serious as a year or three of your life, but the paper still demonstrates that prisoners aren’t necessarily as calculating, self-interested, and untrusting as you might expect. And as behavioural economists have argued for years, the Nash equilibrium may be mathematically interesting, but it doesn’t line up with real behaviour all that well.

Vendettas, not war? Unpicking why our ancestors killed (New Scientist)

20:03 18 July 2013 by Bob Holmes

Is war in our blood? Perhaps not, if you believe a controversial new study that suggests violence in primitive cultures is overwhelmingly the result of personal squabbles, rather than organised violence between two different groups. The finding contradicts the popular view that humans have evolved to be innately warlike.

In recent years, many anthropologists and evolutionary biologists have come to believe that warfare arose deep in humans’ evolutionary past. In part that is because even chimpanzees exhibit this kind of intergroup violence, which suggests the trait shares a common origin. Proponents of this view also point to the occurrence of war in traditional hunter-gatherer societies today, such as some notoriously quarrelsome groups in the Amazon, and hence to its likely prevalence in early human societies.

Yet the archaeological record of warfare in early humans is sketchy, and not all contemporary hunter-gatherers make war.

In a bid to resolve the issue, Douglas Fry and Patrik Soderberg of Åbo Akademi University in Vasa, Finland, turned to the Ethnographic Atlas, a widely used database that was created in the 1960s to provide an unbiased cross-cultural sample of primitive societies.

From this, Fry and Soderberg selected the 21 societies that were exclusively nomadic hunter-gatherers – groups that upped sticks to wherever conditions were best – without livestock or social class divisions. They reasoned that these groups would most closely resemble early human societies.

Hello, sailor

The researchers then sifted through the early ethnographic accounts of each of these societies – the earliest of which was from the 17th century, while most were from the 19th and 20th centuries – and noted every reference to violent deaths, classifying them by how many people were involved and who they were. The records include accounts of events such as a man killing a rival for a woman, revenge killings for earlier deaths, and killing of outsiders such as shipwrecked sailors.

The pair found that in almost every society, deaths due to violence were rare – and the vast majority of those were one-on-one killings better classified as homicides than as warfare. Indeed, for 20 of the 21 societies, only 15 per cent of killings happened between two different groups. The exception was the Tiwi people of northern Australia, where intergroup feuds and retaliatory killings were common.

Fry and Soderberg say this suggests that warfare is rare in such primitive societies and may instead have become common only after the rise of more complex societies just a few thousand years ago. If so, then warfare would have likely played only a minor role in human evolution.

Anecdotal evidence

Not everyone agrees. For one thing, the data set Fry and Soderberg used is essentially a collection of anecdotes rather than a systematic survey of causes of death, says Kim Hill, an anthropologist at Arizona State University in Tempe. They are relying on the people who originally noted down these events to have included all the important details.

Moreover, they focus only on nomadic foragers and exclude any sedentary foraging societies – groups that would have foraged from a permanent base. Yet these sedentary foragers would probably have occupied the richest habitats and so would have been most likely to be involved in wars over territory, says Richard Wrangham, an anthropologist at Harvard University.

Fry and Soderberg are probably correct that most violent deaths are the result of homicide, not warfare – that was even true for the US during the Vietnam War, says Sam Bowles at the Santa Fe Institute in New Mexico. He has put forward the idea that altruism evolved out of the need for our ancestors to cooperate during times of war. But even if warfare is relatively uncommon, it can still exert an important evolutionary force, he says.

Journal reference: Science, DOI:10.1126/science.1235675

Climate Change Deniers Using Dirty Tricks from ‘Tobacco Wars’ (Science Daily)

July 4, 2013 — Fossil fuel companies have been funding smear campaigns that raise doubts about climate change, writes John Sauven in the latest issue of Index on Censorship magazine.

Environmental campaigner Sauven argues: “Some of the characters involved have previously worked to deny the reality of the hole in the ozone layer, acid rain and the link between tobacco and lung cancer. And the tactics they are applying are largely the same as those they used in the tobacco wars. Doubt is still their product.”

Governments around the world have also attempted to silence scientists who have raised concerns about climate change. Tactics used have included: the UK government spending millions infiltrating peaceful environmental organisations; Canadian government scientists barred from communicating with journalists without media officers; and US federal scientists pressured to remove words ‘global warming’ and ‘climate change’ from reports under the Bush administration.

Writing about government corruption in the Indian mining industry, Sauven says: “It will be in these expanding economies that the battle over the Earth’s future will be won or lost. And as in the tobacco wars, the fight over clean energy is likely to be a dirty one.”

Journal Reference:

  1. J. Sauven. Why can’t we tell the truth about climate change? Index on Censorship, 2013; 42 (2): 55 DOI:10.1177/0306422013494282

The Science of Why We Don’t Believe Science (Mother Jones)

How our brains fool us on climate, creationism, and the end of the world.

Mon Apr. 18, 2011 3:00 AM PDT


“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger [1] (PDF), in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study [2] in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?


At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning [5]” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president [6] (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience [7] (PDF): Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia [8] of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.


We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber [9] of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt [10]: We may think we’re being scientists, but we’re actually being lawyers [11] (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.


Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment [12] (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes [13], and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan [14] and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research [15] (PDF), individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study [16] (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)


In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and, with it, their sense of the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family [16]) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan [17].

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles [18] (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad [19] and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study [20] (PDF) in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”


As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org [21], a team at Ohio State presented subjects [22] (PDF) with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick [23]. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to test the fallacy [6] (PDF) that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study [24] (PDF) showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast [25]” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey [26], for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz [27], director of the Yale Project on Climate Change Communication [28]. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr. [29]) and numerous Hollywood celebrities (most notably Jenny McCarthy [30] and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin [31], author of the new book The Panic Virus [32], notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined [33] by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper [34]—which originated the current vaccine scare—was retracted and he subsequently lost his license [35] (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study [36], he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.


Links:
[1] https://motherjones.com/files/lfestinger.pdf
[2] http://www.powells.com/biblio/61-9781617202803-1
[3] http://motherjones.com/environment/2011/04/history-of-climategate
[4] http://motherjones.com/environment/2011/04/field-guide-climate-change-skeptics
[5] http://www.ncbi.nlm.nih.gov/pubmed/2270237
[6] http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf
[7] https://motherjones.com/files/descartes.pdf
[8] http://www-personal.umich.edu/~lupia/
[9] http://www.stonybrook.edu/polsci/ctaber/
[10] http://people.virginia.edu/~jdh6n/
[11] https://motherjones.com/files/emotional_dog_and_rational_tail.pdf
[12] http://synapse.princeton.edu/~sam/lord_ross_lepper79_JPSP_biased-assimilation-and-attitude-polarization.pdf
[13] http://psp.sagepub.com/content/23/6/636.abstract
[14] http://www.law.yale.edu/faculty/DKahan.htm
[15] https://motherjones.com/files/kahan_paper_cultural_cognition_of_scientific_consesus.pdf
[16] http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers
[17] http://seagrant.oregonstate.edu/blogs/communicatingclimate/transcripts/Episode_10b_Dan_Kahan.html
[18] http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf
[19] http://www.sociology.northwestern.edu/faculty/prasad/home.html
[20] http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf
[21] http://www.factcheck.org/
[22] http://www.comm.ohio-state.edu/kgarrett/FactcheckMosqueRumors.pdf
[23] http://communication.stanford.edu/faculty/krosnick/
[24] http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf
[25] http://en.wikipedia.org/wiki/Narrowcasting
[26] http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming
[27] http://environment.yale.edu/profile/leiserowitz/
[28] http://environment.yale.edu/climate/
[29] http://www.huffingtonpost.com/robert-f-kennedy-jr-and-david-kirby/vaccine-court-autism-deba_b_169673.html
[30] http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html
[31] http://sethmnookin.com/
[32] http://www.powells.com/biblio/1-9781439158647-0
[33] http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on/article_print
[34] http://www.thelancet.com/journals/lancet/article/PIIS0140673697110960/fulltext
[35] http://www.gmc-uk.org/Wakefield_SPM_and_SANCTION.pdf_32595267.pdf
[36] http://www.scribd.com/doc/3446682/The-Second-National-Risk-and-Culture-Study-Making-Sense-of-and-Making-Progress-In-The-American-Culture-War-of-Fact

Climate change poses grave threat to security, says UK envoy (The Guardian)

Rear Admiral Neil Morisetti, special representative to foreign secretary, says governments can’t afford to wait for 100% certainty

The Guardian, Sunday 30 June 2013 18.19 BST

Flooding in Thailand in 2011. Photograph: Narong Sangnak/EPA

Climate change poses as grave a threat to the UK’s security and economic resilience as terrorism and cyber-attacks, according to a senior military commander who was appointed as William Hague’s climate envoy this year.

In his first interview since taking up the post, Rear Admiral Neil Morisetti said climate change was “one of the greatest risks we face in the 21st century”, particularly because it presented a global threat. “By virtue of our interdependencies around the world, it will affect all of us,” he said.

He argued that climate change was a potent threat multiplier at choke points in the global trade network, such as the Straits of Hormuz, through which much of the world’s traded oil and gas is shipped.

Morisetti left a 37-year naval career to become the foreign secretary’s special representative for climate change, and represents the growing influence of hard-headed military thinking in the global warming debate.

The link between climate change and global security risks is on the agenda of the UK’s presidency of the G8, including a meeting to be chaired by Morisetti in July that will include assessment of hotspots where climate stress is driving migration.

Morisetti’s central message was simple and stark: “The areas of greatest global stress and greatest impacts of climate change are broadly coincidental.”

He said governments could not afford to wait until they had all the information they might like. “If you wait for 100% certainty on the battlefield, you’ll be in a pretty sticky state,” he said.

The increased threat posed by climate change arises because droughts, storms and floods are exacerbating water, food, population and security tensions in conflict-prone regions.

“Just because it is happening 2,000 miles away does not mean it is not going to affect the UK in a globalised world, whether it is because food prices go up, or because increased instability in an area – perhaps around the Middle East or elsewhere – causes instability in fuel prices,” Morisetti said.

“In fact it is already doing so,” he added, noting that Toyota’s UK car plants had been forced to switch to a three-day week after extreme floods in Thailand cut the supply chain. Computer firms in California and Poland were left short of microchips by the same floods.

Morisetti is far from the only military figure emphasising the climate threat to security. America’s top officer tackling the threat from North Korea and China has said the biggest long-term security issue in the region is climate change.

In a recent interview, Admiral Samuel J Locklear III, who led the US naval action in Libya that helped topple Muammar Gaddafi, said a significant event related to the warming planet was “the most likely thing that is going to happen that will cripple the security environment, probably more likely than the other scenarios we all often talk about”.

There is a reason why the military are so clear-headed about the climate threat, according to Professor John Schellnhuber, a scientist who briefed the UN security council on the issue in February and formerly advised the German chancellor, Angela Merkel.

“The military do not deal with ideology. They cannot afford to: they are responsible for the lives of people and billions of pounds of investment in equipment,” he said. “When the climate change deniers took their stance after the Copenhagen summit in 2009, it is very interesting that the military people were never shaken from the idea that we are about to enter a very difficult period.”

He added: “This danger of the creation of violent conflicts is the strongest argument why we should keep climate change under control, because the international system is not stable, and the slightest thing, like the food riots in the Middle East, could make the whole system explode.”

The military has been quietly making known its concern about the climate threat to security for some time. General Wesley Clark, who commanded the Nato bombing of Yugoslavia during the Kosovo war, said in 2005: “Stopping global warming is not just about saving the environment, it’s about securing America for our children and our children’s children, as well.”

In the same year Chuck Hagel, now Obama’s defence secretary, said: “I don’t think you can separate environmental policy from economic policy or energy policy.”

Morisetti said there was also a direct link between climate change and the military because of the latter’s huge reliance on fossil fuels. “In Afghanistan, where we have had to import all our energy into the country along a single route that has been disrupted, the US military have calculated that for every 24 convoys there has been a casualty. There is a cost associated in bringing in that energy in both blood and treasure.

“So to drive up efficiency and to use alternative fuels, wind, solar, makes eminent sense to the military,” he said, noting that the use of solar blankets in Afghanistan meant fewer fuel resupply missions. “The principles of delivering your outputs more effectively, reducing your risks and reducing your costs reads across far more widely than just the military: most businesses would be looking for that too.”

Morisetti’s former employer, the Ministry of Defence, agrees that the climate threat is a serious one. The last edition of the Global Strategic Trends analysis published by the MoD’s Development, Concepts and Doctrine Centre concludes: “Climate change will amplify existing social, political and resource stresses, shifting the tipping point at which conflict ignites … Out to 2040, there are few convincing reasons to suggest that the world will become more peaceful.”

Schellnhuber was also clear about the consequences of failing to curb global warming. “The last 11,000 years – the Holocene – was characterised by the extreme stability of global climate. It is the only period when human civilisation could have developed at all,” he said. “But I don’t think a global, interconnected world can be managed in peace if climate change means we are leaving the Holocene. Let’s pray we will have a Lincoln or a Gorbachev to lead us.”

Social scientists look for a model for Brazil’s wave of protests (Folha de S.Paulo)

June 23, 2013, 11:09 a.m.

CASSIANO ELEK MACHADO
GRACILIANO ROCHA

Look to Paris, says Teresa Caldeira. But not the Paris of May 1968: for the Brazilian anthropologist based in the U.S., a professor at the University of California, Berkeley, analysis of the demonstrations that swept Brazil last week should be guided by the unrest that erupted in the French suburbs in 2005, when suburban towns in the Paris metropolitan region (the “banlieues”) exploded in a wave of social protest.

A specialist in urban anthropology, Caldeira, 58, researches the culture of the urban periphery, especially São Paulo’s, and says that although several social scientists declared themselves surprised, for her there is nothing new here.

“Everyone compares this with Istanbul or the Arab Spring, but they should look at what happened in Paris eight years ago,” Caldeira says. “It makes what is happening now quite understandable, and it has been taking shape for a long time,” believes the anthropologist, author of the book “Cidade de Muros: Crime, Segregação e Cidadania” (Editora 34).

She recalls that the Movimento Passe Livre (MPL, the Free Fare Movement) has existed for many years, and says it “articulates the whole imaginary of the periphery’s cultural production.”

“Folha published a photo in 2010 of graffiti painted by the MPL on the Minhocão, in São Paulo, that said: ‘The city only exists for those who can move through it.’”

Caldeira reproduced the image in an article of hers in the journal “Public Culture” (Duke University Press, 2012), citing the graffiti’s phrase as a fundamental idea of the periphery’s cultural movement. “Rap, marginal literature, pixação, saraus: all of them are built on networks and circulation. And circulating through São Paulo is chaos for anyone without money.”

The French sociologist Sebastian Roché takes a different view. In his book “Le Frisson de l’Émeute” (Seuil, not translated in Brazil), he argues that the revolts that inflamed France, sparked by the deaths of two teenagers electrocuted during a police chase, were led by young people who consider themselves victims of xenophobia because their skin is not white, most of them children of immigrants and Muslims.

“Young Muslims, who are very numerous in the banlieues, do not feel accepted or respected in their beliefs. What is more, this generation was abandoned to its own fate. In the banlieues, the unemployment rate among those under 25 runs between 25% and 40%,” Roché stresses.

A professor at the celebrated Sciences Po (Institute of Political Studies) and at the University of Grenoble, and a researcher at the CNRS (France’s National Center for Scientific Research), Roché says he has followed Brazil’s wave of protests closely and sees “few points of comparison” between what happened here and there. In France, he says, “it was not the poor destroying the livelihood of other poor people.”

“The bourgeoisie and the government were not the targets. No seat of power was besieged or occupied. No one went anywhere near, for example, the parliament or the seat of government [as happened in Brazil]. Here [in France], the groups operated at night, hid their faces under hoods, and often sought confrontation with the police. There was no mass demonstration; no leader or slogan emerged.”

Teresa Caldeira, who last year won the prestigious Guggenheim research fellowship in the U.S., points to another photo from the recent movements, which she says she saw on social media, as iconic of what is happening. Two young men held signs, one reading “Brazil woke up” and the other “The periphery never slept.”

She was also struck by banners referring to the conduct of the police. “The Military Police are doing on Paulista what they do every day in the periphery,” one of them read. “There is a latent class tension, and it does not surprise me that the protests have now reached the periphery,” she says, citing as an example the demonstrations of recent days in areas such as Estrada do M’Boi Mirim (in São Paulo’s south zone).

She wagers that, “as in Paris in 2005, we will now see the periphery explode.” Even though, in her view, the presence of the A and B classes played an important role in sparking the movements, the protests channel a dissatisfaction that has long been simmering in the peripheries. “Where the broth comes from is one thing; the form the demonstration takes is another. In form, it somewhat resembles the Arab Spring: the way information circulated, and the dissatisfaction with traditional political institutions,” she says.

“In content, it is very significant that it erupted over R$ 0.20. No one can stand the city’s buses anymore. Through my research I know many people from the periphery who post something against public transport on social media every single day.”

For Roché, “improving living conditions leads those who feel excluded to mobilize collectively to press their demands, as is the case in Brazil.” “In France, the issue is social exclusion in a period of economic stagnation, and the 2005 revolt did not produce an organized, collective mass model. There was no protest of contestation, but rather individual appropriations, such as thefts and looting, or confrontation and destruction to express rage. In the banlieues, there was no explicit demand.”

He says that although “revolts can teach governments a great deal,” this depends on their “being able to look the revolts in the face.” “In France, we did not learn much. In November 2005, France was a year and a half away from presidential elections. The interior minister at the time [Nicolas Sarkozy, president from 2007 to 2012] saw it as an opportunity to reassert his authority and stigmatize the banlieues and their inhabitants with an eye on the 2007 election,” which he would go on to win. “No political analysis was carried out by Parliament, and even less by the Interior Ministry, which was barred by the minister himself from reflecting on its own conduct.”

For the sociologist, urban revolts can also express a desire for direct participation in public decision-making, as in countries such as Brazil and Turkey. “In these two countries, many young people with access to education are raising demands about the right to difference and asking that their social demands be taken into account by the central power.”

He considers that “there has been economic progress in both Brazil and Turkey, and these protest movements are taking place in a context quite different from what happened in France, whose economic growth has been minimal or nil in recent years.”

For the sociologist, “what is happening in Brazil looks more like May 1968.” “At the time, France was living through Les Trente Glorieuses [as the three postwar decades of growth and prosperity became known], and its youth, with jobs and diplomas, plunged into a struggle to have their way of life and aspirations recognized by the government,” he recalls.

Brazilian center increases weather forecast precision fourfold (JC/O Globo)

JC e-mail 4746, June 13, 2013.

The new CPTEC model, which uses the Tupã supercomputer, can map with a resolution of 5 kilometers. Report from O Globo.

The eyes of weather forecasting in Brazil now see better. Four times more precisely, to be exact. The Center for Weather Forecasting and Climate Studies (CPTEC/INPE) has launched an update of its Brams forecasting model, now boosted by the high processing capacity of the Tupã supercomputer, installed in Cachoeira Paulista. Previously, Brams produced forecasts of up to one week at a resolution of 20 kilometers. Now the resolution is 5 kilometers for the same seven days.

With the new version, the forecast’s level of detail, previously limited to a city or region, can now distinguish one neighborhood from another. Consulting the new meteorological model is free of charge on the CPTEC website.

To cover all of South America, Brams divides the territory as in a giant game of Battleship, into a grid of 1,360 by 1,480 cells. Since it is a three-dimensional model, there are also 55 vertical levels for each of these cells. In total there are about 110 million points, processed simultaneously on Tupã’s 9,600 processors.
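The grid arithmetic above is easy to verify. This short sketch uses only the dimensions quoted in the article; the per-processor figure assumes an even split, which is an illustration rather than how the model actually partitions work:

```python
# Back-of-the-envelope check of the Brams 5.0 grid described above.
horizontal_cells = 1360 * 1480   # South America grid at 5 km spacing
vertical_levels = 55
processors = 9600

total_points = horizontal_cells * vertical_levels
print(f"total grid points: {total_points:,}")            # ~110.7 million
print(f"points per processor: {total_points // processors:,}")
```

The product comes out to 110,704,000 points, matching the article's "about 110 million."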

According to CPTEC, Brams version 5.0 puts Brazil in a competitive position with the world’s leading operational centers. The forecasting arm of the U.S. National Centers for Environmental Prediction (NCEP), for example, generates forecasts from a similar model, the National Mesoscale Model, with 4-kilometer resolution, 70 vertical levels, and a grid of 1,371 x 1,100 cells covering the entire continental United States.

To develop this new version of the Brams model, which is also used to forecast and monitor air pollution, data are drawn from weather stations across the country, satellites, ocean buoys, and aircraft imagery.

http://oglobo.globo.com/ciencia/centro-brasileiro-aumenta-em-quatro-vezes-precisao-da-previsao-do-tempo-8667823#ixzz2W6YkxWkF

* * *

Inpe launches very-high-resolution weather forecasting model

The new model covers all of South America

A new version of the Brams regional weather forecasting model, which covers all of South America, has been launched by the Center for Weather Forecasting and Climate Studies (CPTEC) of the National Institute for Space Research (Inpe/MCTI). Brams version 5.0 is already operational for forecasts of up to seven days.

The model generates forecasts with a spatial resolution of 5 kilometers, whereas the previous version provided forecasts at a resolution of 20 kilometers. The advance was only possible thanks to the high processing capacity of Inpe’s new Cray supercomputer, Tupã, installed at CPTEC in Cachoeira Paulista (SP).

The development work needed to make the new version of Brams operational took about a year. Covering the full extent of South America required 1,360 x 1,480 horizontal cells and 55 vertical levels. The grid cells, roughly 110 million in total, are processed simultaneously on the Cray’s 9,600 processors, in parallel.

This effort, coordinated by the Atmospheric Modeling and Interfaces Group (Gmai), has placed CPTEC in a competitive position relative to the world’s leading operational centers. The forecasting arm of the National Centers for Environmental Prediction (NCEP), for example, generates forecasts from a similar model, the National Mesoscale Model, with 4-kilometer resolution, 70 vertical levels, and a grid of 1,371 x 1,100 cells covering the entire continental United States.

To develop this new version of the Brams model, which is also used to forecast and monitor air pollution, a non-hydrostatic formulation was adopted, representing smaller-scale physical processes, such as the development and dissipation of clouds and rain, with greater precision. Several advances in parameterization (mathematical representations of physical processes) were made for clouds, solar radiation, and surface processes and dynamics.

(Inpe Communications Office)

When Will My Computer Understand Me? (Science Daily)

June 10, 2013 — It’s not hard to tell the difference between the “charge” of a battery and criminal “charges.” But for computers, distinguishing between the various meanings of a word is difficult.

A “charge” can be a criminal charge, an accusation, a battery charge, or a person in your care. Some of those meanings are closer together, others further apart. (Credit: Image courtesy of University of Texas at Austin, Texas Advanced Computing Center)

For more than 50 years, linguists and computer scientists have tried to get computers to understand human language by programming semantics as software. Driven initially by efforts to translate Russian scientific texts during the Cold War (and more recently by the value of information retrieval and data analysis tools), these efforts have met with mixed success. IBM’s Jeopardy-winning Watson system and Google Translate are high-profile, successful applications of language technologies, but the humorous answers and mistranslations they sometimes produce are evidence of the continuing difficulty of the problem.

Our ability to easily distinguish between multiple word meanings is rooted in a lifetime of experience. Using the context in which a word is used, an intrinsic understanding of syntax and logic, and a sense of the speaker’s intention, we intuit what another person is telling us.

“In the past, people have tried to hand-code all of this knowledge,” explained Katrin Erk, a professor of linguistics at The University of Texas at Austin focusing on lexical semantics. “I think it’s fair to say that this hasn’t been successful. There are just too many little things that humans know.”

Other efforts have tried to use dictionary meanings to train computers to better understand language, but these attempts have also faced obstacles. Dictionaries have their own sense distinctions, which are crystal clear to the dictionary-maker but murky to the dictionary reader. Moreover, no two dictionaries provide the same set of meanings — frustrating, right?

Watching annotators struggle to make sense of conflicting definitions led Erk to try a different tactic. Instead of hard-coding human logic or deciphering dictionaries, why not mine a vast body of texts (which are a reflection of human knowledge) and use the implicit connections between the words to create a weighted map of relationships — a dictionary without a dictionary?

“An intuition for me was that you could visualize the different meanings of a word as points in space,” she said. “You could think of them as sometimes far apart, like a battery charge and criminal charges, and sometimes close together, like criminal charges and accusations (“the newspaper published charges…”). The meaning of a word in a particular context is a point in this space. Then we don’t have to say how many senses a word has. Instead we say: ‘This use of the word is close to this usage in another sentence, but far away from the third use.'”

To create a model that can accurately recreate the intuitive ability to distinguish word meaning requires a lot of text and a lot of analytical horsepower.

“The lower end for this kind of research is a text collection of 100 million words,” she explained. “If you can give me a few billion words, I’d be much happier. But how can we process all of that information? That’s where supercomputers and Hadoop come in.”

Applying Computational Horsepower

Erk initially conducted her research on desktop computers, but around 2009, she began using the parallel computing systems at the Texas Advanced Computing Center (TACC). Access to a special Hadoop-optimized subsystem on TACC’s Longhorn supercomputer allowed Erk and her collaborators to expand the scope of their research. Hadoop is a software architecture well suited to text analysis and the data mining of unstructured data that can also take advantage of large computer clusters. Computational models that take weeks to run on a desktop computer can run in hours on Longhorn. This opened up new possibilities.

“In a simple case we count how often a word occurs in close proximity to other words. If you’re doing this with one billion words, do you have a couple of days to wait to do the computation? It’s no fun,” Erk said. “With Hadoop on Longhorn, we could get the kind of data that we need to do language processing much faster. That enabled us to use larger amounts of data and develop better models.”
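The counting Erk describes can be sketched in a few lines of plain Python. The window size, the sample sentence, and the function name below are invented for illustration; at the scale she mentions, the same tally would be distributed across a Hadoop map/reduce job rather than run in a single loop.

```python
from collections import Counter

def cooccurrence_counts(tokens, window=2):
    """Tally how often each pair of words appears within `window`
    positions of each other in a token stream."""
    counts = Counter()
    for i, left in enumerate(tokens):
        for right in tokens[i + 1 : i + 1 + window]:
            counts[tuple(sorted((left, right)))] += 1
    return counts

tokens = "the judge read the criminal charges in court".split()
counts = cooccurrence_counts(tokens)
print(counts[("charges", "criminal")])  # → 1
```

In a map/reduce setting, the inner loop becomes the map step (emit one pair per window) and the `Counter` becomes the reduce step (sum the counts per pair), which is why the approach parallelizes so naturally.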

Treating words in a relational, non-fixed way corresponds to emerging psychological notions of how the mind deals with language and concepts in general, according to Erk. Instead of rigid definitions, concepts have “fuzzy boundaries” where the meaning, value and limits of the idea can vary considerably according to the context or conditions. Erk takes this idea of language and recreates a model of it from hundreds of thousands of documents.

Say That Another Way

So how can we describe word meanings without a dictionary? One way is to use paraphrases. A good paraphrase is one that is “close to” the word meaning in that high-dimensional space that Erk described.

“We use a gigantic 10,000-dimensional space with all these different points for each word to predict paraphrases,” Erk explained. “If I give you a sentence such as, ‘This is a bright child,’ the model can tell you automatically what are good paraphrases (‘an intelligent child’) and what are bad paraphrases (‘a glaring child’). This is quite useful in language technology.”
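The "close to" test behind this paraphrase ranking can be sketched with cosine similarity between context vectors. The 4-dimensional vectors below are invented for illustration (the real model estimates roughly 10,000 dimensions from billions of words); the point is only that a good paraphrase lands near this use of "bright" while a bad one lands far away.

```python
import math

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, near 0 for unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy context vectors, invented for this sketch.
vectors = {
    "bright_child": [0.9, 0.1, 0.8, 0.0],  # "bright" as used in "a bright child"
    "intelligent":  [0.8, 0.2, 0.9, 0.1],
    "glaring":      [0.1, 0.9, 0.0, 0.8],
}

for candidate in ("intelligent", "glaring"):
    print(candidate, round(cosine(vectors["bright_child"], vectors[candidate]), 3))
```

Ranking candidates by this score gives "intelligent" a similarity near 1 and "glaring" a similarity near 0, so the former is selected as the paraphrase.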

Language technology already helps millions of people perform practical and valuable tasks every day via web searches and question-answer systems, but it is poised for even more widespread applications.

Automatic information extraction is an application where Erk’s paraphrasing research may be critical. Say, for instance, you want to extract a list of diseases, their causes, symptoms and cures from millions of pages of medical information on the web.

“Researchers use slightly different formulations when they talk about diseases, so knowing good paraphrases would help,” Erk said.

In a paper to appear in ACM Transactions on Intelligent Systems and Technology, Erk and her collaborators showed that they could achieve state-of-the-art results with their automatic paraphrasing approach.

Recently, Erk and Ray Mooney, a computer science professor also at The University of Texas at Austin, were awarded a grant from the Defense Advanced Research Projects Agency to combine Erk’s distributional, high-dimensional representation of word meanings with a method of determining the structure of sentences based on Markov logic networks.

“Language is messy,” said Mooney. “There is almost nothing that is true all the time. When we ask, ‘How similar is this sentence to another sentence?’ our system turns that question into a probabilistic theorem-proving task, and that task can be very computationally complex.”

In their paper, “Montague Meets Markov: Deep Semantics with Probabilistic Logical Form,” presented at the Second Joint Conference on Lexical and Computational Semantics (*SEM 2013) in June, Erk, Mooney and colleagues announced their results on a number of challenge problems from the field of artificial intelligence.

In one problem, Longhorn was given a sentence and had to infer whether another sentence was true based on the first. Using an ensemble of different sentence parsers, word meaning models and Markov logic implementations, Mooney and Erk’s system predicted the correct answer with 85% accuracy, among the top results for this challenge. They continue to work to improve the system.

There is a common saying in the machine-learning world that goes: “There’s no data like more data.” While more data helps, taking advantage of that data is key.

“We want to get to a point where we don’t have to learn a computer language to communicate with a computer. We’ll just tell it what to do in natural language,” Mooney said. “We’re still a long way from having a computer that can understand language as well as a human being does, but we’ve made definite progress toward that goal.”

People Are Overly Confident in Their Own Knowledge, Despite Errors (Science Daily)

June 10, 2013 — Overprecision — excessive confidence in the accuracy of our beliefs — can have profound consequences, inflating investors’ valuation of their investments, leading physicians to gravitate too quickly to a diagnosis, even making people intolerant of dissenting views. Now, new research confirms that overprecision is a common and robust form of overconfidence driven, at least in part, by excessive certainty in the accuracy of our judgments.

New research confirms that overprecision is a common and robust form of overconfidence driven, at least in part, by excessive certainty in the accuracy of our judgments. (Credit: © pressmaster / Fotolia)

The research, conducted by researchers Albert Mannes of The Wharton School of the University of Pennsylvania and Don Moore of the Haas School of Business at the University of California, Berkeley, revealed that the more confident participants were about their estimates of an uncertain quantity, the less they adjusted their estimates in response to feedback about their accuracy and to the costs of being wrong.

“The findings suggest that people are too confident in what they know and underestimate what they don’t know,” says Mannes.

The new findings are published in Psychological Science, a journal of the Association for Psychological Science.

Research investigating overprecision typically involves asking people to come up with a 90% confidence interval around a numerical estimate — such as the length of the Nile River — but this doesn’t always faithfully reflect the judgments we have to make in everyday life. We know, for example, that arriving 15 minutes late for a business meeting is not the same as arriving 15 minutes early, and that we ought to err on the side of arriving early.

Mannes and Moore designed three studies to account for the asymmetric nature of many everyday judgments. Participants estimated the local high temperature on randomly selected days and their accuracy was rewarded in the form of lottery tickets toward a prize. For some trials, they earned tickets if their estimates were correct or close to the actual temperature (above or below); in other trials, they earned tickets for correct guesses or overestimates; and in some trials they earned tickets for correct guesses or underestimates.

The results showed that participants adjusted their estimates in the direction of the anticipated payoff after receiving feedback about their accuracy, just as Mannes and Moore expected.

But they didn’t adjust their estimates as much as they should have given their actual knowledge of local temperatures, suggesting that they were overly confident in their own powers of estimation.

Only when the researchers provided exaggerated feedback — in which errors were inflated by 2.5 times — were the researchers able to counteract participants’ tendency towards overprecision.
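The logic of under-adjustment can be sketched as a decision problem. The temperatures, spreads, and 10-to-1 cost ratio below are invented for illustration, not taken from the study: under an asymmetric loss, the payoff-maximizing report sits at a quantile of your belief distribution, so a judge who believes in a spread that is too narrow shifts away from the mean in the right direction, but by too little.

```python
import random

random.seed(1)

def best_report(samples, under_cost=1.0, over_cost=0.1):
    """Report that minimizes expected loss when underestimating the truth
    (e.g. arriving late) costs 10x more than overestimating (arriving early).
    For this pinball loss the optimum is the under/(under+over) quantile."""
    q = under_cost / (under_cost + over_cost)
    return sorted(samples)[int(q * len(samples))]

# A well-calibrated judge samples from the true spread of daily highs;
# an overprecise judge believes in a spread that is too narrow.
calibrated = [random.gauss(25, 5) for _ in range(100_000)]
overprecise = [random.gauss(25, 2) for _ in range(100_000)]

print(round(best_report(calibrated), 1))   # shifts well above the mean of 25
print(round(best_report(overprecise), 1))  # same direction, but a smaller shift
```

Both judges move their estimate upward toward the cheap side of the error, but the overprecise one, like the participants, adjusts less than their actual uncertainty warrants.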

The new findings, which show that overprecision is a common and robust phenomenon, urge caution:

“People frequently cut things too close — arriving late, missing planes, bouncing checks, or falling off one of the many ‘cliffs’ that present themselves in daily life,” observe Mannes and Moore.

“These studies tell us that you shouldn’t be too certain about what’s going to happen, especially when being wrong could be dangerous. You should plan to protect yourself in case you aren’t as right as you think you are.”

Journal Reference:

  1. A. E. Mannes, D. A. Moore. A Behavioral Demonstration of Overconfidence in Judgment. Psychological Science, 2013; DOI: 10.1177/0956797612470700

Roberto DaMatta: How Not to Lose at Football? (OESP)

June 12, 2013 | 2:10 am

ROBERTO DAMATTA – O Estado de S.Paulo

– What is the biggest problem with “our” football?

– All of them! We never win anything!

– But we exported Brazil’s football to the whole world. Everyone plays the way we do.

– Fine… But then why can’t we manage to win?

– Precisely because of that. Every success turns into failure. Whoever wins, loses…

I heard this on the ferry to Rio, I who keep insisting on living in Niterói. Now, living in Niterói is like not knowing that football suffers from an original sin: our team cannot lose. And yet, if a team were an eternal winner, the stadiums would be empty.

Over a span of time that now covers some 100 years, we have tallied up many matches and, as a consequence, many losses and wins. Defeats, however, are better remembered because our memory retains – as Freud said – the wound and the suffering (the trauma) more than the delight, the enchantment and the starry-sky beauty of transitory experiences (Freud, in fact, has a beautiful essay on this subject). The beautiful passes and the ugly remains? Not at all. But the good is tied down with cobwebs, while the bad leaves scars. We think of life as a staircase when, in fact, it is a ball that spins without stopping and runs faster than we do.

I noted in a presumptuous essay that English distinguishes between playing and playing: between “to gamble” and “to play”; between going to a casino to bet, and playing tennis or playing the piano. In the latter case some kind of skill is required, without which there is no music or contest; in games of chance, it is enough to be lucky. But beyond “gamble” and “play” there is the word “match,” designating the balanced encounter between two adversaries.

Consider, reader: at the roulette table there is no “match,” because the odds belong to the house. It is a game with devotees, but without “athletes.” Nobody competes with a roulette wheel, but against it. In the world of sport, however, the contest becomes a competition. Initial equality is a central point of the constitutive duality of sport. Now, duality is the axis on which turns the reciprocity contained in the formulas of charity, of good manners, of vengeance and of giving-in-order-to-receive, as Marcel Mauss saw. The word “partida” (match) designates exactly this, and in the old days it was the word used for the football that returns with the force of repressed passions.

For us Brazilians, the verb “to play” (jogar) covers both gambling (such as the famous and, to this day, miraculously illegal “jogo do bicho” animal lottery, and the lotteries run by the government) and the regulated, egalitarian sporting encounter: that agonistic contest constitutively bound to the probability of winning or losing.

But even though a single word – jogo – joins the game of chance and the sporting contest, we still do not remember that football is unpredictable. Our canonical reading of football is always that of a fight the team of our heart is going to win, hence the disillusionment of defeat. We can lose, no doubt, but we resist, in Freudian fashion, thinking about that possibility. We have lost a great deal, no doubt, but we refuse to do the only sensible thing in the face of defeat: accept it.

Then arises the cosmic problem of football in Brazil. How can we admit that losing and winning are part of the very structure of this game if we – in principle – do not read into the word “game” the possibility of defeat? The agony and the pleasure of football are tied precisely to that possibility, but it is kept away from our consciousness. When we go to the game, we go to victory, and there are reasons for this. One of them I mentioned last week: football was the first extraordinarily positive element of a self-image that had been permanently negative. How could a people convinced of its natural inferiority as backward, because it was of mixed race, take on (and beat) the “advanced” and “pure” whites who invented civilization and football?

When we began to master their football, turning it into a total social fact – something with economic, religious, cultural, moral, political, philosophical and cosmic elements, a great screen onto which everything was projected – we discovered that what came from outside could be cannibalized and made our own. It was possible to invert the colonial logic. The digestion of the other through its incorporation or sociopolitical absorption into our midst is the backdrop to man’s theft of fire from the gods.

A cautionary note is needed, however. We stole football, but not perpetual victory. To confuse footballing activity with permanent success is infantile. In politics, this appears as winning at any cost or, as a professor of power in power, Mr. Gilberto Carvalho, puts it, “things are going to get ugly…”. That is: we must win, with or without a game, which, regrettably but thank God, is quite different from football. I wrote these wretched lines before the 3-0 victory over France! Are we, from now on, winners only? Part of me hopes so…

*   *   *

Football as Philosophy

June 5, 2013

ROBERTO DAMATTA – O Estado de S.Paulo

A game is a model of life. It demands seasons, stages, equipment (tables, cards, dice, roulette wheels, balls, uniforms, nets, bats) and rules, so as to guarantee impassioned attention. And since it has a beginning, a middle and an end, a game reduces the indifference of life. In doing so, it lets mere passers-by pose as champions. Sunday may not bring a laden table, but it brings Brazil’s match, with its pomp and its splendors of hope. Games are one of the secret passages that allow us to escape from ourselves.

Among modern sports, the football played in Brazil is certainly the densest. Simoni Lahud Guedes, a pioneering scholar of football, suggests that it is a screen onto which we project our questions. Born in the industrial England of the 1860s, football acquired fixed rules and has ever since been the favorite subject of intense symbolic projection across the planet.

In Brazil, it stirred reactions. Although it bore the colonial seal of everything that came from abroad and from powerful England, it was an unknown activity. A “sport” (a contest governed by rules and by the imperious need to know how to win and how to lose) was something unheard of in a Brazil that knew only duels and brawls, which always ended badly.

Moreover, physical exercise and cold baths were not part of national practice. Among us, the little belly was always proof of wealth and of physical immobility – itself expressive of the ideal of social immobility. How were we to receive this innovation marked by fast, egalitarian physical competition, in which losing and winning are – as in democracy – part of its structure? Where to find a place for a game free of the aristocratic restrictions of family name, skin color and “appearance” – that frame we live with in Brazil to this day?

Football suffered many attacks in the name of a nationalism that imagined itself as fragile as porcelain. And yet, as we are seeing on the eve of World Cups, we cannibalized and digested “foot-ball,” stealing it from the English. Today there is a Brazilian style of playing and producing this sport. From a fifth column capable of corrupting, alongside American music and cinema, our way of life and our mother tongue, football ended up serving as a basic instrument for reflecting on Brazil, as I myself noted in the book Universo do Futebol, in which, in 1982, I gathered a set of socio-anthropological essays by colleagues on the sport. In 2006, in the book A Bola Corre Mais Que os Homens, I collected works in which I offered a way out of the dilemma of sport as alienation versus consciousness of the world, insisting on how, in Brazil, footballing success was our first instrument of self-esteem before the “advanced” and unreachable countries. Football was the solace of a Brazil that conceived of itself as sickened by its mixture of races and that, to this day, has trouble living with itself. It is the guarantee of an honorable fresh start in defeat and of enjoyment without arrogance or corruption in victory.

As proof of the unpredictable destiny of social things, football did not come to confirm colonial domination. On the contrary, it made us colonizers and, more than that, philosophers, through a whole literature which, beginning with Nelson Rodrigues, Jacinto de Thormes (Maneco Muller), José Lins do Rego and Armando Nogueira, among others, allowed us to articulate a positive reading of the world.

Literature? Is that an exaggeration? I say it is not, and I will go further: football created among us a philosophy, an anthropology and a theology. Its greatest role was, as I have said more than once, to teach democracy. It revealed in no uncertain terms that one does not always win and that the world is as unstable as a ball. Losing and winning, football teaches, are two sides of the same coin.

Nelson Rodrigues speaks of biblical matches, just as he opens us to a metaphysics when he links matches and star players to sealed destinies, or when he declares that that goal had been fated to be missed since the beginning of the world. His condemnation of “stupid objectivity” is a sharp critique of a hierarchical, aristocratic common sense that tries to turn life itself into something official, owned by the State. His anthropology, in turn, inaugurates an undreamed-of native neo-aristocracy of blacks and mestizos who cease to be sickly hybrids and become – just as happened in the jazz of a segregated United States – princes, dukes, counts and kings, despite our unconfessable desires for failure. The poisoned sub-race of those who wanted to cure Brazil became the meta-race that, dribbling past our sub-sociologists – those academic big shots – presented us with five World Cups. “The nation in football boots” opened a new space for this non-white football, allowing countries like Brazil a far more inclusive and unprejudiced redefinition of their national identities.

Climate Researchers Discover New Rhythm for El Niño (Science Daily)

May 27, 2013 — El Niño wreaks havoc across the globe, shifting weather patterns that spawn droughts in some regions and floods in others. The impacts of this tropical Pacific climate phenomenon are well known and documented.

This is a schematic figure for the suggested generation mechanism of the combination tone: The annual cycle (Tone 1), together with the El Niño sea surface temperature anomalies (Tone 2) produce the combination tone. (Credit: Malte Stuecker)

A mystery, however, has remained despite decades of research: Why does El Niño always peak around Christmas and end quickly by February to April?

Now there is an answer: An unusual wind pattern that straddles the equatorial Pacific during strong El Niño events and swings back and forth with a period of 15 months explains El Niño’s close ties to the annual cycle. This finding is reported in the May 26, 2013, online issue of Nature Geoscience by scientists from the University of Hawai’i at Manoa Meteorology Department and International Pacific Research Center.

“This atmospheric pattern peaks in February and triggers some of the well-known El Niño impacts, such as droughts in the Philippines and across Micronesia and heavy rainfall over French Polynesia,” says lead author Malte Stuecker.

When anomalous trade winds shift south they can terminate an El Niño by generating eastward propagating equatorial Kelvin waves that eventually resume upwelling of cold water in the eastern equatorial Pacific. This wind shift is part of the larger, unusual atmospheric pattern accompanying El Niño events, in which a high-pressure system hovers over the Philippines and the major rain band of the South Pacific rapidly shifts equatorward.

With the help of numerical atmospheric models, the scientists discovered that this unusual pattern originates from an interaction between El Niño and the seasonal evolution of temperatures in the western tropical Pacific warm pool.

“Not all El Niño events are accompanied by this unusual wind pattern,” notes Malte Stuecker, “but once El Niño conditions reach a certain threshold amplitude during the right time of the year, it is like a jack-in-the-box whose lid pops open.”

A study of the evolution of the anomalous wind pattern in the model reveals a rhythm of about 15 months accompanying strong El Niño events, which is considerably faster than the three- to five-year timetable for El Niño events, but slower than the annual cycle.

“This type of variability is known in physics as a combination tone,” says Fei-Fei Jin, professor of Meteorology and co-author of the study. Combination tones have been known for more than three centuries. They were discovered by the violinist Giuseppe Tartini, who realized that our ear can create a third tone even though only two tones are played on a violin.

“The unusual wind pattern straddling the equator during an El Niño is such a combination tone between El Niño events and the seasonal march of the sun across the equator,” says co-author Axel Timmermann, climate scientist at the International Pacific Research Center and professor at the Department of Oceanography, University of Hawai’i. He adds, “It turns out that many climate models have difficulties creating the correct combination tone, which is likely to impact their ability to simulate and predict El Niño events and their global impacts.”
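The arithmetic behind a combination tone is a trigonometric identity. As a sketch, take a hypothetical 60-month ENSO period (within the three-to-five-year range the article cites): the annual cycle and the ENSO cycle then combine into difference and sum tones with periods of 15 and 10 months, the first matching the rhythm reported in the study.

```python
import math

# Annual cycle and a hypothetical 60-month (5-year) ENSO oscillation,
# in cycles per month:
f_annual, f_enso = 1 / 12, 1 / 60
f_diff, f_sum = f_annual - f_enso, f_annual + f_enso
print(round(1 / f_diff, 6), round(1 / f_sum, 6))  # → 15.0 10.0 (months)

# The product of the two cycles equals half the sum of the two combination
# tones -- the identity cos(a)cos(b) = [cos(a-b) + cos(a+b)] / 2:
for t in range(120):  # ten years, month by month
    product = math.cos(2 * math.pi * f_annual * t) * math.cos(2 * math.pi * f_enso * t)
    combo = 0.5 * (math.cos(2 * math.pi * f_diff * t) + math.cos(2 * math.pi * f_sum * t))
    assert abs(product - combo) < 1e-9
```

This is why the 15-month rhythm is "considerably faster" than ENSO itself but slower than the annual cycle: the difference frequency 1/12 − 1/60 = 1/15 lies between the two.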

The scientists are convinced that a better representation of the 15-month tropical Pacific wind pattern in climate models will improve El Niño forecasts. Moreover, they say the latest climate model projections suggest that El Niño events will be accompanied more often by this combination tone wind pattern, which will also change the characteristics of future El Niño rainfall patterns.

Journal Reference:

  1. Malte F. Stuecker, Axel Timmermann, Fei-Fei Jin, Shayne McGregor, Hong-Li Ren. A combination mode of the annual cycle and the El Niño/Southern Oscillation. Nature Geoscience, 2013; DOI: 10.1038/ngeo1826

Global Warming Caused by CFCs, Not Carbon Dioxide, Researcher Claims in Controversial Study (Science Daily)

May 30, 2013 — Chlorofluorocarbons (CFCs) are to blame for global warming since the 1970s and not carbon dioxide, according to a researcher from the University of Waterloo in a controversial new study published in the International Journal of Modern Physics B this week.

Annual global temperature over land and ocean. (Credit: Image by Q.-B. Lu)

CFCs are already known to deplete ozone, but in-depth statistical analysis now suggests that CFCs are also the key driver in global climate change, rather than carbon dioxide (CO2) emissions, the researcher argues.

“Conventional thinking says that the emission of human-made non-CFC gases such as carbon dioxide has mainly contributed to global warming. But we have observed data going back to the Industrial Revolution that convincingly shows that conventional understanding is wrong,” said Qing-Bin Lu, a professor of physics and astronomy, biology and chemistry in Waterloo’s Faculty of Science. “In fact, the data shows that CFCs conspiring with cosmic rays caused both the polar ozone hole and global warming.”

“Most conventional theories expect that global temperatures will continue to increase as CO2 levels continue to rise, as they have done since 1850. What’s striking is that since 2002, global temperatures have actually declined — matching a decline in CFCs in the atmosphere,” Professor Lu said. “My calculations of CFC greenhouse effect show that there was global warming by about 0.6 °C from 1950 to 2002, but the earth has actually cooled since 2002. The cooling trend is set to continue for the next 50-70 years as the amount of CFCs in the atmosphere continues to decline.”

The findings are based on in-depth statistical analyses of observed data from 1850 up to the present time, Professor Lu’s cosmic-ray-driven electron-reaction (CRE) theory of ozone depletion and his previous research into Antarctic ozone depletion and global surface temperatures.

“It was generally accepted for more than two decades that the Earth’s ozone layer was depleted by the sun’s ultraviolet light-induced destruction of CFCs in the atmosphere,” he said. “But in contrast, CRE theory says cosmic rays — energy particles originating in space — play the dominant role in breaking down ozone-depleting molecules and then ozone.”

Lu’s theory has been confirmed by ongoing observations of cosmic ray, CFC, ozone and stratospheric temperature data over several 11-year solar cycles. “CRE is the only theory that provides us with an excellent reproduction of 11-year cyclic variations of both polar ozone loss and stratospheric cooling,” said Professor Lu. “After removing the natural cosmic-ray effect, my new paper shows a pronounced recovery by ~20% of the Antarctic ozone hole, consistent with the decline of CFCs in the polar stratosphere.”

By demonstrating the link between CFCs, ozone depletion and temperature changes in the Antarctic, Professor Lu was able to draw an almost perfect correlation between rising global surface temperatures and CFCs in the atmosphere.

“The climate in the Antarctic stratosphere has been completely controlled by CFCs and cosmic rays, with no CO2 impact. The change in global surface temperature after the removal of the solar effect has shown zero correlation with CO2 but a nearly perfect linear correlation with CFCs — a correlation coefficient as high as 0.97.”
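The statistic quoted here is a Pearson correlation coefficient, which for two series can be computed as below. The two series are invented stand-ins, not Lu's data; they merely show how a coefficient near 1 arises whenever two curves closely track each other.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented illustrative series: one rising curve closely tracking another.
cfc_proxy = [0.0, 0.1, 0.25, 0.4, 0.6, 0.75, 0.9, 1.0]
temp_proxy = [0.02, 0.08, 0.3, 0.35, 0.62, 0.7, 0.95, 0.98]
print(round(pearson_r(cfc_proxy, temp_proxy), 2))  # close to 1 for these made-up numbers
```

The coefficient ranges from −1 (perfectly anti-correlated) through 0 (no linear relation) to 1 (perfectly correlated), so 0.97 indicates the two curves rise and fall almost in lockstep.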

Data recorded from 1850 to 1970, before any significant CFC emissions, show that CO2 levels increased significantly as a result of the Industrial Revolution, but that global temperature, excluding the solar effect, remained nearly constant. The conventional CO2 warming model suggests that temperatures should have risen by 0.6°C over the same period, similar to the period of 1970-2002.

The analyses support Lu’s CRE theory and point to the success of the Montreal Protocol on Substances that Deplete the Ozone Layer.

“We’ve known for some time that CFCs have a really damaging effect on our atmosphere and we’ve taken measures to reduce their emissions,” Professor Lu said. “We now know that international efforts such as the Montreal Protocol have also had a profound effect on global warming but they must be placed on firmer scientific ground.”

“This study underlines the importance of understanding the basic science underlying ozone depletion and global climate change,” said Terry McMahon, dean of the faculty of science. “This research is of particular importance not only to the research community, but to policy makers and the public alike as we look to the future of our climate.”

Professor Lu’s paper, “Cosmic-Ray-Driven Reaction and Greenhouse Effect of Halogenated Molecules: Culprits for Atmospheric Ozone Depletion and Global Climate Change,” also predicts that the global sea level will continue to rise for some years as the ozone hole recovers, increasing ice melting in the polar regions.

“Only when the effect of the global temperature recovery dominates over that of the polar ozone hole recovery, will both temperature and polar ice melting drop concurrently,” says Lu.

The peer-reviewed paper published this week not only provides new fundamental understanding of the ozone hole and global climate change but has superior predictive capabilities, compared with the conventional sunlight-driven ozone-depleting and CO2-warming models, Lu argues.

Journal Reference:

  1. Q.-B. Lu. Cosmic-Ray-Driven Reaction and Greenhouse Effect of Halogenated Molecules: Culprits for Atmospheric Ozone Depletion and Global Climate Change. International Journal of Modern Physics B, 2013; 1350073 DOI: 10.1142/S0217979213500732

Subcommittee Reviews Legislation to Improve Weather Forecasting (Subcommittee on Environment, House of Representatives, USA)

MAY 23, 2013

Washington, D.C. – The Subcommittee on Environment today held a hearing to examine ways to improve weather forecasting at the National Oceanic and Atmospheric Administration (NOAA). Witnesses provided testimony on draft legislation that would prioritize weather-related research at NOAA, in accordance with its critical mission to protect lives and property through enhanced weather forecasting. The hearing was timely given the recent severe tornadoes in the Midwest and superstorms like Hurricane Sandy.

Environment Subcommittee Chairman Chris Stewart (R-Utah): “We need a world-class system of weather prediction in the United States – one, as the National Academy of Sciences recently put it, that is ‘second to none.’ We can thank the hard-working men and women at NOAA and their partners throughout the weather enterprise for the great strides that have been made in forecasting in recent decades. But we can do better. And it’s not enough to blame failures on programming or sequestration or lack of other resources. As the events in Moore, Oklahoma have demonstrated, we have to do better. But the good news is that we can.”

Experts within the weather community have raised concern that the U.S. models for weather prediction have fallen behind Europe and other parts of the world in predicting weather events. The Weather Forecasting Improvement Act, draft legislation discussed at today’s hearing, would build upon the down payment made by Congress following Hurricane Sandy and restore the U.S. as a leader in this field through expanded computing capacity and data assimilation techniques.

Rep. Stewart: “The people of Moore, Oklahoma received a tornado warning 16 minutes before the twister struck their town. Tornado forecasting is difficult but lead times for storms have become gradually better. The draft legislation would prioritize investments in technology being developed at NOAA’s National Severe Storms Laboratory in Oklahoma, which ‘has the potential to provide revolutionary improvements in… tornado… warning lead times and accuracy, reducing false alarms’ and could move us toward the goal of being able to ‘warn on forecast.’”

The following witnesses testified today:

Mr. Barry Myers, Chief Executive Officer, AccuWeather, Inc.

Mr. Jon Kirchner, President, GeoOptics, Inc.

Geoengineering: Can We Save the Planet by Messing with Nature? (Democracy Now!)

Video: http://www.democracynow.org/2013/5/20/geoengineering_can_we_save_the_planet

Clive Hamilton, professor of public ethics at Charles Sturt University in Canberra, Australia. He is the author of the new book, Earthmasters: The Dawn of the Age of Climate Engineering.

Overheated rhetoric on climate change doesn’t make for good policies (Washington Post)

By Lamar Smith, Published: May 19, 2013

Lamar Smith, a Republican, represents Texas’s 21st District in the U.S. House and is chairman of the House Committee on Science, Space and Technology.

Climate change is an issue that needs to be discussed thoughtfully and objectively. Unfortunately, claims that distort the facts hinder the legitimate evaluation of policy options. The rhetoric has driven some policymakers toward costly regulations and policies that will harm hardworking American families and do little to decrease global carbon emissions. The Obama administration’s decision to delay, and possibly deny, the Keystone XL pipeline is a prime example.

The State Department has found that the pipeline will have minimal impact on the surrounding environment and no significant effect on the climate. Recent expert testimony before the House Committee on Science, Space and Technology confirms this finding. In fact, even if the pipeline is approved and is used at maximum capacity, the resulting increase in carbon dioxide emissions would be a mere 12 one-thousandths of 1 percent (0.012 percent). There is scant scientific or environmental justification for refusing to approve the pipeline, a project that the State Department has also found would generate more than 40,000 U.S. jobs.

Contrary to the claims of those who want to strictly regulate carbon dioxide emissions and increase the cost of energy for all Americans, there is a great amount of uncertainty associated with climate science. These uncertainties undermine our ability to accurately determine how carbon dioxide has affected the climate in the past. They also limit our understanding of how anthropogenic emissions will affect future warming trends. Further confusing the policy debate, the models that scientists have come to rely on to make climate predictions have greatly overestimated warming. Contrary to model predictions, data released in October from the University of East Anglia’s Climate Research Unit show that global temperatures have held steady over the past 15 years, despite rising greenhouse gas emissions.

Among the facts that are clear, however, are that U.S. emissions contribute very little to global concentrations of greenhouse gas, and that even substantial cuts in these emissions are likely to have no effect on temperature. Data from the Energy Information Administration show, for example, that the United States cut carbon dioxide emissions by 12 percent between 2005 and 2012 while global emissions increased by 15 percent over the same period.

Using data from the Intergovernmental Panel on Climate Change (IPCC), a Science and Public Policy Institute paper published last month found that if the United States eliminated all carbon dioxide emissions, the overall impact on global temperature rise would be only 0.08 degrees Celsius by 2050.

Further confounding the debate are unscientific and often hyperbolic claims about the potential effects of a warmer world. In his most recent State of the Union address, President Obama said that extreme weather events have become “more frequent and intense,” and he linked Superstorm Sandy to climate change.

But experts at the National Oceanic and Atmospheric Administration have told the New York Times that climate change had nothing to do with Superstorm Sandy. This is underscored by last year’s IPCC report stating that there is “high agreement” among leading experts that trends in weather disasters, floods, tornadoes and storms cannot be attributed to climate change. While these claims may make for good political theater, their effect on recent public policy choices hurts the economy.

Last spring the Environmental Protection Agency proposed emissions standards that virtually prohibit new coal-fired power plants. As we await implementation of these strict new rules, additional regulations that will affect existing power plants, refineries and other manufacturers are sure to follow. Analyses of these measures by the American Council for Capital Formation, which studies economic and environmental policy, show that they will raise both electricity rates and gas prices — costing jobs and hurting the economy — even as the EPA admits that these choices will have an insignificant impact on global climate change (a point former EPA administrator Lisa Jackson confessed during a Senate hearing in 2009).

Instead of pursuing heavy-handed regulations that imperil U.S. jobs and send jobs (and their emissions) overseas, we should take a step back from the unfounded claims of impending catastrophe and think critically about the challenge before us. Designing an appropriate public policy response to this challenge will require that we fully assess the facts and the uncertainties surrounding this issue, and that we set aside the hyped rhetoric.


“eScience revolutionizes the way science is done” (Fapesp)

New computing tools make it possible to do science better, faster and with greater impact, says Tony Hey, vice president of Microsoft Research (photo: E.Cesar/FAPESP)

May 16, 2013

By Elton Alisson

Agência FAPESP – A web-based software package for visualizing astronomical data allows scientists around the world to access thousands of images of celestial objects collected by large space telescopes, observatories and international astronomy research institutions.

Using these data, users can perform time-series analyses and combine observations made at different wavelengths of the energy radiated by celestial bodies, such as X-rays, infrared, ultraviolet and gamma radiation and radio waves, to elucidate the physical processes occurring inside these objects and share their conclusions.

Called the World Wide Telescope, the software, whose development began in 2002 at Microsoft Research in partnership with researchers at Johns Hopkins University in the United States, is an example of how new information and communication technologies (ICTs) have changed how scientific data are generated, managed and shared, as well as the very way science is done today, says Tony Hey, vice president of Microsoft Research.

“Space telescopes, like genetic sequencing machines and particle accelerators, are generating a volume of data never seen before. To deal with this phenomenon and enable scientists to manipulate and share these data, we need a range of computer science technologies and tools that make it possible to do science better, faster and with greater impact. That is what we call eScience,” said Hey during the Latin American eScience Workshop 2013, held on May 14-15 at Espaço Apas in São Paulo.

Organized by FAPESP and Microsoft Research, the event brought together researchers and students from Europe, South and North America, Asia and Oceania to discuss advances in many fields of knowledge made possible by improvements in the capacity to analyze the large volumes of information produced by research projects.

The opening ceremony was chaired by Celso Lafer, president of FAPESP, and attended by Michel Levy, president of Microsoft Brasil, and by José Tadeu de Faria, superintendent of the Ministry of Agriculture, Livestock and Supply in the State of São Paulo, representing the minister.

Also known as data-driven science, eScience integrates computing research with studies in a wide variety of fields through the development of dedicated software for visualizing and analyzing information.

This integration enables the interpretation of data, the formulation of theories, tests by simulation and the generation of new research hypotheses based on correlations that are difficult to observe without the support of information technology.

“Some technologies used in computer science will help solve scientific problems. In turn, using these tools to solve scientific problems will also drive the development of computer science itself,” said Hey, formerly a professor at the University of Southampton in the United Kingdom.

According to Hey, the analysis, visualization, mining, preservation and sharing of large volumes of data represent major challenges today, not only in science but also in the private sector.

For this reason, in his view, scientists must be trained to deal with big data (the set of technological solutions capable of handling the continuous accumulation of loosely structured data, captured from multiple sources and measured in petabytes, or quadrillions of bytes), both to carry out scientific projects and, eventually, to work in industry. “Being a data scientist [a scientist able to handle large volumes of data] will be an indispensable requirement for scientists,” said Hey.

Data-intensive science is not new, but the spatial and temporal scales of current studies on topics such as global climate change keep growing, demanding new tools. New information technologies also make it possible to analyze data generated in real time, as in habitat monitoring.

According to Hey, computers began to be used in the 1950s to explore, through simulation, areas of science that had until then been inaccessible. “At first, however, scientists did not know what computer science was, and computing professionals did not understand the complexity of scientific problems,” he said.

“It took long-term joint work for the two sides to understand what each could contribute in their respective fields and to begin developing the new algorithms, hardware, software and programming languages that would make experiments possible in many areas,” he recounted.

Opportunities in bold topics

During the FAPESP and Microsoft Research event, researchers from several countries presented eScience projects in fields such as renewable energy, global climate change, social, economic and political transformations in contemporary metropolises, the characterization, conservation, restoration and sustainable use of biodiversity, medicine and public health.

One of these projects, coordinated by Professor Glaucia Mendes Souza, coordinator of the FAPESP Bioenergy Research Program (BIOEN), aims to develop an algorithm for sequencing the sugarcane genome and thereby enable the development of varieties of the plant with higher sucrose content and greater resistance to pests and to climate change.

“The collaboration between FAPESP and Microsoft has opened up countless opportunities for the scientific community of the State of São Paulo to conduct research on bold topics related to the use of information technologies in fields such as energy and the environment,” said Carlos Henrique de Brito Cruz, scientific director of FAPESP, at the workshop’s opening session.

“We have high expectations for eScience. If we learn to use it properly, it can bring major advances not only in research but also in the very way science is done,” said Brito Cruz.

He said that FAPESP plans to soon launch a program to support research in eScience.

“We are firmly convinced that an important role for FAPESP is to be at the forefront of innovation and knowledge, and we consider support for eScience research very important. Its application in fields such as the environment is unequivocal, but it also has great potential in the humanities, for example,” said Celso Lafer, president of FAPESP.

Levy highlighted Microsoft’s partnership with FAPESP and the company’s research and development investments in Brazil. “Microsoft has increased its investment in research and development in Brazil in recent years, and one of the most important examples of this is our successful partnership with FAPESP,” he said.

Climate research nearly unanimous on human causes, survey finds (The Guardian)

Of more than 4,000 academic papers published over 20 years, 97.1% agreed that climate change is anthropogenic

, US environment correspondent

guardian.co.uk, Thursday 16 May 2013 00.01 BST

An iceberg melts in Greenland in 2007. ‘Our findings prove that there is a strong scientific agreement about the cause of climate change, despite public perceptions to the contrary.’ Photograph: John McConnico/AP

A survey of thousands of peer-reviewed papers in scientific journals has found 97.1% agreed that climate change is caused by human activity.

Authors of the survey, published on Thursday in the journal Environmental Research Letters, said the finding of near unanimity provided a powerful rebuttal to climate contrarians who insist the science of climate change remains unsettled.

The survey considered the work of some 29,000 scientists published in 11,994 academic papers. Of the 4,000-plus papers that took a position on the causes of climate change, only 0.7%, or 83, disputed the scientific consensus that climate change is the result of human activity; the view of the remaining 2.2% was unclear.

The study described the dissent as a “vanishingly small proportion” of published research.

“Our findings prove that there is a strong scientific agreement about the cause of climate change, despite public perceptions to the contrary,” said John Cook of the University of Queensland, who led the survey.

Public opinion continues to lag behind the science. Though a majority of Americans accept the climate is changing, just 42% believed human activity was the main driver, in a poll conducted by the Pew Research Centre last October.

“There is a gaping chasm between the actual consensus and the public perception,” Cook said in a statement.


The study blamed strenuous lobbying efforts by industry to undermine the science behind climate change for the gap in perception. The resulting confusion has blocked efforts to act on climate change.

The survey was the most ambitious effort to date to demonstrate the broad agreement on the causes of climate change, covering 20 years of academic publications from 1991-2011.

In 2004, Naomi Oreskes, an historian at the University of California, San Diego, surveyed published literature, releasing her results in the journal Science. She too found that 97% of climate scientists agreed on the causes of climate change.

She wrote of the new survey in an email: “It is a nice, independent confirmation, using a somewhat different methodology than I used, that comes to the same result. It also refutes the claim, sometimes made by contrarians, that the consensus has broken down, much less ‘shattered’.”

The Cook survey was broader in its scope, deploying volunteers from the SkepticalScience.com website to review scientific abstracts. The volunteers also asked authors to rate their own views on the causes of climate change, in another departure from Oreskes’s methods.

The authors said the findings could help close the gap between scientific opinion and the public on the causes of climate change, or anthropogenic global warming, and so create favourable conditions for political action on climate.

“The public perception of a scientific consensus on AGW [anthropogenic, ie man-made, global warming] is a necessary element in public support for climate policy,” the study said.

However, Prof Robert Brulle, a sociologist at Drexel University who studies the forces underlying attitudes towards climate change, disputed the idea that educating the public about the broad scientific agreement on the causes of climate change would have an effect on public opinion – or on the political conditions for climate action.

He said he was doubtful that convincing the public of a scientific consensus on climate change would help advance the prospects for political action. Having elite leaders call for climate action would be far more powerful, he said.

“I don’t think people really want to come around to grips with the fact that climate change is a highly ideological issue and it is not amenable to the information deficit model,” he said.

“The information deficit model, this idea that if you just pile on more information people will get convinced, is just completely inadequate,” he said. “It strengthens the people who actually read and pay attention but it is certainly not going to change or shift the opinions of others.”

Jon Krosnick, professor in humanities and social sciences at Stanford University and an expert on public opinion on climate change, said: “I assume that sceptics would say that there is bias in the editorial process so that the papers ultimately published are not an accurate reflection of the opinions of scientists.”

“It’s happening now… The village is sinking” (The Guardian)

Residents of Newtok, Alaska, know they must evacuate, but who will pay the $130m cost of moving them?

Children jump over ground affected by erosion in Newtok. Natural erosion has accelerated due to climate change, with large areas of land lost to the Ninglick River each year. Photograph: Brian Adams

Suzanne Goldenberg in Newtok, Alaska, with video by Richard Sprenger

One afternoon in the waning days of winter, the most powerful man in Newtok, Alaska, hopped on a plane and flew 1,000 miles to plead for the survival of his village. Stanley Tom, Newtok’s administrator, had a clear purpose for his trip: find the money to move the village on the shores of the Bering Sea out of the way of an approaching disaster caused by climate change.

Village administrator Stanley Tom stands at Mertarvik, the site of relocated Newtok. Photograph: Brian Adams Photography

Newtok was rapidly losing ground to erosion. The land beneath the village was falling into the river. Tom needed money for bulldozers to begin preparing a new site for the village on higher ground. He needed funds for an airstrip. He came back from his meetings in Juneau, the Alaskan state capital, with expressions of sympathy – but nothing in the way of the cash he desperately needed. “It’s really complicated,” he said. “There are a lot of obstacles.”

Those obstacles – financial, legal and a supremely frustrating bureaucratic process – had slowed down the move for so long that some in Newtok, which is about 400 miles south of the Bering Strait that separates the US from Russia, feared they would be stuck as the village went down around them, houses swallowed up by the river.

“It’s really alarming,” said Tom, slumped in an armchair a few hours after his return to the village. “I have a hard time sleeping, and I’m getting up early in the morning. I am worried about it every day.”

The uncertainty was tearing the village apart. It also began to turn the village against Tom.

Over the winter, a large group of villagers decided that their administrator was not up to the job. By the time he returned from this particular trip, the dissidents had voted to replace the village council and to sack Tom – a vote that he ignored.

“The way I see it, we need someone who knows how to do the work,” said Katherine Charles, one of Tom’s most vocal critics. “I feel like we are being neglected. We are still standing here and we don’t know when we are going to move. For years now we have been frustrated. I have to ask myself: why are we even still here?”

It’s been more than a decade since Tom took charge of running Newtok, and leading the village out of climate disaster to higher ground.

The ground beneath Newtok is disappearing. Natural erosion has accelerated due to climate change, with large areas of land lost to the Ninglick river each year. A study by the Army Corps of Engineers found the highest point in the village would be below water level by 2017. The proximity of the threat means that Newtok’s villagers are likely to be America’s first climate refugees.

Officials in Anchorage say Tom has worked tirelessly to move the village out of the way of a rampaging river. Among the relatively small circle of bureaucrats and lawyers who concern themselves with the problems of small and remote indigenous Alaskan villages, the Newtok administrator has a stellar reputation. He has won leadership awards from Native American groups in the rest of the country.

Tom said he hoped to make a big push this summer, acquiring heavy equipment that locals could use to begin moving some of the existing houses over to the new village site at Mertarvik nine miles to the south.

“It’s really happening right now. The village is sinking and flooding and eroding,” he said. He said he was planning to move his own belongings to the new village site this summer – and that villagers should start doing the same.

But Tom, despite his lobbying missions to Juneau and strong reputation with government officials, has failed to inject federal and state officials with that same sense of urgency.

Melting permafrost, sea-level rise, erosion – these are some of the worst consequences of climate change for Alaska. But none of those elements in Newtok’s slow destruction are recognised as disasters under existing legislation.

That means there is no designated pot of money set aside for those affected communities – unlike cities or towns destroyed by floods or tornadoes.

Robin Bronen, a human rights lawyer in Anchorage. ‘This is completely a human rights issue’ Photograph: Richard Sprenger

“We weren’t thinking of climate change when federal disaster relief legislation was passed,” said Robin Bronen, a human rights lawyer in Anchorage who has made a dozen visits to Newtok. “Our legal system is not set up. The institutions that we have created to respond to disasters are not up to the task of responding to climate change.”

In Bronen’s view, Congress needed to rewrite existing disaster legislation to take account of climate change. Communities needed to be able to access those disaster funds — if not to rebuild in place, which is not feasible in Newtok’s case, then to move.

The authorities also had responsibility under the treaty agreements with indigenous Alaskan tribes to guarantee the safety and wellbeing of indigenous communities, she argued.

“This is completely a human rights issue,” Bronen said. “When you are talking about a people who have done the least to contribute to our climate crisis facing such dramatic consequences as a result of climate change, we have a moral and legal responsibility to respond and provide the funding needed so that these communities are not in danger.”

Until then, however, it was up to Tom to find new ways to prise funds out of an unresponsive bureaucracy. It turned out that he had a knack for it.

Government officials praised Tom for finding other sources of funds, such as development grants, and putting them to use for building the new village site. But it has been a laborious process for the remote village to find its way through the different funding agencies and a maze of competing regulations.

As Tom found out, each agency had its own set of rules. The state government would not build a school for fewer than 10 children. The federal government would not build an airstrip at a village without a post office. But the rules, from Newtok’s vantage point, appeared to have at least one point in common. They seemed to conspire against the village ever getting its move off the ground.

In 2011, Alaska’s government published a timetable for Newtok’s move, setting out dates for building an emergency centre, housing, an airstrip – all items on Tom’s list. Two years later, the plan is already behind schedule and the official who oversaw that original timetable said there was little chance of getting back on track.

“Newtok is something that is probably going to play out over several decades unless it reaches a dire point where something has to be done immediately to keep the people safe,” said Larry Hartig, who heads Alaska’s Department of Environmental Conservation.

Officially, the government of Alaska remains committed to helping Newtok and all the other indigenous Alaskan villages that are threatened by climate change.

Almost all of Alaska’s indigenous villages – more than 180 – are experiencing the effects of climate change, including severe flooding and erosion. Some may be able to hold back rivers and sea, but others will have to move. About half a dozen villages, including Newtok, face extreme risks.

A mosaic of sea ice shifts across the Bering Sea, west of Alaska on 5 February, 2008. On either side of the Bering Strait (top centre) the land is blanketed with snow. Anchorage is located in the middle right-hand side of the image, at the top of the Cook Inlet. The village of Newtok is located north of Nunivak Island (middle), close to the coast on the lowland plain of the Yukon-Kuskokwim Delta, one of the largest river deltas in the world, which mostly consists of tundra. Photograph: NASA/Aqua/MODIS

“I am not going to tell any community that they are not going to survive. If the residents want to survive, we will help them,” said Mead Treadwell, the state’s lieutenant governor.

But the cost of relocating just one village — Newtok — could run as high as $130m, according to an estimate by the Army Corps of Engineers. That’s more than $350,000 per villager. Multiply that by half a dozen, or several more times, and the cost of protecting indigenous Alaskan villages from climate change soon soars into the billions.

So far, Newtok has received a total of about $12m in state funds over the past four years, according to George Owletuck, a consultant hired by Tom to help with the move. Much of that has already gone, to build a barge landing, a few new homes, and an emergency evacuation centre – in case the village does not manage to move in time.

Officially, federal and state government agencies have spent some $27m getting Mertarvik ready, although a considerable share of that figure, some $6m, did not go directly to the relocation, said Sally Russell Cox, the state official overseeing the move. And there is still no major infrastructure completed at Mertarvik.

Would the government of Alaska commit to picking up the rest of the tab for Newtok and the other villages?

Alaska’s oil revenues have fallen off over the years. In 2012, the state slipped into second place for oil production behind North Dakota. Treadwell admitted the state government would not cover the entire cost of fortifying or moving all of the villages threatened by climate change.

“On the question of is there money to help them with one cheque? That is something there clearly is not,” he said.

Treadwell suggested some of the at-risk villages could raise funds by setting themselves up as hubs for oil companies hoping to drill in Arctic waters.

However, a number of oil companies have put their Arctic drilling plans on hold for 2013 and 2014. Treadwell admitted there was as yet no comprehensive climate change plan for Newtok and other villages. “I think it’s going to be piece by piece with each community and many different pots of money,” he said.

In the case of Newtok, Owletuck, the consultant, had big ideas for financing the move: growing fruit and vegetables hydroponically in green houses, or testing the possibilities of producing biofuels from algae.

He let it be known the village may even have found a mysterious benefactor. Owletuck said he’d had an approach from private individuals, whom he declined to name, wanting to donate $22m to the move.

None of those propositions have materialised, however. And after more than a decade of uncertainty about the future under climate change, the basic infrastructure of Newtok is coming apart.

The impact on Newtok

Erosion and the rising sea, driven by climate change, have dramatically altered Newtok village; by 2027 the water is expected to cover nearly a third of it. Historic shorelines digitized from USGS topographic maps and aerial photos. Source: Army Corps of Engineers

Snow covers up a lot of Newtok’s flaws: the open sewage pits, the broken boardwalk over mudflats, some of the abandoned snowmobile wrecks.

Newtok has for years been considered a “distressed village”, with average income of $16,000, well below the rest of the state. Fewer than half of adults in the village have paid work. But even within those dismal measures, conditions have sharply deteriorated in the years since the village has been planning to move.

Aside from the clinic and the school, most buildings are in a state of advanced dilapidation. The floor in the community hall sags like an old mattress. The community laundry is out of order.

In the cramped offices of the traditional council, where Tom works, the furniture dates from the 1970s or 1980s, mid-brown vinyl chairs where the casing has split open, revealing the dirty foam inside. It’s not unheard of to find families of 10 or 12 children living in houses of less than 800 sq ft – and none of those homes have flush toilets or running water.

Early mornings find the men of the household trudging out of their homes with 5-gallon buckets of waste, which get dumped at various spots on the edges of the village, including a small stream.

The diesel-powered generator was nearing the end of its life span. The water treatment plant was shut down last October after people began getting sick. Tom said there was contamination from leaking jet fuel at the airport.

For now, villagers are drawing water from the school, which had a separate system. But the school principal said he would have to cut that off in May to preserve the system for the schoolchildren.

Tom said there was nothing he could do. Government agencies would not fund improvements at the current village site, because of the plan to move. “There is no money to improve our community,” he said. “We are suspended from federal and state agencies and there is no way of improving our lives over here. The agencies do not want to work on both villages at once.”

By last October, frustration with the stalled move and conditions in the village exploded. Villagers accused their own council of failing to hold regular elections, and raised a petition to throw out the leaders and replace Tom.

Some accused him of presiding over a dictatorship in the village. Others speculated that he and the paid consultant, Owletuck, were plotting to rob the relocation funds.

One of the dissidents, a relative newcomer to the village, posted ferocious criticism of Tom on Facebook calling for rebellion.

The dissidents organised elections, voted out the old council and installed their own leaders. Tom ignored the result. “Let them cry all they want,” he said. “I don’t care. They are not going to help my community. I am way ahead of these guys.”

The upheavals in Newtok are sadly familiar to those who have worked with indigenous Alaskan villages confronting climate change. “I don’t think you would find one community that says they are happy with the pace that’s gone on,” said Patricia Cochran, director of the Alaska Native Science Commission.

“To be honest with you, I think the state and the feds have done a terrible job, not only in assessing the conditions that communities are living within but in responding to them,” she said. “Because these communities are listed as threatened and may potentially be relocated, they are not able to get any funds now for infrastructure that is being damaged right now.”

That leaves communities stuck in a limbo that can carry on for years or even decades.

That’s what has become of Newtok. The effects are devastating, said Charles. Beyond all her anger, she admitted, was an all-enveloping fear. “Sometimes I get scared. I’m scared for my own family. How will I take care of them if the relocation doesn’t start right away?”

She had been waiting for years to see the beginnings of any new settlement in rural Alaska rising up on the rocky hill of Mertarvik: the airport, the barge landing, the school, the houses. None of it was there yet, and Charles said she was coming close to despair.

“It’s been going on for I don’t know how long, and I am beginning to lose hope.”

For Insurers, No Doubts on Climate Change (N.Y.Times)

Master Sgt. Mark Olsen/U.S. Air Force, via Associated Press. Damage in Mantoloking, N.J., after Hurricane Sandy. Natural disasters caused $35 billion in private property losses last year.

By EDUARDO PORTER

Published: May 14, 2013

If there were one American industry that would be particularly worried about climate change it would have to be insurance, right?

From Hurricane Sandy’s devastating blow to the Northeast to the protracted drought that hit the Midwest Corn Belt, natural catastrophes across the United States pounded insurers last year, generating $35 billion in privately insured property losses, $11 billion more than the average over the last decade.

And the industry expects the situation will get worse. “Numerous studies assume a rise in summer drought periods in North America in the future and an increasing probability of severe cyclones relatively far north along the U.S. East Coast in the long term,” said Peter Höppe, who heads Geo Risks Research at the reinsurance giant Munich Re. “The rise in sea level caused by climate change will further increase the risk of storm surge.” Most insurers, including the reinsurance companies that bear much of the ultimate risk in the industry, have little time for the arguments heard in some right-wing circles that climate change isn’t happening, and are quite comfortable with the scientific consensus that burning fossil fuels is the main culprit of global warming.

“Insurance is heavily dependent on scientific thought,” Frank Nutter, president of the Reinsurance Association of America, told me last week. “It is not as amenable to politicized scientific thought.”

Yet when I asked Mr. Nutter what the American insurance industry was doing to combat global warming, his answer was surprising: nothing much. “The industry has really not been engaged in advocacy related to carbon taxes or proposals addressing carbon,” he said. While some big European reinsurers like Munich Re and Swiss Re support efforts to reduce CO2 emissions, “in the United States the household names really have not engaged at all.” Instead, the focus of insurers’ advocacy efforts is zoning rules and disaster mitigation.

Last week, scientists announced that the concentration of heat-trapping carbon dioxide in the atmosphere had reached 400 parts per million — its highest level in at least three million years, before humans appeared on the scene. Back then, mastodons roamed the earth, the polar ice caps were smaller and the sea level was as much as 60 to 80 feet higher.

The milestone puts the earth nearer a point of no return, many scientists think, when vast, disruptive climate change is baked into our future. Pieter P. Tans, who runs the monitoring program at the National Oceanic and Atmospheric Administration, told my colleague Justin Gillis: “It symbolizes that so far we have failed miserably in tackling this problem.” And it raises a perplexing question: why hasn’t corporate America done more to sway its allies in the Republican Party to try to avert a disaster that would clearly be devastating to its own interests?

Mr. Nutter argues that the insurance industry’s reluctance is born of hesitation to become embroiled in controversies over energy policy. But perhaps its executives simply don’t feel so vulnerable. Like farmers, who are largely protected from the ravages of climate change by government-financed crop insurance, insurers also have less to fear than it might at first appear.

The federal government covers flood insurance, among the riskiest kind in this time of crazy weather. And insurers can raise premiums or even drop coverage to adjust to higher risks. Indeed, despite Sandy and drought, property and casualty insurance in the United States was more profitable in 2012 than in 2011, according to the Property Casualty Insurers Association of America.

But the industry’s analysis of the risks it faces is evolving. One sign of that is how some top American insurers responded to a billboard taken out by the conservative Heartland Institute, a prominent climate change denier that has received support from the insurance industry.

The billboard had a picture of Theodore Kaczynski, the Unabomber, who asked: “I still believe in global warming. Do you?”

Concerned about global warming and angry to be equated with a murderous psychopath, insurance companies like Allied World, Renaissance Re, State Farm and XL Group dropped their support for Heartland.

Even more telling, Eli Lehrer, a Heartland vice president who at the time led an insurance-financed project, left the group and helped start the R Street Institute, a standard conservative organization in all respects but one: it believes in climate change and supports a carbon tax to combat it. And it is financed largely with insurance industry money.

Mr. Lehrer points out that a carbon tax fits conservative orthodoxy. It is a broad and flat tax, whose revenue can be used to do away with the corporate income tax — a favorite target of the right. It provides a market-friendly signal, forcing polluters to bear the cost imposed on the rest of us and encouraging them to pollute less. And it is much preferable to a parade of new regulations from the Environmental Protection Agency.

“We are having a debate on the right about a carbon tax for the first time in a long time,” Mr. Lehrer said.

Bob Inglis, formerly a Republican congressman from South Carolina who lost his seat in the 2010 primary to a Tea Party-supported challenger, is another member of this budding coalition. Before he left Congress, he proposed a revenue-neutral bill to create a carbon tax and cut payroll taxes.

Changing the political economy of a carbon tax remains an uphill slog especially in a stagnant economy. But Mr. Inglis notices a thaw. “The best way to do this is in the context of a grand bargain on tax reform,” he said. “It could happen in 2015 or 2016, but probably not before.”

He lists a dozen Republicans in the House and eight in the Senate who would be open to legislation to help avert climate change. He notes that Exelon, the gas and electricity giant, is sympathetic to his efforts — perhaps not least because a carbon tax would give an edge to gas over its dirtier rival, coal. Exxon, too, has also said a carbon tax would be the most effective way to reduce emissions. So why hasn’t the insurance industry come on board?

Robert Muir-Wood is the chief research officer of Risk Management Solutions, one of two main companies the insurance industry relies on to crunch data and model future risks. He argues that insurers haven’t changed their tune because — with the exception of 2004 and 2005, when a string of hurricanes from Ivan to Katrina caused damage worth more than $200 billion — they haven’t yet experienced hefty, sustained losses attributable to climate change.

“Insurers were ready to sign up to all sorts of actions against climate change,” Mr. Muir-Wood told me from his office in London. Then the weather calmed down.

Still, Mr. Muir-Wood notes that the insurance industry faces a different sort of risk: political action. “That is the biggest threat,” he said. When insurers canceled policies and raised premiums in Florida in 2006, politicians jumped on them. “Insurers in Florida,” he said, “became Public Enemy No. 1.”

And that’s the best hope for those concerned about climate change: that global warming isn’t just devastating for society, but also bad for business.

Climate slowdown means extreme rates of warming ‘not as likely’ (BBC)

19 May 2013 Last updated at 17:31 GMT

By Matt McGrath – Environment correspondent, BBC News


The impacts of rising temperature are being felt particularly keenly in the polar regions

Scientists say the recent downturn in the rate of global warming will lead to lower temperature rises in the short-term.

Since 1998, there has been an unexplained “standstill” in the heating of the Earth’s atmosphere.

Writing in Nature Geoscience, the researchers say this will reduce predicted warming in the coming decades.

But long-term, the expected temperature rises will not alter significantly.

“The most extreme projections are looking less likely than before” – Dr Alexander Otto, University of Oxford

The slowdown in the expected rate of global warming has been studied for several years now. Earlier this year, the UK Met Office lowered its five-year temperature forecast.

But this new paper gives the clearest picture yet of how any slowdown is likely to affect temperatures in both the short-term and long-term.

An international team of researchers looked at how the last decade would impact long-term, equilibrium climate sensitivity and the shorter term climate response.

Transient nature

Climate sensitivity looks to see what would happen if we doubled concentrations of CO2 in the atmosphere and let the Earth’s oceans and ice sheets respond to it over several thousand years.

Transient climate response is a much shorter-term calculation, again based on a doubling of CO2.

The Intergovernmental Panel on Climate Change reported in 2007 that the short-term temperature rise would most likely be 1-3C (1.8-5.4F).

But this new analysis, using only the temperatures from the last decade, puts the projected range at 0.9-2.0C.

The report suggests that warming in the near term will be less than forecast

“The hottest of the models in the medium-term, they are actually looking less likely or inconsistent with the data from the last decade alone,” said Dr Alexander Otto from the University of Oxford.

“The most extreme projections are looking less likely than before.”

The authors calculate that over the coming decades global average temperatures will warm about 20% more slowly than expected.

But when it comes to the longer term picture, the authors say their work is consistent with previous estimates. The IPCC said that climate sensitivity was in the range of 2.0-4.5C.

Ocean storage

This latest research, including the decade of stalled temperature rises, produces a range of 0.9-5.0C.

“It is a bigger range of uncertainty,” said Dr Otto.

“But it still includes the old range. We would all like climate sensitivity to be lower but it isn’t.”

The researchers say the difference between the lower short-term estimate and the more consistent long-term picture can be explained by the fact that the heat from the last decade has been absorbed into and is being stored by the world’s oceans.
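The energy-balance reasoning behind estimates like these can be sketched in a few lines: sensitivity is inferred from observed warming, total radiative forcing, and the heat still flowing into the oceans. The numbers below are illustrative assumptions chosen to land in the article's reported ranges, not the study's published inputs.

```python
# Back-of-envelope energy-balance estimate of climate sensitivity.
# dT = observed warming (K), dF = total forcing change (W/m^2),
# dQ = ocean heat uptake (W/m^2). All values are illustrative assumptions.

F_2X = 3.44  # assumed radiative forcing from a doubling of CO2, W/m^2

def transient_climate_response(dT, dF, f_2x=F_2X):
    """Shorter-term warming expected at the moment CO2 doubles (K)."""
    return f_2x * dT / dF

def equilibrium_climate_sensitivity(dT, dF, dQ, f_2x=F_2X):
    """Long-term warming after a doubling of CO2 (K). Subtracting dQ
    credits the heat currently being absorbed and stored by the oceans."""
    return f_2x * dT / (dF - dQ)

# Illustrative decade-average values (assumed):
dT, dF, dQ = 0.75, 1.95, 0.65
tcr = transient_climate_response(dT, dF)           # ~1.3 K
ecs = equilibrium_climate_sensitivity(dT, dF, dQ)  # ~2.0 K
print(f"TCR = {tcr:.2f} K, ECS = {ecs:.2f} K")
```

Note that the same observed warming yields a lower transient response than equilibrium sensitivity: the ocean heat-uptake term dQ is exactly the "storage" the authors invoke to reconcile the short-term and long-term pictures.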

Not everyone agrees with this perspective.

Prof Steven Sherwood, from the University of New South Wales, says the conclusion about the oceans needs to be taken with a grain of salt for now.

“There is other research out there pointing out that this storage may be part of a natural cycle that will eventually reverse, either due to El Nino or the so-called Atlantic Multidecadal Oscillation, and therefore may not imply what the authors are suggesting,” he said.

The authors say there are ongoing uncertainties surrounding the role of aerosols in the atmosphere and around the issue of clouds.

“We would expect a single decade to jump around a bit but the overall trend is independent of it, and people should be exactly as concerned as before about what climate change is doing,” said Dr Otto.

Is there any succour in these findings for climate sceptics who say the slowdown over the past 14 years means that global warming is not real?

“None. No comfort whatsoever,” he said.

World Bank turns to hydropower to square development with climate change (Washington Post)

Michael Reynolds/European Pressphoto Agency – World Bank President Jim Yong Kim attends the Fragility Forum this month in Washington. The forum discussed ways for fragile nations to improve their economies, their infrastructure and the well-being of their citizens.

By , Published: May 8, 2013

The World Bank is making a major push to develop large-scale hydropower projects around the globe, something it had all but abandoned a decade ago but now sees as crucial to resolving the tension between economic development and the drive to tame carbon use.

Major hydropower projects in Congo, Zambia, Nepal and elsewhere — all of a scale dubbed “transformational” to the regions involved — are a focus of the bank’s fundraising drive among wealthy nations. Bank lending for hydropower has scaled up steadily in recent years, and officials expect the trend to continue amid a worldwide boom in water-fueled electricity.

Such projects were shunned in the 1990s, in part because they can be disruptive to communities and ecosystems. But the World Bank is opening the taps for dams, transmission lines and related infrastructure as its president, Jim Yong Kim, tries to resolve a quandary at the bank’s core: how to eliminate poverty while adding as little as possible to carbon emissions.

“Large hydro is a very big part of the solution for Africa and South Asia and Southeast Asia. . . . I fundamentally believe we have to be involved,” said Rachel Kyte, the bank’s vice president for sustainable development and an influential voice among Kim’s top staff members. The earlier move out of hydro “was the wrong message. . . . That was then. This is now. We are back.”

It is a controversial stand. The bank backed out of large-scale hydropower because of the steep trade-offs involved. Big dams produce lots of cheap, clean electricity, but they often uproot villages in dam-flooded areas and destroy the livelihoods of the people the institution is supposed to help. A 2009 World Bank review of hydropower noted the “overwhelming environmental and social risks” that had to be addressed but also concluded that Africa and Asia’s vast and largely undeveloped hydropower potential was key to providing dependable electricity to the hundreds of millions of people who remain without it.

“What’s the one issue that’s holding back development in the poorest countries? It’s energy. There’s just no question,” Kim said in an interview.

Advocacy groups remain skeptical, arguing that large projects, such as Congo’s long-debated network of dams around Inga Falls, may be of more benefit to mining companies or industries in neighboring countries than poor communities.

“It is the old idea of a silver bullet that can modernize whole economies,” said Peter Bosshard, policy director of International Rivers, a group that has organized opposition to the bank’s evolving hydro policy and argued for smaller projects designed around communities rather than mega-dams meant to export power throughout a region.

“Turning back to hydro is being anything but a progressive climate bank,” said Justin Guay, a Sierra Club spokesman on climate and energy issues. “There needs to be a clear shift from large, centralized projects.”

The major nations that support the World Bank, however, have been pushing it to identify such projects — complex undertakings that might happen only if an international organization is involved in sorting out the financing, overseeing the performance and navigating the politics.

The move toward big hydro comes amid Kim’s stark warning that global warming will leave the next generation with an “unrecognizable planet.” That dire prediction, however, has left him struggling to determine how best to respond and frustrated by some of the bank’s inherent limitations.

In his speeches, Kim talks passionately about the bank’s ability to “catalyze” and “leverage” the world to action by mobilizing money and ideas, and he says he is hunting for ideas “equal to the challenge” of curbing carbon use. He has criticized the “small bore” thinking that he says has hobbled progress on the issue.

However, the bank remains in the business of financing traditional fossil-fuel plants, including those that use the dirtiest form of coal, as well as cleaner but carbon-based natural gas infrastructure.

Among the projects likely to cross Kim’s desk in coming months, for example, is a 600-megawatt power plant in Kosovo that would be fired by lignite coal, the bottom of the barrel when it comes to carbon emissions.

The plant has strong backing from the United States, the World Bank’s major shareholder. It also meshes with one of the bank’s other long-standing imperatives: Give countries what they ask for. The institution has 188 members to keep happy and can go only so far in trying to impose its judgment over that of local officials. Kim, who in his younger days demonstrated against World Bank-enforced “orthodoxy” in economic policy, now may be hard-pressed to enforce an energy orthodoxy of his own.

Kosovo’s domestic supplies of lignite are ample enough to free the country from imported fuel. Kim said there is little question that Kosovo needs more electricity, and the new plant will allow an older, more polluting facility to be shut down.

“I would just love to never sign a coal project,” Kim said. “We understand it is much, much dirtier, but . . . we have 188 members. . . . We have to be fair in balancing the needs of poor countries . . . with this other bigger goal of tackling climate change.”

The bank is working on other ideas. Kim said he is considering how it might get involved in creating a more effective world market for carbon, allowing countries that invest in renewable energy or “climate friendly” agriculture to be paid for their carbon savings by industries that need to use fossil fuels. Existing carbon markets have been plagued with volatile pricing — Europe’s cost of carbon has basically collapsed — or rules that prevent carbon trading with developing countries.

“We’ve got to figure out a way to establish a stable price of carbon,” Kim said. “Everybody knows that.”

He has also staked hope for climate progress on developments in agriculture.

Hydropower projects, however, seem notably inside what Kim says is the bank’s sweet spot — complex, high-impact, green and requiring the sort of joint public and private financing Kim says the bank can attract.

The massive hydropower potential of the Congo River, estimated at about 40,000 megawatts, is such a target. Its development is on a list of top world infrastructure priorities prepared by the World Bank and other development agencies for the Group of 20 major economic powers.

Two smaller dams on the river have been plagued by poor performance and are being rehabilitated with World Bank assistance. A third being planned would represent a quantum jump — a 4,800-megawatt, $12 billion giant that would move an entire region off carbon-based electricity.

The African Development Bank has begun negotiations over the financing, and the World Bank is ready to step in with tens of millions of dollars in technical-planning help.

“In an ideal world, we start building in 2016. By 2020, we switch on the lights,” said Hela Cheikhrouhou, energy and environment director for the African Development Bank.

It is the sort of project that the World Bank had stayed away from for many years — not least because of instability in the country. But as the country tries to move beyond its civil war and the region intensifies its quest for the power to fuel economic growth, the bank seems ready to move. Kim will visit Congo this month for a discussion about development in fragile and war-torn states.

Kyte, the World Bank vice president, said the Inga project will be high on the agenda.

“People have been looking at the Inga dam for as long as I have been in the development business,” she said. “The question is: Did the stars align? Did you have a government in place? Did people want to do it? Are there investors interested? Do you have the ability to do the technical work? The stars are aligned now. Let’s go.”

São Paulo City Council approves text-message alerts for rain (Folha de S.Paulo)

May 16, 2013, 5:01 p.m.

GIBA BERGAMIM JR., SÃO PAULO

Updated at 5:54 p.m.

The São Paulo City Council has approved a bill requiring the city government to send text messages to residents’ cell phones warning of approaching rain and imminent flooding.

Authored by council member Ricardo Young (PPS), the bill now needs to be signed into law by Mayor Fernando Haddad (PT) to take effect.

Today, the only way to get such information is by following the news on radio and TV, or through the website of the city’s CGE (Centro de Gerenciamento de Emergências), which monitors rainfall across the city.

The council member says he was inspired by similar initiatives in the United States and Europe that provide information about snowstorms, for example.

In 2011, the city launched a similar project, but only for residents of the Pantanal favela area in São Paulo’s east zone, which suffered flooding for nearly two full months during the summer.

Under Young’s bill, the city will have to sign agreements with mobile phone carriers. The information must reach residents at least two hours in advance.
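The bill describes a requirement, not an implementation, but the two-hour lead-time rule can be sketched minimally as follows. The function and variable names are assumptions for illustration only.

```python
# Hypothetical sketch of the bill's two-hour advance-warning rule:
# dispatch an alert only if residents would get at least two hours'
# notice before the forecast rain. Names here are illustrative.
from datetime import datetime, timedelta

MIN_LEAD_TIME = timedelta(hours=2)

def should_dispatch_alert(now: datetime, forecast_rain_at: datetime) -> bool:
    """Return True if the alert would precede the rain by >= 2 hours."""
    return forecast_rain_at - now >= MIN_LEAD_TIME

now = datetime(2013, 5, 16, 15, 0)
print(should_dispatch_alert(now, datetime(2013, 5, 16, 17, 30)))  # 2.5 h ahead
print(should_dispatch_alert(now, datetime(2013, 5, 16, 16, 0)))   # only 1 h
```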

According to the council member, the bill would allow businesses, government offices and schools to end the workday early so that residents can get home before the rain.

OTHER BILLS

The council also approved a bill by members Antonio Goulart (PSD) and Roberto Trípoli (PV) allowing pets to be buried in the same plot as their owners in municipal cemeteries. That bill will go to a second vote.

A year before the World Cup, the council members also granted the title of honorary citizen of São Paulo to FIFA president Joseph Blatter.

Climate Change Will Cause Widespread Global-Scale Loss of Common Plants and Animals, Researchers Predict (Science Daily)

May 12, 2013 — More than half of common plants and one third of the animals could see a dramatic decline this century due to climate change, according to research from the University of East Anglia.

Frog. Plants, reptiles and particularly amphibians are expected to be at highest risk. (Credit: © Anna Omelchenko / Fotolia)

Research published today in the journal Nature Climate Change looked at 50,000 globally widespread and common species and found that more than one half of the plants and one third of the animals will lose more than half of their climatic range by 2080 if nothing is done to reduce the amount of global warming and slow it down.

This means that geographic ranges of common plants and animals will shrink globally and biodiversity will decline almost everywhere.

Plants, reptiles and particularly amphibians are expected to be at highest risk. Sub-Saharan Africa, Central America, Amazonia and Australia would lose the most species of plants and animals. And a major loss of plant species is projected for North Africa, Central Asia and South-eastern Europe.

But acting quickly to mitigate climate change could reduce losses by 60 per cent and buy an additional 40 years for species to adapt. This is because this mitigation would slow and then stop global temperatures from rising by more than two degrees Celsius relative to pre-industrial times (1765). Without this mitigation, global temperatures could rise by 4 degrees Celsius by 2100.

The study was led by Dr Rachel Warren from UEA’s school of Environmental Sciences and the Tyndall Centre for Climate Change Research. Collaborators include Dr Jeremy VanDerWal at James Cook University in Australia and Dr Jeff Price, also at UEA’s school of Environmental Sciences and the Tyndall Centre. The research was funded by the Natural Environment Research Council (NERC).

Dr Warren said: “While there has been much research on the effect of climate change on rare and endangered species, little has been known about how an increase in global temperature will affect more common species.

“This broader issue of potential range loss in widespread species is a serious concern as even small declines in these species can significantly disrupt ecosystems.

“Our research predicts that climate change will greatly reduce the diversity of even very common species found in most parts of the world. This loss of global-scale biodiversity would significantly impoverish the biosphere and the ecosystem services it provides.

“We looked at the effect of rising global temperatures, but other symptoms of climate change such as extreme weather events, pests, and diseases mean that our estimates are probably conservative. Animals in particular may decline more as our predictions will be compounded by a loss of food from plants.

“There will also be a knock-on effect for humans because these species are important for things like water and air purification, flood control, nutrient cycling, and eco-tourism.

“The good news is that our research provides crucial new evidence of how swift action to reduce CO2 and other greenhouse gases can prevent the biodiversity loss by reducing the amount of global warming to 2 degrees Celsius rather than 4 degrees. This would also buy time — up to four decades — for plants and animals to adapt to the remaining 2 degrees of climate change.”

The research team quantified the benefits of acting now to mitigate climate change and found that up to 60 per cent of the projected climatic range loss for biodiversity can be avoided.

Dr Warren said: “Prompt and stringent action to reduce greenhouse gas emissions globally would reduce these biodiversity losses by 60 per cent if global emissions peak in 2016, or by 40 per cent if emissions peak in 2030, showing that early action is very beneficial. This will both reduce the amount of climate change and also slow climate change down, making it easier for species and humans to adapt.”

Information on the current distributions of the species used in this research came from the datasets shared online by hundreds of volunteers, scientists and natural history collections through the Global Biodiversity Information Facility (GBIF).

Co-author Dr Jeff Price, also from UEA’s school of Environmental Sciences, said: “Without free and open access to massive amounts of data such as those made available online through GBIF, no individual researcher is able to contact every country, every museum, every scientist holding the data and pull it all together. So this research would not be possible without GBIF and its global community of researchers and volunteers who make their data freely available.”

Journal Reference:

  1. R. Warren, J. VanDerWal, J. Price, J. A. Welbergen, I. Atkinson, et al. Quantifying the benefit of early climate change mitigation in avoiding biodiversity loss. Nature Climate Change, 2013. DOI: 10.1038/nclimate1887