
Will one researcher’s discovery deep in the Amazon destroy the foundation of modern linguistics? (The Chronicle of Higher Education)

The Chronicle Review

By Tom Bartlett

March 20, 2012

Angry Words


A Christian missionary sets out to convert a remote Amazonian tribe. He lives with them for years in primitive conditions, learns their extremely difficult language, risks his life battling malaria, giant anacondas, and sometimes the tribe itself. In a plot twist, instead of converting them he loses his faith, morphing from an evangelist trying to translate the Bible into an academic determined to understand the people he’s come to respect and love.

Along the way, the former missionary discovers that the language these people speak doesn’t follow one of the fundamental tenets of linguistics, a finding that would seem to turn the field on its head, undermine basic assumptions about how children learn to communicate, and dethrone the discipline’s long-reigning king, who also happens to be among the most well-known and influential intellectuals of the 20th century.

It feels like a movie, and it may in fact turn into one—there’s a script and producers on board. It’s already a documentary that will air in May on the Smithsonian Channel. A play is in the works in London. And the man who lived the story, Daniel Everett, has written two books about it. His 2008 memoir, Don’t Sleep, There Are Snakes, is filled with Joseph Conrad-esque drama. The new book, Language: The Cultural Tool, which is lighter on jungle anecdotes, instead takes square aim at Noam Chomsky, who has remained the pre-eminent figure in linguistics since the 1960s, thanks to the brilliance of his ideas and the force of his personality.

But before any Hollywood premiere, it’s worth asking whether Everett actually has it right. Answering that question is not straightforward, in part because it hinges on a bit of grammar that no one except linguists ever thinks about. It’s also made tricky by the fact that Everett is the foremost expert on this language, called Pirahã, and one of only a handful of outsiders who can speak it, making it tough for others to weigh in and leading his critics to wonder aloud if he has somehow rigged the results.

More than any of that, though, his claim is difficult to verify because linguistics is populated by a deeply factionalized group of scholars who can’t agree on what they’re arguing about and who tend to dismiss their opponents as morons or frauds or both. Such divisions exist, to varying degrees, in all disciplines, but linguists seem uncommonly hostile. The word “brutal” comes up again and again, as do “spiteful,” “ridiculous,” and “childish.”

With that in mind, why should anyone care about the answer? Because it might hold the key to understanding what separates us from the rest of the animals.

Imagine that a linguist from Mars lands on Earth to survey the planet’s languages (presumably after obtaining the necessary interplanetary funding). The alien would reasonably conclude that the world’s languages are mostly similar, with interesting but relatively minor variations.

As science-fiction premises go it’s rather dull, but it roughly illustrates Chomsky’s view of linguistics, known as Universal Grammar, which has dominated the field for a half-century. Chomsky is fond of this hypothetical and has used it repeatedly for decades, including in a 1971 discussion with Michel Foucault, during which he added that “this Martian would, if he were rational, conclude that the structure of the knowledge that is acquired in the case of language is basically internal to the human mind.”

In his new book, Everett, now dean of arts and sciences at Bentley University, writes about hearing Chomsky bring up the Martian in a lecture he gave in the early 1990s. Everett noticed a group of graduate students in the back row laughing and exchanging money. After the talk, Everett asked them what was so funny, and they told him they had taken bets on precisely when Chomsky would once again cite the opinion of the linguist from Mars.

The somewhat unkind implication is that the distinguished scholar had become so predictable that his audiences had to search for ways to amuse themselves. Another Chomsky nugget is the way he responds when asked to give a definition of Universal Grammar. He will sometimes say that Universal Grammar is whatever made it possible for his granddaughter to learn to talk but left the world’s supply of kittens and rocks speechless—a less-than-precise answer. Say “kittens and rocks” to a cluster of linguists and eyes are likely to roll.

Chomsky’s detractors have said that Universal Grammar is whatever he needs it to be at that moment. By keeping it mysterious, they contend, he is able to dodge criticism and avoid those who are gunning for him. It’s hard to murder a phantom.

Everett’s book is an attempt to deliver, if not a fatal blow, then at least a solid right cross to Universal Grammar. He believes that the structure of language doesn’t spring from the mind but is instead largely formed by culture, and he points to the Amazonian tribe he studied for 30 years as evidence. It’s not that Everett thinks our brains don’t play a role—they obviously do. But he argues that just because we are capable of language does not mean it is necessarily prewired. As he writes in his book: “The discovery that humans are better at building human houses than porpoises tells us nothing about whether the architecture of human houses is innate.”

The language Everett has focused on, Pirahã, is spoken by just a few hundred members of a hunter-gatherer tribe in a remote part of Brazil. Everett got to know the Pirahã in the late 1970s as an American missionary. With his wife and kids, he lived among them for months at a time, learning their language from scratch. He would point to objects and ask their names. He would transcribe words that sounded identical to his ears but had completely different meanings. His progress was maddeningly slow, and he had to deal with the many challenges of jungle living. His story of taking his family, by boat, to get treatment for severe malaria is an epic in itself.

His initial goal was to translate the Bible. He got his Ph.D. in linguistics along the way and, in 1984, spent a year studying at the Massachusetts Institute of Technology in an office near Chomsky’s. He was a true-blue Chomskyan then, so much so that his kids grew up thinking Chomsky was more saint than professor. “All they ever heard about was how great Chomsky was,” he says. He was a linguist with a dual focus: studying the Pirahã language and trying to save the Pirahã from hell. The second part, he found, was tough because the Pirahã are rooted in the present. They don’t discuss the future or the distant past. They don’t have a belief in gods or an afterlife. And they have a strong cultural resistance to the influence of outsiders, dubbing all non-Pirahã “crooked heads.” They responded to Everett’s evangelism with indifference or ridicule.

As he puts it now, the Pirahã weren’t lost, and therefore they had no interest in being saved. They are a happy people. Living in the present has been an excellent strategy, and their lack of faith in the divine has not hindered them. Everett came to convert them, but over many years found that his own belief in God had melted away.

So did his belief in Chomsky, albeit for different reasons. The Pirahã language is remarkable in many respects. Entire conversations can be whistled, making it easier to communicate in the jungle while hunting. Also, the Pirahã don’t use numbers. They have words for amounts, like a lot or a little, but nothing for five or one hundred. Most significantly, for Everett’s argument, he says their language lacks what linguists call “recursion”—that is, the Pirahã don’t embed phrases in other phrases. They instead speak only in short, simple sentences.

In a recursive language, additional phrases and clauses can be inserted in a sentence, complicating the meaning, in theory indefinitely. For most of us, the lack of recursion in a little-known Brazilian language may not seem terribly interesting. But when Everett published a paper with that finding in 2005, the news created a stir. There were magazine articles and TV appearances. Fellow linguists weighed in, if only in some cases to scoff. Everett had put himself and the Pirahã on the map.
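The mechanics of recursion are easy to demonstrate outside of natural language. The sketch below is purely illustrative (the sentences and speakers are invented, and it models the general idea of embedding a clause inside a clause, not Pirahã or any real grammar): a recursive rule lets a sentence contain another sentence, so in principle there is no longest sentence, while a language without such a rule can produce only the flat base case.

```python
# Illustrative sketch of recursive clause embedding (not a model of any real grammar).
# Each pass wraps the previous sentence inside a new reported-speech clause.

def embed(depth):
    """Nest a reported-speech clause `depth` times around a simple base sentence."""
    sentence = "the jaguar ran"  # flat base case: a simple, unembedded sentence
    for speaker in ["the hunter", "his brother", "the old woman"][:depth]:
        # The entire previous sentence becomes a subordinate clause of the new one.
        sentence = f"{speaker} said that {sentence}"
    return sentence

print(embed(0))  # the jaguar ran
print(embed(2))  # his brother said that the hunter said that the jaguar ran
```

Nothing in the rule itself caps the depth, which is what “in theory indefinitely” means; the dispute over Pirahã is whether its grammar contains any such self-embedding rule at all.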

His paper might have received a shrug if Chomsky had not recently co-written a paper, published in 2002, that said (or seemed to say) that recursion was the single most important feature of human language. “In particular, animal communication systems lack the rich expressive and open-ended power of human language (based on humans’ capacity for recursion),” the authors wrote. Elsewhere in the paper, the authors wrote that the faculty of human language “at minimum” contains recursion. They also deemed it the “only uniquely human component of the faculty of language.”

In other words, Chomsky had finally issued what seemed like a concrete, definitive statement about what made human language unique, exposing a possible vulnerability. Before Everett’s paper was published, there had already been back and forth between Chomsky and the authors of a response to the 2002 paper, Ray Jackendoff and Steven Pinker. In the wake of that public disagreement, Everett’s paper had extra punch.

It’s been said that if you want to make a name for yourself in modern linguistics, you have to either align yourself with Chomsky or seek to destroy him. Either you are desirous of his approval or his downfall. With his 2005 paper, Everett opted for the latter course.

Because the pace of academic debate is just this side of glacial, it wasn’t until June 2009 that the next major chapter in the saga was written. Three scholars who are generally allies of Chomsky published a lengthy paper in the journal Language dissecting Everett’s claims one by one. What he considered unique features of Pirahã weren’t unique. What he considered “gaps” in the language weren’t gaps. They argued this in part by comparing Everett’s recent paper to work he published in the 1980s, calling it, slightly snidely, his earlier “rich material.” Everett wasn’t arguing with Chomsky, they claimed; he was arguing with himself. Young Everett thought Pirahã had recursion. Old Everett did not.

Everett’s defense was, in so many words, to agree. Yes, his earlier work was contradictory, but that’s because he was still under Chomsky’s sway when he wrote it. It’s natural, he argued, even when doing basic field work, cataloging the words of a language and the stories of a people, to be biased by your theoretical assumptions. Everett was a Chomskyan through and through, so much so that he had written the MSN Encarta encyclopedia entry on him. But now, after more years with the Pirahã, the scales had fallen from his eyes, and he saw the language on its own terms rather than those he was trying to impose on it.

David Pesetsky, a linguistics professor at MIT and one of the authors of the critical Language paper, thinks Everett was trying to gin up a “Star Wars-level battle between himself and the forces of Universal Grammar,” presumably with Everett as Luke Skywalker and Chomsky as Darth Vader.

Contradicting Everett meant getting into the weeds of the Pirahã language, a language that Everett knew intimately and his critics did not. “Most people took the attitude that this wasn’t worth taking on,” Pesetsky says. “There’s a junior-high-school corridor, two kids are having a fight, and everyone else stands back.” Everett wrote a lengthy reply that Pesetsky and his co-authors found unsatisfying and evasive. “The response could have been ‘Yeah, we need to do this more carefully,’” says Pesetsky. “But he’s had seven years to do it more carefully and he hasn’t.”

Critics haven’t just accused Everett of inaccurate analysis. He’s the sole authority on a language that he says changes everything. If he wanted to, they suggest, he could lie about his findings without getting caught. Some were willing to declare him essentially a fraud. That’s what one of the authors of the 2009 paper, Andrew Nevins, now at University College London, seems to believe. When I requested an interview with Nevins, his reply read, “I may be being glib, but it seems you’ve already analyzed this kind of case!” Below his message was a link to an article I had written about a Dutch social psychologist who had admitted to fabricating results, including creating data from studies that were never conducted. In another e-mail, after declining to expand on his apparent accusation, Nevins wrote that the “world does not need another article about Dan Everett.”

In 2007, Everett heard reports of a letter, signed by Cilene Rodrigues, a Brazilian linguist who co-wrote the paper with Pesetsky and Nevins, accusing him of racism. According to Everett, he got a call from a source informing him that Rodrigues, an honorary research fellow at University College London, had sent a letter to the organization in Brazil that grants researchers permission to visit indigenous groups like the Pirahã. He then discovered that the organization, FUNAI, the National Indian Foundation, would no longer grant him permission to visit the Pirahã, whom he had known for most of his adult life and who remain the focus of his research.

He still hasn’t been able to return. Rodrigues would not respond directly to questions about whether she had signed such a letter, nor would Nevins. Rodrigues forwarded an e-mail from another linguist who has worked in Brazil, which speculates that Everett was denied access to the Pirahã because he did not obtain the proper permits and flouted the law, accusations Everett calls “completely false” and “amazingly nasty lies.”

Whatever the reason for his being blocked, the question remains: Is Everett’s work racist? The accusation runs like this: because all human languages supposedly have recursion, and Everett says Pirahã lacks it, he is in effect asserting that the Pirahã are less than human. Part of this claim is based on an online summary, written by a former graduate student of Everett’s, that quotes traders in Brazil saying the Pirahã “talk like chickens and act like monkeys,” something Everett himself never said and condemns. The issue is sensitive because the Pirahã, who eschew the trappings of modern civilization and live the way their forebears lived for thousands of years, are regularly denigrated by their neighbors in the region as less than human. The fact that Everett is American, not Brazilian, lends the charge added symbolic weight.

When you read Everett’s two books about the Pirahã, it is nearly impossible to think that he believes they are inferior. In fact, he goes to great lengths not to condescend and offers defenses of practices that outsiders would probably find repugnant. In one instance he describes, a Pirahã woman died, leaving behind a baby that the rest of the tribe thought was too sick to live. Everett cared for the infant. One day, while he was away, members of the tribe killed the baby, telling him that it was in pain and wanted to die. He cried, but didn’t condemn, instead defending in the book their seemingly cruel logic.

Likewise, the Pirahã’s aversion to learning agriculture, or preserving meat, or the fact that they show no interest in producing artwork, is portrayed by Everett not as a shortcoming but as evidence of the Pirahã’s insistence on living in the present. Their nonhierarchical social system seems to Everett fair and sensible. He is critical of his own earlier attempts to convert the Pirahã to Christianity as a sort of “colonialism of the mind.” If anything, Everett is more open to a charge of romanticizing the Pirahã culture.

Other critics are more measured but equally suspicious. Mark Baker, a linguist at Rutgers University at New Brunswick, who considers himself part of Chomsky’s camp, mentions Everett’s “vested motive” in saying that the Pirahã don’t have recursion. “We always have to be a little careful when we have one person who has researched a language that isn’t accessible to other people,” Baker says. He is dubious of Everett’s claims. “I can’t believe it’s true as described,” he says.

Chomsky hasn’t exactly risen above the fray. He told a Brazilian newspaper that Everett was a “charlatan.” In the documentary about Everett, Chomsky raises the possibility, without saying he believes it, that Everett may have faked his results. Behind the scenes, he has been active as well. According to Pesetsky, Chomsky asked him to send an e-mail to David Papineau, a professor of philosophy at King’s College London, who had written a positive, or at least not negative, review of Don’t Sleep, There Are Snakes. The e-mail complained that Papineau had misunderstood recursion and was incorrectly siding with Everett. Papineau thought he had done nothing of the sort. “For people outside of linguistics, it’s rather surprising to find this kind of protection of orthodoxy,” Papineau says.

And what if the Pirahã don’t have recursion? Rather than ferreting out flaws in Everett’s work as Pesetsky did, Chomsky’s preferred response is to say that it doesn’t matter. In a lecture he gave last October at University College London, he referred to Everett’s work without mentioning his name, talking about those who believed that “exceptions to the generalizations are considered lethal.” He went on to say that a “rational reaction” to finding such exceptions “isn’t to say ‘Let’s throw out the field.'” Universal Grammar permits such exceptions. There is no problem. As Pesetsky puts it: “There’s nothing that says languages without subordinate clauses can’t exist.”

Except the 2002 paper on which Chomsky’s name appears. Pesetsky and others have backed away from that paper, arguing not that it was incorrect, but that it was “written in an unfortunate way” and that the authors were “trying to make certain things comprehensible about linguistics to a larger public, but they didn’t make it clear that they were simplifying.” Some say that Chomsky signed his name to the paper but that it was actually written by Marc Hauser, the former professor of psychology at Harvard University, who resigned after Harvard officials found him guilty of eight counts of research misconduct. (For the record, no one has suggested the alleged misconduct affected his work with Chomsky.)

Chomsky declined to grant me an interview. Those close to him say he sees Everett as seizing on a few stray, perhaps underexplained, lines from that 2002 paper and distorting them for his own purposes. And the truth, Chomsky has made clear, should be apparent to any rational person.

Ted Gibson has heard that one before. When Gibson, a professor of cognitive sciences at MIT, gave a paper on the topic at a January meeting of the Linguistic Society of America, held in Portland, Ore., Pesetsky stood up at the end to ask a question. “His first comment was that Chomsky never said that. I went back and found the slide,” he says. “Whenever I talk about this question in front of these people I have to put up the literal quote from Chomsky. Then I have to put it up again.”

Geoffrey Pullum, a professor of linguistics at the University of Edinburgh, is also vexed at how Chomsky and company have, in his view, played rhetorical sleight-of-hand to make their case. “They have retreated to such an extreme degree that it says really nothing,” he says. “If it has a sentence longer than three words then they’re claiming they were right. If that’s what they claim, then they weren’t claiming anything.” Pullum calls this move “grossly dishonest and deeply silly.”

Everett has been arguing about this for seven years. He says Pirahã undermines Universal Grammar. The other side says it doesn’t. In an effort to settle the dispute, Everett asked Gibson, who holds a joint appointment in linguistics at MIT, to look at the data and reach his own conclusions. He didn’t provide Gibson with data he had collected himself because he knows his critics suspect those data have been cooked. Instead he provided him with sentences and stories collected by his missionary predecessor. That way, no one could object that it was biased.

In the documentary about Everett, handing over the data to Gibson is given tremendous narrative importance. Everett is the bearded, safari-hatted field researcher boating down a river in the middle of nowhere, talking and eating with the natives. Meanwhile, Gibson is the nerd hunched over his keyboard back in Cambridge, crunching the data, examining it with his research assistants, to determine whether Everett really has discovered something. If you watch the documentary, you get the sense that what Gibson has found confirms Everett’s theory. And that’s the story you get from Everett, too. In our first interview, he encouraged me to call Gibson. “The evidence supports what I’m saying,” he told me, noting that he and Gibson had a few minor differences of interpretation.

But that’s not what Gibson thinks. Some of what he found does support Everett. For example, he’s confirmed that Pirahã lacks possessive recursion, phrases like “my brother’s mother’s house.” Also, there appear to be no conjunctions like “and” or “or.” In other instances, though, he’s found evidence that seems to undercut Everett’s claims—specifically, when it comes to noun phrases in sentences like “His mother, Itaha, spoke.”

That is a simple sentence, but inserting the mother’s name is a hallmark of recursion. Gibson’s paper, on which Everett is a co-author, states, “We have provided suggestive evidence that Pirahã may have sentences with recursive structures.”

If that turns out to be true, it would undermine the primary thesis of both of Everett’s books about the Pirahã. Rather than the hero who spent years in the Amazon emerging with evidence that demolished the field’s predominant theory, Everett would be the descriptive linguist who came back with a couple of books full of riveting anecdotes and cataloged a language that is remarkable, but hardly changes the game.

Everett only realized during the reporting of this article that Gibson disagreed with him so strongly. Until then, he had been saying that the results generally supported his theory. “I don’t know why he says that,” Gibson says. “Because it doesn’t. He wrote that our work corroborates it. A better word would be falsified. Suggestive evidence is against it right now and not for it.” Though, he points out, the verdict isn’t final. “It looks like it is recursive,” he says. “I wouldn’t bet my life on it.”

Another researcher, Ray Jackendoff, a linguist at Tufts University, was also provided the data and sees it slightly differently. “I think we decided there is some embedding but it is of limited depth,” he says. “It’s not recursive in the sense that you can have infinitely deep embedding.” Remember that in Chomsky’s paper, it was the idea that “open-ended” recursion was possible that separated human and animal communication. Whether the kind of limited recursion Gibson and Jackendoff have noted qualifies depends, like everything else in this debate, on the interpretation.

Everett thinks what Gibson has found is not recursion, but rather false starts, and he believes further research will back him up. “These are very short, extremely limited examples and they almost always are nouns clarifying other nouns,” he says. “You almost never see anything but that in these cases.” And he points out that there still doesn’t seem to be any evidence of infinite recursion. Says Everett: “There simply is no way, even if what I claim to be false starts are recursive instead, to say, ‘My mother, Susie, you know who I mean, you like her, is coming tonight.’”

The field has a history of theoretical disagreements that turn ugly. In the book The Linguistics Wars, published in 1993, Randy Allen Harris tells the story of another skirmish between Chomsky and a group of insurgent linguists called generative semanticists. Chomsky dismissed his opponents’ arguments as absurd. His opponents accused him of altering his theories when confronted and of general arrogance. “Chomsky has the impressive rhetorical talent of offering ideas which are at once tentative and fully endorsed, of appearing to take the if out of his arguments while nevertheless keeping it safely around,” writes Harris.

That rhetorical talent was on display in his lecture last October, in which he didn’t just disagree with other linguists, but treated their arguments as ridiculous and a mortal danger to the field. The style seems to be reflected in his political activism. Watch his 1969 debate on Firing Line against William F. Buckley Jr., available on YouTube, and witness Chomsky tie his famous interlocutor in knots. It is a thorough, measured evisceration. Chomsky is willing to deploy those formidable skills in linguistic arguments as well.

Everett is far from the only current Chomsky challenger. Recently there’s been a rise in so-called corpus linguistics, a data-driven method of evaluating a language, using computer software to analyze sentences and phrases. The method produces detailed information and, for scholars like Gibson, finally provides scientific rigor for a field he believes has been mired in never-ending theoretical disputes. That, along with the brain-scanning technology that linguists are increasingly making use of, may be able to help resolve questions about how much of the structure of language is innate and how much is shaped by culture.
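In spirit, the corpus approach starts from counts over attested sentences rather than from introspective judgments about what a grammar allows. A minimal sketch of the idea (the three-sentence corpus is invented for illustration; real corpus work runs the same kind of counting over millions of sentences):

```python
# Toy illustration of corpus linguistics: derive facts about a language from
# frequency counts over attested text, rather than from theory-driven judgment.
from collections import Counter

corpus = [
    "the dog saw the cat",
    "the cat saw the bird",
    "the bird sang",
]

# Flatten the corpus into a word stream and count single words and adjacent pairs.
words = [w for sentence in corpus for w in sentence.split()]
unigrams = Counter(words)
bigrams = Counter(zip(words, words[1:]))  # note: this toy version crosses sentence boundaries

print(unigrams.most_common(2))   # [('the', 5), ('saw', 2)]
print(bigrams[("the", "cat")])   # 2
```

Claims about a language then rest on what the counts show, which is why practitioners like Gibson see the method as adding rigor: the evidence is inspectable by anyone with the same corpus.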

But Chomsky has little use for that method. In his lecture, he deemed corpus linguistics nonscientific, comparing it to doing physics by describing the swirl of leaves on a windy day rather than performing experiments. This was “just statistical modeling,” he said, evidence of a “kind of pathology in the cognitive sciences.” Referring to brain scans, Chomsky joked that the only way to get a grant was to propose an fMRI.

As for Universal Grammar, some are already writing its obituary. Michael Tomasello, co-director of the Max Planck Institute for Evolutionary Anthropology, has stated flatly that “Universal Grammar is dead.” Two linguists, Nicholas Evans and Stephen Levinson, published a paper in 2009 titled “The Myth of Language Universals,” arguing that the “claims of Universal Grammar … are either empirically false, unfalsifiable, or misleading in that they refer to tendencies rather than strict universals.” Pullum has a similar take: “There is no Universal Grammar now, not if you take Chomsky seriously about the things he says.”

Gibson puts it even more harshly. Just as Chomsky doesn’t think corpus linguistics is science, Gibson doesn’t think Universal Grammar is worthwhile. “The question is, ‘What is it?’ How much is built-in and what does it do? There are no details,” he says. “It’s crazy to say it’s dead. It was never alive.”

Such proclamations have been made before and Chomsky, now 83, has a history of outmaneuvering and outlasting his adversaries. Whether Everett will be yet another in a long line of would-be debunkers who turn into footnotes remains to be seen. “I probably do, despite my best intentions, hope that I turn out to be right,” he says. “I know that it is not scientific. But I would be a hypocrite if I didn’t admit it.”

How Do You Say ‘Disagreement’ in Pirahã? (N.Y.Times)

By JENNIFER SCHUESSLER. Published: March 21, 2012

Dan Everett. Essential Media & Entertainment/Smithsonian Channel

In his 2008 memoir, “Don’t Sleep, There Are Snakes,” the linguist Dan Everett recalled the night members of the Pirahã — the isolated Amazonian hunter-gatherers he first visited as a Christian missionary in the late 1970s — tried to kill him.

Dr. Everett survived, and his life among the Pirahã, a group of several hundred living in northwest Brazil, went on mostly peacefully as he established himself as a leading scholarly authority on the group and one of a handful of outsiders to master their difficult language.

His life among his fellow linguists, however, has been far less idyllic, and debate about his scholarship is poised to boil over anew, thanks to his ambitious new book, “Language: The Cultural Tool,” and a forthcoming television documentary that presents an admiring view of his research among the Pirahã along with a darkly conspiratorial view of some of his critics.

Members of the Pirahã people of Amazonian Brazil, who have an unusual language, as seen in “The Grammar of Happiness.” Essential Media & Entertainment/Smithsonian Channel

In 2005 Dr. Everett shot to international prominence with a paper claiming that he had identified some peculiar features of the Pirahã language that challenged Noam Chomsky’s influential theory, first proposed in the 1950s, that human language is governed by “universal grammar,” a genetically determined capacity that imposes the same fundamental shape on all the world’s tongues.

The paper, published in the journal Current Anthropology, turned him into something of a popular hero but a professional lightning rod, embraced in the press as a giant killer who had felled the mighty Chomsky but denounced by some fellow linguists as a fraud, an attention seeker or worse, promoting dubious ideas about a powerless indigenous group while refusing to release his data to skeptics.

The controversy has been simmering in journals and at conferences ever since, fed by a slow trickle of findings by researchers who have followed Dr. Everett’s path down to the Amazon. In a telephone interview Dr. Everett, 60, who is the dean of arts and sciences at Bentley University in Waltham, Mass., insisted that he’s not trying to pick a fresh fight, let alone present himself as a rival to the man he calls “the smartest person I’ve ever met.”

“I’m a small fish in the sea,” he said, adding, “I do not put myself at Chomsky’s level.”

Dan Everett in the Amazon region of Brazil with the Pirahã in 1981. Courtesy Daniel Everett

Still, he doesn’t shy from making big claims for “Language: The Cultural Tool,” published last week by Pantheon. “I am going beyond my work with Pirahã and systematically dismantling the evidence in favor of a language instinct,” he said. “I suspect it will be extremely controversial.”

Even some of Dr. Everett’s admirers fault him for representing himself as a lonely voice of truth against an all-powerful Chomskian orthodoxy bent on stopping his ideas dead. It’s certainly the view advanced in the documentary, “The Grammar of Happiness,” which accuses unnamed linguists of improperly influencing the Brazilian government to deny his request to return to Pirahã territory, either with the film crew or with a research team from M.I.T., led by Ted Gibson, a professor of cognitive science. (It’s scheduled to run on the Smithsonian Channel in May.)

A Pirahã man in the film “The Grammar of Happiness.” Essential Media & Entertainment/Smithsonian Channel

Dr. Everett acknowledged that he had no firsthand evidence of any intrigues against him. But Miguel Oliveira, an associate professor of linguistics at the Federal University of Alagoas and the M.I.T. expedition’s Brazilian sponsor, said in an interview that Dr. Everett is widely resented among scholars in Brazil for his missionary past, anti-Chomskian stance and ability to attract research money.

“This is politics, everybody knows that,” Dr. Oliveira said. “One of the arguments is that he’s stealing something from the indigenous people to become famous. It’s not said. But that’s the way they think.”

Claims of skullduggery certainly add juice to a debate that, to nonlinguists, can seem arcane. In a sense what Dr. Everett has taken from the Pirahã isn’t gold or rare medicinal plants but recursion, a property of language that allows speakers to embed phrases within phrases — for example, “The professor said Everett said Chomsky is wrong” — infinitely.

In a much-cited 2002 paper Professor Chomsky, an emeritus professor of linguistics at M.I.T., writing with Marc D. Hauser and W. Tecumseh Fitch, declared recursion to be the crucial feature of universal grammar and the only thing separating human language from its evolutionary forerunners. But Dr. Everett, who had been publishing quietly on the Pirahã for two decades, announced in his 2005 paper that their language lacked recursion, along with color terms, number terms, and other common properties of language. The Pirahã, Dr. Everett wrote, showed these linguistic gaps not because they were simple-minded, but because their culture — which emphasized concrete matters in the here and now and also lacked creation myths and traditions of art making — did not require it.

To Dr. Everett, Pirahã was a clear case of culture shaping grammar — an impossibility according to the theory of universal grammar. But to some of his critics the paper was really just a case of Dr. Everett — who said he began questioning his own Chomskian ideas in the early 1990s, around the time he began questioning his faith — fixing the facts around his new theories.

In 2009 the linguists Andrew Nevins, Cilene Rodrigues and David Pesetsky, three of the fiercest early critics of Dr. Everett’s paper, published their own in the journal Language, disputing his linguistic claims and expressing “discomfort” with his overall account of the Pirahã’s simple culture. Their main source was Dr. Everett himself, whose 1982 doctoral dissertation, they argued, showed clear evidence of recursion in Pirahã.

“He was right the first time,” Dr. Pesetsky, an M.I.T. professor, said in an interview. “The first time he had reasons. The second time he had no reasons.”

Some scholars say the debate remains stymied by a lack of fresh, independently gathered data. Three different research teams, including one led by Dr. Gibson that traveled to the Pirahã in 2007, have published papers supporting Dr. Everett’s claim that there are no numbers in the Pirahã language. But efforts to go recursion hunting in the jungle — using techniques that range from eliciting sentences to having the Pirahã play specially designed video games — have so far yielded no published results.

Still, some have tried to figure out ways to press ahead, even without direct access to the Pirahã. After Dr. Gibson’s team was denied permission to return to Brazil in 2010, its members devised a method that minimized reliance on Dr. Everett’s data by analyzing instead a corpus of 1,000 sentences from Pirahã stories transcribed by another missionary in the region.

Their analysis, presented at the Linguistic Society of America’s annual meeting in January, found no embedded clauses but did uncover “suggestive evidence” of recursion in a more obscure grammatical corner. It’s a result that is hardly satisfying to Dr. Everett, who questions it. But his critics, oddly, seem no more pleased.

Dr. Pesetsky, who heard the presentation, dismissed the whole effort as biased from the start by its reliance on Dr. Everett’s grammatical classifications and basic assumptions. “They were taking for granted the correctness of the hypothesis they were trying to disconfirm,” he said.

But to Dr. Gibson, who said he does not find Dr. Everett’s cultural theory of language persuasive, such responses reflect the gap between theoretical linguists and data-driven cognitive scientists, not to mention the strangely calcified state of the recursion debate.

“Chomskians and non-Chomskians are weirdly illogical at times,” he said. “It’s like they just don’t want to have a cogent argument. They just want to contradict what the other guy is saying.”

Dr. Everett’s critics fault him for failing to release his field data, even seven years after the controversy erupted. He countered that he is currently working to translate his decades’ worth of material and hopes to post some transcriptions online “over the next several months.” The bigger outrage, he insisted, is what he characterized as other scholars’ efforts to accuse him of “racist research” and interfere with his access to the Pirahã.

Dr. Rodrigues, a professor of linguistics at the Pontifical Catholic University in Rio de Janeiro, acknowledged by e-mail that in 2007 she wrote a letter to Funai, the Brazilian government agency in charge of indigenous affairs, detailing her objections to Dr. Everett’s linguistic research and to his broader description of Pirahã culture.

She declined to elaborate on the contents of the letter, which she said was written at Funai’s request and did not recommend any particular course of action. But asked about her overall opinion of Dr. Everett’s research, she said, “It does not meet the standards of scientific evidence in our field.”

Whatever the reasons for Dr. Everett’s being denied access, he’s enlisting the help of the Pirahã themselves, who are shown at the end of “The Grammar of Happiness” recording an emotional plea to the Brazilian government.

“We love Dan,” one man says into the camera. “Dan speaks our language.”

New report reveals how corporations undermine science with fake bloggers and bribes (io9)

BY ANNALEE NEWITZ

MAR 9, 2012 2:22 PM

You’ve probably heard about how the tobacco industry tried to suppress scientific evidence that smoking causes cancer by publishing shady research, bribing politicians, and pressuring researchers. But you may not have realized that tobacco’s dirty tricks are just the tip of the iceberg. In a disturbing new report published by the Union of Concerned Scientists about corporate corruption of the sciences, you’ll learn about how Monsanto hired a public relations team to invent fake people who harassed a scientific journal online, how Coca-Cola offers bribes to suppress evidence that soft drinks harm kids’ teeth, and more. Here are some of the most egregious recent examples of corruption from this must-read report.

The report is a meaty assessment of corporate corruption in science that stretches back to incidents with Big Tobacco in the 1960s, up through contemporary examples. Here are just a few of those.

One way that corporations prevent negative information about their products from getting out is by harassing scientists and the journals that publish them. Here’s how Monsanto did it:

Dr. Ignacio Chapela of the University of California–Berkeley and graduate student David Quist published an article in Nature showing that DNA from genetically modified corn was contaminating native Mexican corn. The research spurred immediate backlash. Nature received a number of letters to the editor, including several comments on the Internet from “Mary Murphy” and “Andura Smetacek” accusing the scientists of bias. The backlash prompted Nature to publish an editorial agreeing that the report should not have been published. However, investigators eventually discovered that the comments from Murphy and Smetacek originated with The Bivings Group, a public relations firm that specializes in online communications and had worked for Monsanto. Mary Murphy and Andura Smetacek were found to be fictional names.

Corporations also form front organizations to hide their efforts to undermine science. That’s what happened when producers of unhealthy food got together to cast doubt on the FDA’s recommended health guidelines:

The Center for Consumer Freedom is a nonprofit that targets dietary guidelines recommended by the FDA, other government agencies, medical associations, and consumer advocacy organizations. The center has run ads and owns a website that accuses government agencies of overregulation, and has published articles claiming that the evidence behind guidelines on high salt intake and other dietary matters is inadequate science. The center was founded with a $600,000 grant from Philip Morris, but has also received funding from Cargill, National Steak and Poultry, Monsanto, Coca-Cola, and Sutter Home Winery.

Sometimes corporations just go for it and buy off legitimate organizations, as Coca-Cola appears to have done when it paid dentists to stop saying kids shouldn’t drink Coke:

In 2003, the American Academy of Pediatric Dentistry accepted a $1 million donation from Coca-Cola. That year, the group claimed that “scientific evidence is certainly not clear on the exact role that soft drinks play in terms of children’s oral disease.” The statement directly contradicted the group’s previous stance that “consumption of sugars in any beverage can be a significant factor…that contributes to the initiation and progression of dental caries.”

Corporations can also unduly influence federal agencies, as ReGen did when they wanted their device approved for trials by the FDA, despite serious medical problems:

ReGen Biologics attempted to gain FDA approval for clinical trials of Menaflex, a device it developed to replace knee cartilage. After an FDA panel rejected the device, the company enlisted four members of Congress from its home state of New Jersey to influence the evaluation process. In December 2007, Senator Frank Lautenberg, Senator Robert Menendez, and Representative Steve Rothman wrote to FDA Commissioner Andrew von Eschenbach asking him to personally look into Menaflex. Soon thereafter, the commissioner met with ReGen executives and heeded the company’s advice to have Dr. Daniel Shultz, head of the FDA’s medical devices division, oversee a new review. The FDA fast-tracked and approved the product despite serious concerns from the scientific community.

If bribery doesn’t work, you can always censor negative results, the way pharmaceutical company Boots did:

Boots commissioned Dr. Betty Dong, a scientist at the University of California–San Francisco, to test the effects of Synthroid, a replacement for thyroid hormone. Boots hoped to reveal that despite its high price, Synthroid was more effective than similar drugs. The company closely monitored the research, and when Dong found that the drug was no more effective than its competitors, instructed her not to publish the results. When she refused to comply, Boots threatened to sue. The company relented only after several years, during which consumers continued to pay for the costly product.

You can also try “refuting” scientific results with bad evidence, the way the formaldehyde industry did:

To counter a study that found that formaldehyde caused cancer in rats, a formaldehyde company commissioned its own study. That study, which found no association between the chemical and cancer, exposed only one-third the number of rats to formaldehyde for half as long as the original study. A formaldehyde association quickly publicized the results and argued before the Consumer Product Safety Commission (CPSC) that they indicated “no chronic health effects from exposure to the level of formaldehyde normally encountered in the home.”

And then, if you’re Pfizer, you can just generate as much favorable research as you like to bolster sales of a drug, despite your discovery that the drug increases risk of suicide:

From 1998 to 2007, Pfizer discreetly facilitated the publication of 15 case studies, six case reports, and nine letters to the editor to boost off-label use of Neurontin, a drug prescribed to treat seizures in people who have epilepsy and nerve pain. The number of patients taking the drug rose from 430,000 to 6 million, making it one of Pfizer’s most profitable products. An investigation found that Pfizer had failed to publish negative results, selectively reported outcomes, and excluded specific patients from analysis. [Most importantly] Pfizer failed to note that the drug increased the risk of suicide.

Read the full report here, which includes sources for these stories, as well as an extensive section devoted to reforming scientific practices. There are ways we can avoid this kind of corruption, and they involve everything from federal reforms to corporate transparency.

[via Union of Concerned Scientists]

Science, Journalism, and the Hype Cycle: My piece in tomorrow’s Wall Street Journal (Discovery Magazine)

I think one of the biggest struggles a science writer faces is how to accurately describe the promise of new research. If we start promising that a preliminary experiment is going to lead to a cure for cancer, we are treating our readers cruelly–especially the readers who have cancer. On the other hand, scoffing at everything is not a sensible alternative, because sometimes preliminary experiments really do lead to great advances. In the 1950s, scientists discovered that bacteria can slice up virus DNA to avoid getting sick. That discovery led, some 30 years later, to biotechnology–to an industry that enabled, among other things, bacteria to produce human insulin.

This challenge was very much on my mind as I recently read two books, which I review in tomorrow’s Wall Street Journal. One is on gene therapy–a treatment that inspired wild expectations in the 1990s, then crashed, and now is coming back. The other is epigenetics, which seems to me to be in the early stages of the hype cycle. You can read the essay in full here. [see post below]

March 9th, 2012 5:33 PM by Carl Zimmer

Hope, Hype and Genetic Breakthroughs (Wall Street Journal)

By CARL ZIMMER

I talk to scientists for a living, and one of my most memorable conversations took place a couple of years ago with an engineer who put electrodes in bird brains. The electrodes were implanted into the song-generating region of the brain, and he could control them with a wireless remote. When he pressed a button, a bird singing in a cage across the lab would fall silent. Press again, and it would resume its song.

I could instantly see a future in which this technology brought happiness to millions of people. Imagine a girl blind from birth. You could implant a future version of these wireless electrodes in the back of her brain and then feed it images from a video camera.

As a journalist, I tried to get the engineer to explore what seemed to me to be the inevitable benefits of his research. To his great credit, he wouldn’t. He wasn’t even sure his design would ever see the inside of a human skull. There were just too many ways for it to go wrong. He wanted to be very sure that I understood that and that I wouldn’t claim otherwise. “False hope,” he warned me, “is a sinful thing.”


Stephen Voss. Gene therapy allowed this once-blind dog to see again.

Over the past two centuries, medical research has yielded some awesome treatments: smallpox wiped out with vaccines, deadly bacteria thwarted by antibiotics, face transplants. But when we look back across history, we forget the many years of failure and struggle behind each of these advances.

This foreshortened view distorts our expectations for research taking place today. We want to believe that every successful experiment means that another grand victory is weeks away. Big stories appear in the press about the next big thing. And then, as the years pass, the next big thing often fails to materialize. We are left with false hope, and the next big thing gets a reputation as the next big lie.

In 1995, a business analyst named Jackie Fenn captured this intellectual whiplash in a simple graph. Again and again, she had seen new advances burst on the scene and generate ridiculous excitement. Eventually they would reach what she dubbed the Peak of Inflated Expectations. Unable to satisfy their promise fast enough, many of them plunged into the Trough of Disillusionment. Their fall didn’t necessarily mean that these technologies were failures. The successful ones slowly emerged again and climbed the Slope of Enlightenment.

When Ms. Fenn drew the Hype Cycle, she had in mind dot-com-bubble technologies like cellphones and broadband. Yet it’s a good model for medical advances too. I could point to many examples of the medical hype cycle, but it’s hard to think of a better one than the subject of Ricki Lewis’s well-researched new book, “The Forever Fix”: gene therapy.

The concept of gene therapy is beguilingly simple. Many devastating disorders are the result of mutant genes. The disease phenylketonuria, for example, is caused by a mutation to a gene involved in breaking down a molecule called phenylalanine. The phenylalanine builds up in the bloodstream, causing brain damage. One solution is to eat a low-phenylalanine diet for your entire life. A much more appealing alternative would be to somehow fix the broken gene, restoring a person’s metabolism to normal.

In “The Forever Fix,” Ms. Lewis chronicles gene therapy’s climb toward the Peak of Inflated Expectations over the course of the 1990s. A geneticist and the author of a widely used textbook, she demonstrates a mastery of the history, even if her narrative sometimes meanders and becomes burdened by clichés. She explains how scientists learned how to identify the particular genes behind genetic disorders. They figured out how to load genes into viruses and then to use those viruses to insert the genes into human cells.


Stephen Voss. Alisha Bacoccini is tested on her ability to read letters, at UPenn Hospital, in Philadelphia, PA on Monday, June 23, 2008. Bacoccini is undergoing an experimental gene therapy trial to improve her sight.

By 1999, scientists had enjoyed some promising successes treating people—removing white blood cells from leukemia patients, for example, inserting working genes, and then returning the cells to their bodies. Gene therapy seemed as if it was on the verge of becoming standard medical practice. “Within the next decade, there will be an exponential increase in the use of gene therapy,” Helen M. Blau, the then-director of the gene-therapy technology program at Stanford University, told Business Week.

Within a few weeks of Ms. Blau’s promise, however, gene therapy started falling straight into the Trough. An 18-year-old man named Jesse Gelsinger who suffered from a metabolic disorder had enrolled in a gene-therapy trial. University of Pennsylvania scientists loaded a virus with a working version of an enzyme he needed and injected it into his body. The virus triggered an overwhelming reaction from his immune system and within four days Gelsinger was dead.

Gene therapy nearly came to a halt after his death. An investigation revealed errors and oversights in the design of Gelsinger’s trial. The breathless articles disappeared. Fortunately, research did not stop altogether. Scientists developed new ways of delivering genes without triggering fatal side effects. And they directed their efforts at one part of the body in particular: the eye. The eye is so delicate that inflammation could destroy it. As a result, it has evolved physical barriers that keep the body’s regular immune cells out, as well as a separate battalion of immune cells that are more cautious in their handling of infection.

It occurred to a number of gene-therapy researchers that they could try to treat genetic vision disorders with a very low risk of triggering horrendous side effects of the sort that had claimed Gelsinger’s life. If they injected genes into the eye, they would be unlikely to produce a devastating immune reaction, and any harmful effects would not be able to spread to the rest of the body.

Their hunch paid off. In 2009 scientists reported their first success with gene therapy for a congenital disorder. They treated a rare form of blindness known as Leber’s congenital amaurosis. Children who were once blind can now see.

As “The Forever Fix” shows, gene therapy is now starting its climb up the Slope of Enlightenment. Hundreds of clinical trials are under way to see if gene therapy can treat other diseases, both in and beyond the eye. It still costs a million dollars a patient, but that cost is likely to fall. It’s not yet clear how many other diseases gene therapy will help or how much it will help them, but it is clearly not a false hope.

Gene therapy produced so much excitement because it appealed to the popular idea that genes are software for our bodies. The metaphor only goes so far, though. DNA does not float in isolation. It is intricately wound around spool-like proteins called histones. It is studded with caps made of carbon, hydrogen and oxygen atoms, known as methyl groups. This coiling and capping of DNA allows individual genes to be turned on and off during our lifetimes.

The study of this extra layer of control on our genes is known as epigenetics. In “The Epigenetics Revolution,” molecular biologist Nessa Carey offers an enlightening introduction to what scientists have learned in the past decade about those caps and coils. While she delves into a fair amount of biological detail, she writes clearly and compellingly. As Ms. Carey explains, we depend for our very existence as functioning humans on epigenetics. We begin life as blobs of undifferentiated cells, but epigenetic changes allow some cells to become neurons, others muscle cells and so on.

Epigenetics also plays an important role in many diseases. In cancer cells, genes that are normally only active in embryos can reawaken after decades of slumber. A number of brain disorders, such as autism and schizophrenia, appear to involve the faulty epigenetic programming of genes in neurons.

Scientists got their first inklings about epigenetics decades ago, but in the past few years the field has become hot. In 2008 the National Institutes of Health pledged $190 million to map the epigenetic “marks” on the human genome. New biotech start-ups are trying to carry epigenetic discoveries into the doctor’s office. The FDA has approved cancer drugs that alter the pattern of caps on tumor-cell DNA. Some studies on mice hint that it may be possible to treat depression by taking a pill that adjusts the coils of DNA in neurons.

People seem to be getting giddy about the power of epigenetics in the same way they got giddy about gene therapy in the 1990s. No longer is our destiny written in our DNA: It can be completely overwritten with epigenetics. The excitement is moving far ahead of what the science warrants—or can ever deliver. Last June, an article on the Huffington Post eagerly seized on epigenetics, woefully mangling two biological facts: one, that experiences can alter the epigenetic patterns in the brain; and two, that sometimes epigenetic patterns can be passed down from parents to offspring. The article made a ridiculous leap to claim that we can use meditation to change our own brains and the brains of our children—and thereby alter the course of evolution: “We can jump-start evolution and leverage it on our own terms. We can literally rewire our brains toward greater compassion and cooperation.” You couldn’t ask for a better sign that epigenetics is climbing the Peak of Inflated Expectations at top speed.

The title “The Epigenetics Revolution” unfortunately adds to this unmoored excitement, but in Ms. Carey’s defense, the book itself is careful and measured. Still, epigenetics will probably be plunging soon into the Trough of Disillusionment. It will take years to see whether we can really improve our health with epigenetics or whether this hope will prove to be a false one.

The Forever Fix

By Ricki Lewis
St. Martin’s, 323 pages, $25.99

The Epigenetics Revolution

By Nessa Carey
Columbia, 339 pages, $26.95

—Mr. Zimmer’s books include “A Planet of Viruses” and “Evolution: Making Sense of Life,” co-authored with Doug Emlen, to be published in July.

Nature journal criticizes Canadian ‘muzzling’ (CBC News)

Time for Canadian government to set its scientists free, magazine says

The Canadian Press

Posted: Mar 2, 2012 7:08 AM ET

Last Updated: Mar 2, 2012 12:54 PM ET

One of the world’s leading scientific journals is criticizing the Harper government for ‘muzzling’ federal scientists.

One of the world’s leading scientific journals is accusing the Harper government of limiting its scientists from speaking publicly about their research.

The journal, Nature, says in an editorial in this week’s issue that it’s time for the Canadian government to set its scientists free.

Nature says Canada is headed in the wrong direction in not letting its scientists speak out freely. (Nature)

It notes that Canada and the United States have undergone role reversals in the past six years.

It says the U.S. has adopted more open practices since the end of George W. Bush’s presidency, while Canada has gone in the opposite direction.

Nature says policy directives on government communications released through access to information requests reveal the Harper government has little understanding of the importance of the free flow of scientific knowledge.

Two weeks ago, the Canadian Science Writers’ Association, the World Federation of Science Journalists and several other groups sent an open letter to Harper, calling on him to unmuzzle federal scientists.

The letter cited a couple of high-profile examples, including one last fall when Environment Canada barred Dr. David Tarasick from speaking to journalists about his ozone layer research when it was published in Nature.

What you don’t want to be when you grow up (Revista Fapesp)

HUMANITIES | PERCEPTION OF SCIENCE

Survey shows that fewer than 3% of Latin American teenagers want to pursue a scientific career
Carlos Haag
Print Edition 192 – February 2012

Figure of Albert Einstein at the Estação Ciência, in São Paulo. © EDUARDO CESAR

Even living in a world immersed in technology, a young person faced with the famous question “what do you want to be when you grow up?” will rarely answer “a scientist.” According to the survey Los estudiantes y la ciencia, a project of the Ibero-American Observatory of Science, Technology and Society (Ryct/Cyted) coordinated by the Argentine researcher Carmelo Polino, only 2.7% of secondary-school students (aged 15 to 19) in Latin America and Spain plan to pursue a career in the exact or natural sciences, such as biology, chemistry, physics, and mathematics (the agricultural sciences barely register). Conducted between 2008 and 2010, the survey consulted some 9,000 public and private schools in seven capital cities: Asunción, São Paulo, Buenos Aires, Lima, Montevideo, Bogotá and Madrid. Curiously, 56% of respondents said they were interested in a career in the social sciences, and a fifth of them opted for engineering. The Brazilian team in the project came from Unicamp’s Laboratory of Journalism Studies (Labjor), coordinated by the linguist Carlos Vogt, who was responsible for the chapter “Hábitos informativos sobre ciência e tecnologia” of the resulting book, published in Spanish and available only for download at www.oei.es/salactsi/libro-estudiantes.pdf.
“These are worrying data for societies whose economies have an intense need for scientists and engineers but whose young people show little interest in those professions. And the reasons given are equally discouraging: 78% of the students explain their choice by saying that the exact and natural sciences are ‘very hard,’ nearly half of them consider those subjects ‘boring,’ and a quarter say these fields offer limited job opportunities,” says Polino. “The number of science students is already below what the economy and industry need and, above all, below what is needed to deal with the problems societies will face in the future.” Respondents also attribute their discouragement in the face of science largely to the way it is taught, complaining that classroom resources are limited. Half of the teenagers likewise do not believe that science subjects have increased their appreciation of nature, or that they are a source of solutions to everyday problems.

“There are cultural barriers, because young people today think that to succeed in life, to make money, you don’t need to study much. You can choose a career with faster financial returns. The culture of effort, which is the culture of science, has been losing ground. We urgently need a public policy for science education and communication,” warns Polino. On some points the new survey reinforces trends observed in the group’s earlier 2004 study, Percepção pública da ciência (see “Imagens da ciência” in issue 95 of Pesquisa FAPESP; “Leitores esquivos,” in issue 188; and “Avanços e desafios,” in issue 185), but the recent survey, focused on young people, brings new and worrying data. “In a country like ours, whose future depends on advances in science and technology, and where there is a great shortage of technical professionals and engineers, these numbers demand the attention of the authorities and of society in general, so as to awaken young people’s interest in scientific careers. Above all, it is a paradox, because we live in a world structured by the presence of technology in every area of people’s lives,” says Vogt. “We enjoy the benefits of the scientific effort, but we are not interested in carrying that work forward. Conveniences are on offer, but they are illusory, because if we want to take possession of these achievements we need scientific training and a capacity for abstraction, despite all the difficulties that come with studying the exact and natural sciences.”

“There are already great obstacles to young people entering the world of science, which is seen as hermetic, something for initiates with a language of its own that has little to do with the tangible world we live in, demanding a high degree of abstraction, and analogies in students’ personal lives are not always easy to find,” Vogt observes. “Imagine all that in a country like ours, where only 2% of graduates want to pursue a teaching career. The state of teaching is lamentable and, in most cases, those who teach science come from other fields, such as engineering or medicine, with little interest in making teaching easier or renewing it.”

The reasons that lead a student to opt for a scientific career are therefore subtle. According to the survey, 4 in 10 students would follow the profession for two reasons: traveling a lot and working with new technologies. For a third of those interested, the salary, which they consider attractive, is also a variable to weigh in the choice. Far behind, at under 18%, come motives such as discovering new things, solving humanity’s problems and advancing knowledge. Lower still, at under 5%, are reasons such as practicing a socially prestigious profession or working with qualified people. Among the factors that put young people off, the great “villain” is how science is taught in class, which drives the desire for a scientific career or a future in the laboratory out of students’ heads. Next, for 6 in 10 students, the difficulty of understanding the subjects is a negative filter. “Boredom” afflicts half of the young people. Another discouraging factor is the idea that choosing science means going on studying “indefinitely” something they consider “boring.” In fourth place, at 24%, is the fear that there are few chances of finding a job in the field.

That does not stop young people from seeing those who chose science as a profession as socially prestigious figures whose work is associated with altruistic aims and with progress; the prevailing image of scientists is of people passionate about their work, with open minds and logical thinking, and the stereotype of the “solitary” scientist “removed from reality” no longer holds. There is, however, a controversial point: most young people are convinced that scientists possess superior intelligence, and although that may be seen as a positive and attractive trait, it scares young people away, since they do not consider themselves capable of reaching the level of these “exceptional figures,” which negatively affects the choice of a scientific career. “We need to analyze these data for their potential, because it is possible to change the current paradigm and reverse the situation, not only bringing more young people into scientific careers but also improving the learning experience of secondary education,” Polino observes.

Faced with the statement "science brings more benefits than risks to people's lives," 7 in 10 respondents agreed. But faced with the assertion "science and technology are producing an artificial and dehumanized lifestyle," positions were less defined, and the most frequent answer (21.5%) was "neither agree nor disagree." Social context revealed interesting aspects: public-school students are less enthusiastic about the conveniences technology offers. "It is no surprise that those with less access to technology are less aware of its importance in making people's lives easier," notes Polino. Faced with the "contradictory" statements that science is "taking away jobs" and that "science will bring more job opportunities for future generations," the results show that more young people (37%) fear losing their jobs because of science than are optimistic about the future (32%). According to the researchers, the answers follow the pattern of Latin American youth, for whom workplace "meritocracy" is more myth than reality. When the environment enters the picture, everything gets worse.

Faced with the assertions "science and technology will eliminate poverty and hunger from the world" and "science and technology are responsible for most environmental problems," 3 in 10 students do not believe in science's power to "cure," and the same proportion are certain that science is harming the environment. Here too women stand out: they are the most skeptical, with 5 in 10 rejecting technology's capacity to put an end to global ills. On the whole, however, there is a certain youthful optimism: 52% of adolescents are open and favorable to what science and technology can accomplish in our societies. Blind, absolute faith in their results no longer prevails; young people are far more moderate and conscious of the risks than adults, which, the researchers say, could serve as the basis for a more critical and responsible citizenship if put to good use. "Installing a power plant in Angra without consulting society is unthinkable today. Young people assume there is a system that emphasizes democratization in scientific processes, which does not mean voting on who does or does not go into a laboratory," observes Vogt. "They accept a scientific culture that connects reason and humanity, science and society."

This may explain a curious finding in the survey carried out by Labjor. While television remains the main route to scientific knowledge, followed by the internet, science fiction, in books, films, comics, and games, earned an honorable third place as a source of science information for young people. "Alongside the internet, these alternative media offer great potential for attracting young people to science in a playful and engaging way, a strategic means of reaching this segment of the population with scientific topics," notes Vogt. All the more so because in several of the places surveyed, official institutions are little known or simply ignored, as are the venues where one can learn about science, such as museums and zoos. Thus, curiously, a city like São Paulo, with its concentration of research centers and universities, and where access to scientific information is favored by the presence of museums and a rich media offering, showed below-average levels of science-information consumption.

See the infographics:
Evolution of university graduates by field of knowledge
How often young people seek out information about science
What drives young people away from science

Could Many Universities Follow Borders Bookstores Into Oblivion? (The Chronicle of Higher Education)

March 7, 2012, 7:44 pm
By Marc Parry

Atlanta — Higher education’s spin on the Silicon Valley garage. That was the vision laid out in September, when the Georgia Institute of Technology announced a new lab for disruptive ideas, the Center for 21st Century Universities. During a visit to Atlanta last week, I checked in to see how things were going, sitting down with Richard A. DeMillo, the center’s director and Georgia Tech’s former dean of computing, and Paul M.A. Baker, the center’s associate director. We talked about challenges and opportunities facing colleges at a time of economic pain and technological change—among them the chance that many universities might follow Borders Bookstores into oblivion.

Q. You recently wrote that universities are “bystanders” at the revolution happening around them, even as they think they’re at the center of it. How so?

Mr. DeMillo: It’s the same idea as the news industry. Local newspapers survived most of the last century on profits from classified ads. And what happened? Craigslist drove profits out of classified ads for local newspapers. If you think that it’s all revolving around you, and you’re going to be able to impose your value system on this train that’s leaving the station, that’s going to lead you to one set of decisions. Think of Carnegie Mellon, with its “Four Courses, Millions of Users” idea [which became the Open Learning Initiative], or Yale with the humanities courses, thinking that what the market really wants is universal access to these four courses at the highest quality. And really what the market is doing is something completely different. The higher-education market is reinventing what a university is, what a course is, what a student is, what the value is. I don’t know why anyone would think that the online revolution is about reproducing the classroom experience.

Q. So what is the revolution about?

Mr. DeMillo: You don’t know where events are going to take higher education. But if you want to be an important institution 20 years from now, you have to position yourself so that you can adapt to whatever those technology changes are. Whenever you have this kind of technological change, where there’s a large incumbency, the incumbents are inherently at a disadvantage. And we’re the incumbents.

Q. What are some of the most important changes happening now?

Mr. DeMillo: What you’re seeing, for example, is technology enabling a single master teacher to reach students on an individualized basis on a scale that is unprecedented. So when Sebastian Thrun offers his Intro to Robotics course and gets 150,000 students—that’s a big deal.

Why is it a big deal? Well, because people who want to learn robotics want to learn from the master. And there’s something about the medium that he uses that makes that connection intimate. It’s not the same kind of connection that you get by pointing a camera at the front of the room and letting someone write on a whiteboard. These guys have figured out how to design a way of explaining the material that connects with people at scale. So Stanford all of a sudden becomes a place with a network of stakeholders that’s several orders of magnitude larger than it was 10 years ago. Every one of those students in India that wants to connect to Stanford now—connect to a mentor—now has a way to connect by bypassing their local institutions. Every institution that can’t offer a robotics course now has a way of offering a robotics course.

I think what you see happening now with the massive open courses is going to fundamentally change the business models. It’s going to put the notion of value front and center. Why would I want a credential from this university? Why would I want to pay tuition to this university? It really ups the stakes.

Mr. Baker: There used to be something called Borders, you may remember. Think of Borders, the bookstore, “X, Y, Z University,” the bookstore. If you’ve got Amazon as an analogue for these massively open courses, there is still a model where people actually go into bookstores because sometimes they want to touch, or they like hanging out, or there’s other value offered by that. What it means is that the university needs to rethink what it’s doing, how it’s doing it.

And how it innovates in a way of surviving in the face of this. If I can do the Amazon equivalent of this open course, why should I come here? Well, maybe you shouldn’t. And that’s a client that is lost.

Mr. DeMillo: All you have to do is add up the amount of money spent on courses. Just take an introduction to computer science. Add up the amount of money that’s spent nationwide on introductory programming courses. It’s a big number, I’ll bet. What is the value received for that spend? If, in fact, there’s a large student population that can be served by a higher-quality course, what’s the argument for spending all that money on 6,000 introduction to programming courses?
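DeMillo's "add it up" argument is a back-of-envelope multiplication. The sketch below works one version of it through; every input number is a hypothetical placeholder chosen only to show that plausible inputs multiply out to a large national figure, not data from the interview.

```python
# Back-of-envelope sketch of the "add up the national spend" argument.
# All inputs except the 6,000-course figure (which DeMillo cites) are
# hypothetical placeholders for illustration.
courses = 6_000            # intro programming courses nationwide (his figure)
students_per_course = 100  # hypothetical average enrollment per course
cost_per_student = 1_500   # hypothetical tuition dollars allocated per seat

national_spend = courses * students_per_course * cost_per_student
print(f"Estimated national spend: ${national_spend:,}")  # → $900,000,000
```

Even with conservative placeholders, the product lands near a billion dollars a year, which is the scale DeMillo's question about "value received for that spend" turns on.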

Q. You really think that many universities could go the way of Borders?

Mr. DeMillo: Yeah. Well, you can see it already. We lost, in this university system, four institutions this year.

Mr. Baker: The University System of Georgia merged four institutions into other ones that were geographically within 50 miles. The programs essentially were replicated. And in an environment in which you’ve got reduced resources, you can’t afford to have essentially identical programs 50 miles apart.

Q. So what sort of learning landscape do you think might emerge?

Mr. DeMillo: One thing that you might see is highly tuned curricula, students being able to select from a range of things that they want to learn and a range of mentors that they want to interact with, whether you think of it as hacking degrees or pulling assessments from a menu of different universities. What does that mean for the individual university? It means that a university has to figure out where its true value sits in that landscape.

Mr. Baker: Another thing we’re looking at is development of a value index to try to calculate, to be vulgar, the return on investment. Our idea is to try to figure out ways of determining what constitutes value for a student, based on four or five personas. So for, let’s say, a mom returning at 50 who wants an education—she’s going to value certain things differently than a 17-year-old rocket scientist coming to Tech who wants to get through in three years and knows exactly what she wants to do.

Mr. DeMillo: Jeff Selingo wrote a column about this, having one place to go to figure out the economic value of a degree from a university. It’s a great idea, but why focus only on the paycheck as an economic value? There are lots of indicators of value. Do students from this university go to graduate school in disproportionately large numbers? Do they get fellowships? Are they people who stay in their profession for a long period of time? You start to build up a picture of what students tell you, of what alumni tell you, was the value of that education. Can we pull these metrics together and then say something interesting about our institution and by extension others?

Q. What other projects is your center working on right now?

Mr. DeMillo: The Khan Academy—small bursts of knowledge that may or may not be included in a curriculum—was a really interesting idea.

Can students generate this kind of material in a way that’s useful for other students? That’s the genesis of our TechBurst competition [in which students create short videos that explain a single topic].

It turns out there’s a lot of interest on the part of the students at Georgia Tech in teaching what they know to their peers. The interesting part of the project is the unexpected things that you get. We had a discussion yesterday about mistakes. This is student-generated stuff, so is it right? Not all the time. Which causes great angst on the part of traditionalists, because now we have Georgia Tech TechBurst video that has errors in it. If these were instructional videos that we were marketing, that would be a very big deal. But they’re not. They’re the start of a thread of conversation among students. There’s one on gerrymandering. So it’s a political-science video, it’s cutely produced, but in some sense it’s not exactly right. And so what you would expect is now other students will come along and annotate that video, and say, well, that’s not exactly what gerrymandering is. And you’ll start to see this students-teaching-students peer-tutoring process taking place in real time.

Q. What about the massive open online course Georgia Tech will run in the fall?

Mr. DeMillo: The idea of a massive open course is something that people normally apply to introductory courses. What happens when you look at a massive open advanced seminar? A seminar room with 10,000 students, 50,000 students—what does that even mean? We’ve got some people here that have been blogging for quite a while about advanced topics. In fact, one of the blogs—Gödel’s Lost Letter, by Professor Dick Lipton of Georgia Tech, and Ken Regan of the University at Buffalo—is about advanced computer theory, so it’s a very mathematical blog. It’s in the top 0.1 percent of WordPress blogs. A typical day is 5,000 to 10,000 page views. A hot day is 100,000. The question is: can we take this blogging format and turn it into an online seminar?

Q. How would that work?

Mr. DeMillo: The blog is essentially an expression of a master teacher’s understanding of a field to people that want to learn about it. We think that there are some very simple layers that can be built under the existing blogging format that can essentially turn it into a massive open online seminar. It’s also a way of conducting scientific research. When you think about what happens in this blog, it celebrates the process of scientific discovery. I’ll just give you one example. Last year about this time some industrial scientist claimed that he had solved one of the outstanding problems in this area. In the normal course of events, the scientist would have written up the paper, would have sent it to a conference. It would have been refereed. Nine months later the paper would have been presented at the conference. People would have talked about it. It would have been written up to submit to a journal. Refereeing would have taken a couple of years for that. Well, the paper got submitted to Lipton’s blog. It just caused a flurry of activity. So thousands and thousands of scientists flocked to this paper, and essentially speeded up the refereeing of the paper, shortening the time from five years to a couple of weeks. It turns out that people came to believe that the claim was not valid, and the paper was incorrect. But what an education for future research students. You get to see the process of scientific discovery in action.

This is an interesting bookend to the idea of a massive open course. Because the people that are thinking about the massive open online courses for introductory material have a set of considerations. Students are at different levels of achievement. Assessment is very important. The credentialing process is dictated by whether or not you want credit. If you go to the other end of the curriculum, and say, well, what happens when we try to do these advanced courses at scale, credentialing is completely different. Assessment is completely different. You can’t rely on the same automation that you could in the introductory courses. Social networks become extremely important if you’re going to do this stuff at scale, because one professor can’t deal with 100,000 readers. He has to have a network of trusted people who would be able to answer questions. The anticipation is that a whole new set of problems would come up with these kinds of courses.

This conversation has been edited and shortened.

The Importance Of Mistakes (NPR)

February 28, 2012
by ADAM FRANK

It takes a lot of cabling to make the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) run at the Gran Sasso National Laboratory (LNGS) in Italy. (Alberto Pizzoli/AFP/Getty Images)

How do people handle the discovery of their own mistake? Some folks might shrug it off. Some folks might minimize its effect. Some folks might even step in with a lie. Most people, we hope, would admit the mistake. But how often do we expect them to announce it to the world from a hilltop? How often do we expect them to tell us — in the clearest language possible — that they screwed up, providing every detail possible about the nature of the mistake?

That’s exactly what’s required in science. As embarrassing as it might seem to most people, admitting a mistake is really the essence of scientific heroism.

Which brings us, first, to faster-than-light neutrinos and then to climate science.

Last week rumors began to circulate that the (potential) discovery of neutrinos traveling faster than the speed of light may get swept into the dustbin of scientific history. The news (rumors really) first circulated via Science Insider.

“According to sources familiar with the experiment, the 60 nanoseconds discrepancy appears to come from a bad connection between a fiber optic cable that connects to the GPS receiver used to correct the timing of the neutrinos’ flight and an electronic card in a computer.”

Oops.

The story goes on to say that once the cable was tightened the Einstein-busting result disappeared. While “sources familiar with the experiment” might not seem enough to start singing funeral dirges, (who was the source, Deep Neutrino?), CERN released its own statement that points in a similar direction. No one can say for sure yet, but it appears that the faster-than-light hoopla is likely to go away.

So what are we to make of this? A loose cable seems pretty lame on the face of it. “Dude, everybody with a cable box and a 32-inch flat screen knows you got to check the cable!”

There is no doubt that, as mistakes go, researchers running the neutrino experiments would rather have something a bit more sexy to offer if their result was disproven. (How about tiny corrections due to seismic effects?) Still, I’m betting the OPERA experiment had a heck of a lot more cables than your TV so, perhaps, we should be more understanding.

More importantly, no matter how it happens, making mistakes is exactly what scientists are supposed to do. “Our whole problem is to make mistakes as fast as possible,” John Wheeler once said.

What makes science so powerful is not just the admission of mistakes but also the detailing of mistakes. While the OPERA group might now wish they had waited a bit longer to make their announcement, there is no shame in the mistake in and of itself. If they step into the spotlight and tell the world what happened, then they deserve to be counted as heroes just as much as if they’d broken Einstein’s theory.

And that is where we can see the connection to climate, evolution and all the other fronts in the ever-expanding war on science. Last week at the AAAS meeting in Vancouver, Nina Fedoroff, a distinguished agricultural scientist and president of that body, made a bold and frightening statement (especially for someone in such a position of authority). Fedoroff told her audience, as The Guardian reported:

“‘We are sliding back into a dark era,’ she said. ‘And there seems little we can do about it. I am profoundly depressed at just how difficult it has become merely to get a realistic conversation started on issues such as climate change or genetically modified organisms.'”

See video: http://bcove.me/ajmi39pd

The spectacle of watching politicians fall over each other to distance themselves from research validated by armies of scientists is more than depressing. Our current understanding of climate, for example, represents the work of thousands of human beings all working to make mistakes as fast as possible, all working to root out error as fast as possible. There is no difference between what happens in climate science or evolutionary biology and any other branch of science.

Honest people asking the best of themselves push forward in their own fields. They watch their work and that of their colleagues closely, always looking for mistakes, cracks in reasoning, subtle flaws in logic. When these are found, the process is set in motion: critique, defend, critique, root out. When science deniers trot out the same tired talking points, talking points with no scientific validity, they ignore (or fail to understand) their argument’s lack of credibility.

Eventually, science always finds its mistakes. Eventually we find some kind of truth, unless, of course, mistakes are forced on us from outside of science. That, however, is an error of another kind entirely.

When It Comes to Accepting Evolution, Gut Feelings Trump Facts (Science Daily)

ScienceDaily (Jan. 19, 2012) — For students to accept the theory of evolution, an intuitive “gut feeling” may be just as important as understanding the facts, according to a new study.

In an analysis of the beliefs of biology teachers, researchers found that a quick intuitive notion of how right an idea feels was a powerful driver of whether or not students accepted evolution — often trumping factors such as knowledge level or religion.

“The whole idea behind acceptance of evolution has been the assumption that if people understood it — if they really knew it — they would see the logic and accept it,” said David Haury, co-author of the new study and associate professor of education at Ohio State University.

“But among all the scientific studies on the matter, the most consistent finding was inconsistency. One study would find a strong relationship between knowledge level and acceptance, and others would find no relationship. Some would find a strong relationship between religious identity and acceptance, and others would find less of a relationship.”

“So our notion was, there is clearly some factor that we’re not looking at,” he continued. “We’re assuming that people accept something or don’t accept it on a completely rational basis. Or, they’re part of a belief community that as a group accept or don’t accept. But the findings just made those simple answers untenable.”

Haury and his colleagues tapped into cognitive science research showing that our brains don’t just process ideas logically — we also rely on how true something feels when judging an idea.

“Research in neuroscience has shown that when there’s a conflict between facts and feeling in the brain, feeling wins,” he says.

The researchers framed a study to determine whether intuitive reasoning could help explain why some people are more accepting of evolution than others. The study, published in the Journal of Research in Science Teaching, included 124 pre-service biology teachers at different stages in a standard teacher preparation program at two Korean universities.

First, the students answered a standard set of questions designed to measure their overall acceptance of evolution. These questions probed whether students generally believed in the main concepts and scientific findings that underpin the theory.

Then the students took a test on the specific details of evolutionary science. To show their level of factual knowledge, students answered multiple-choice and free-response questions about processes such as natural selection. To gauge their “gut” feelings about these ideas, students wrote down how certain they felt that their factually correct answers were actually true.

The researchers then analyzed statistical correlations to see whether knowledge level or feeling of certainty best predicted students’ overall acceptance of evolution. They also considered factors such as academic year and religion as potential predictors.

“What we found is that intuitive cognition has a significant impact on what people end up accepting, no matter how much they know,” said Haury. The results show that even students with greater knowledge of evolutionary facts weren’t likelier to accept the theory, unless they also had a strong “gut” feeling about those facts.
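The analysis described above, regressing overall acceptance on knowledge level with and without the "certainty" score, can be sketched as follows. This is not the study's data or code; the inputs are simulated to mirror its qualitative finding, and all variable names are hypothetical.

```python
# Illustrative sketch of the study's statistical comparison: how much of the
# variance in "acceptance of evolution" is explained by factual knowledge
# alone versus knowledge plus a "feeling of certainty" score. The data are
# simulated; only the method (least-squares R^2 comparison) is the point.
import numpy as np

rng = np.random.default_rng(0)
n = 124  # same sample size as the study; the values themselves are synthetic

knowledge = rng.normal(0, 1, n)  # score on the factual test
certainty = rng.normal(0, 1, n)  # self-reported feeling of certainty
# Simulate the paper's qualitative finding: certainty drives acceptance
# more strongly than knowledge does.
acceptance = 0.2 * knowledge + 0.7 * certainty + rng.normal(0, 0.5, n)

def r_squared(X, y):
    """Fraction of variance in y explained by a least-squares fit on X."""
    X = np.column_stack([np.ones(len(y)), X])  # prepend an intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_knowledge = r_squared(knowledge[:, None], acceptance)
r2_both = r_squared(np.column_stack([knowledge, certainty]), acceptance)

print(f"R^2, knowledge only:        {r2_knowledge:.2f}")
print(f"R^2, knowledge + certainty: {r2_both:.2f}")
```

With these simulated weights, adding the certainty score raises the explained variance substantially, which is the shape of the result Ha describes next: feeling plus knowledge explains much more than knowledge alone.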

When trying to explain the patterns of whether people believe in evolution or not, “the results show that if we consider both feeling and knowledge level, we can explain much more than with knowledge level alone,” said Minsu Ha, lead author on the paper and a Ph.D. candidate in the School of Teaching and Learning.

In particular, the research shows that it may not be accurate to portray religion and science education as competing factors in determining beliefs about evolution. For the subjects of this study, belonging to a religion had almost no additional impact on beliefs about evolution, beyond subjects’ feelings of certainty.

These results also provide a useful way of looking at the perceived conflict between religion and science when it comes to teaching evolution, according to Haury. “Intuitive cognition not only opens a new door to approach the issue,” he said, “it also gives us a way of addressing that issue without directly questioning religious views.”

When choosing a setting for their study, the team found that Korean teacher preparation programs were ideal. “In Korea, people all take the same classes over the same time period and are all about the same age, so it takes out a lot of extraneous factors,” said Haury. “We wouldn’t be able to find a sample group like this in the United States.”

Unlike in the U.S., about half of Koreans do not identify themselves as belonging to any particular religion. But according to Ha, who is from Korea, certain religious groups consider the topic of evolution just as controversial as in the U.S.

To ensure that their results were relevant to U.S. settings, the researchers compared how the Korean students did on the knowledge tests with previous studies of U.S. students. “We found that both groups were comparable in terms of overall performance,” said Haury.

For teaching evolution, the researchers suggest using exercises that allow students to become aware of their brains’ dual processing. Knowing that sometimes what their “gut” says is in conflict with what their “head” knows may help students judge ideas on their merits.

“Educationally, we think that’s a place to start,” said Haury. “It’s a concrete way to show them, look — you can be fooled and make a bad decision, because you just can’t deny your gut.”

Ha and Haury collaborated on this study with Ross Nehm, associate professor of education at the Ohio State University. The research was funded by the National Science Foundation.

The right’s stupidity spreads, enabled by a too-polite left (Guardian)

Conservatism may be the refuge of the dim. But the room for rightwing ideas is made by those too timid to properly object

by George Monbiot, The Guardian

Self-deprecating, too liberal for their own good, today’s progressives stand back and watch, hands over their mouths, as the social vivisectionists of the right slice up a living society to see if its component parts can survive in isolation. Tied up in knots of reticence and self-doubt, they will not shout stop. Doing so requires an act of interruption, of presumption, for which they no longer possess a vocabulary.

Perhaps it is in the same spirit of liberal constipation that, with the exception of Charlie Brooker, we have been too polite to mention the Canadian study published last month in the journal Psychological Science, which revealed that people with conservative beliefs are likely to be of low intelligence. Paradoxically it was the Daily Mail that brought it to the attention of British readers last week. It feels crude, illiberal to point out that the other side is, on average, more stupid than our own. But this, the study suggests, is not unfounded generalisation but empirical fact.

It is by no means the first such paper. There is plenty of research showing that low general intelligence in childhood predicts greater prejudice towards people of different ethnicity or sexuality in adulthood. Open-mindedness, flexibility, trust in other people: all these require certain cognitive abilities. Understanding and accepting others – particularly “different” others – requires an enhanced capacity for abstract thinking.

But, drawing on a sample size of several thousand, correcting for both education and socioeconomic status, the new study looks embarrassingly robust. Importantly, it shows that prejudice tends not to arise directly from low intelligence but from the conservative ideologies to which people of low intelligence are drawn. Conservative ideology is the “critical pathway” from low intelligence to racism. Those with low cognitive abilities are attracted to “rightwing ideologies that promote coherence and order” and “emphasise the maintenance of the status quo”. Even for someone not yet renowned for liberal reticence, this feels hard to write.

This is not to suggest that all conservatives are stupid. There are some very clever people in government, advising politicians, running thinktanks and writing for newspapers, who have acquired power and influence by promoting rightwing ideologies.

But what we now see among their parties – however intelligent their guiding spirits may be – is the abandonment of any pretence of high-minded conservatism. On both sides of the Atlantic, conservative strategists have discovered that there is no pool so shallow that several million people won’t drown in it. Whether they are promoting the idea that Barack Obama was not born in the US, that man-made climate change is an eco-fascist-communist-anarchist conspiracy, or that the deficit results from the greed of the poor, they now appeal to the basest, stupidest impulses, and find that it does them no harm in the polls.

Don’t take my word for it. Listen to what two former Republican ideologues, David Frum and Mike Lofgren, have been saying. Frum warns that “conservatives have built a whole alternative knowledge system, with its own facts, its own history, its own laws of economics”. The result is a “shift to ever more extreme, ever more fantasy-based ideology” which has “ominous real-world consequences for American society”.

Lofgren complains that “the crackpot outliers of two decades ago have become the vital centre today”. The Republican party, with its “prevailing anti-intellectualism and hostility to science” is appealing to what he calls the “low-information voter”, or the “misinformation voter”. While most office holders probably don’t believe the “reactionary and paranoid claptrap” they peddle, “they cynically feed the worst instincts of their fearful and angry low-information political base”.

The madness hasn’t gone as far in the UK, but the effects of the Conservative appeal to stupidity are making themselves felt. This week the Guardian reported that recipients of disability benefits, scapegoated by the government as scroungers, blamed for the deficit, now find themselves subject to a new level of hostility and threats from other people.

These are the perfect conditions for a billionaires’ feeding frenzy. Any party elected by misinformed, suggestible voters becomes a vehicle for undisclosed interests. A tax break for the 1% is dressed up as freedom for the 99%. The regulation that prevents big banks and corporations exploiting us becomes an assault on the working man and woman. Those of us who discuss man-made climate change are cast as elitists by people who happily embrace the claims of Lord Monckton, Lord Lawson or thinktanks funded by ExxonMobil or the Koch brothers: now the authentic voices of the working class.

But when I survey this wreckage I wonder who the real idiots are. Confronted with mass discontent, the once-progressive major parties, as Thomas Frank laments in his latest book Pity the Billionaire, triangulate and accommodate, hesitate and prevaricate, muzzled by what he calls “terminal niceness”. They fail to produce a coherent analysis of what has gone wrong and why, or to make an uncluttered case for social justice, redistribution and regulation. The conceptual stupidities of conservatism are matched by the strategic stupidities of liberalism.

Yes, conservatism thrives on low intelligence and poor information. But the liberals in politics on both sides of the Atlantic continue to back off, yielding to the supremacy of the stupid. It’s turkeys all the way down.

Twitter: @georgemonbiot

The Top 10 Worst Things About Working in a Lab (Science)

By Adam Ruben

January 27, 2012

I have found that, no matter what the context, I will click on nearly any article with a number and a superlative in the title. I don’t really need to know anything about cheeseburgers that I don’t already know, but call an article “The Eight Best Cheeseburgers You’ve Never Heard Of” or “The Five Largest Cheeseburgers That Appeared in Films,” and suddenly I’ve got a bit of required reading to do.

And now, so do you.

Maybe you’re an ordinary person, not a scientist (we call you “Non-scis” behind your backs), and you’ve just clicked here for some light lunchtime reading. But if you’re a scientist, perhaps you can relate as we identify … drumroll please …

The top 10 worst aspects of working in a lab.

10. Your non-scientist friends don’t understand what you do.
Even when talking about their jobs to outsiders, your friends in other professions can summarize their recent accomplishments in understandable ways. For example, they can say, “I built an object,” or “I pleased a client,” or, if your friend works on Wall Street, “I ate a peasant.” But what can you say? “I cured … um, well, I didn’t really cure it, but I discovered … well, ‘discovered’ is too strong a word, so let’s just say I tested … well, the tests are ongoing and are causing new questions to arise, so … yeah. Stop looking at me.” At least you’re doing better than your friends with Ph.D.s in the humanities, who would answer, “I put sheets on my mom’s basement couch.”

9. The scientist who is already the most successful gets credit for everything anyone does.
If you discover something, your principal investigator (PI) gets credit. If you write a paper, your PI gets credit. If you submit a successful grant proposal, your PI gets credit (and money). And what do you get? If you’re lucky, you get to write more papers and grant proposals to bolster your PI’s curriculum vitae.

8. Lab equipment is expensive and delicate. And you, you’re not so coordinated. Nope. Not so much.
Oops! You could pay to replace this one broken piece, or you could hire another postdoc.

7. Sometimes experiments fail for a reason. Sometimes experiments fail for no reason.
As anyone who works in a lab knows, things that work perfectly for months or years can suddenly stop working, offering no explanation for the change. (In this way, lab experiments are like Internet Explorer®.) This abrupt and inexplicable failure changes your work to meta-work, as you stop asking questions about science and start asking questions about the consistency of your technique. You can waste years saying things like, “When I created the sample that worked, I flared my nostril in a weird way. So this week, I’ll try to repeat what I did last week but with more nostrils flarin’!”

6. Your schedule is dictated by intangible things.
Freaking cell lines, needing to be tended on a regular basis regardless of your dinner plans. Freaking galaxies visible only in the middle of the night. If it weren’t for your lab work you’d have such a vivacious social life! Sure. That’s why you have no social life. It’s the lab work.

5. Science on television has conditioned you to expect daily or weekly breakthroughs.
Have you ever had a breakthrough in the lab? Yeah, me neither. Sure, I’ve had successful experiments, which usually means that the controls worked and no one was injured. But a real, eureka, run-down-the-hallway-carrying-a-printout, burst-into-a-room-full-of-military-personnel-and-call-the-President-even-though-it’s-three-in-the-morning breakthrough? Not yet. Unless you count the programmable coffee maker that, after much cajoling, made decent coffee at the appropriate time. Maybe I should publish that.

4. Your work is dangerous.
People say their jobs are killing them, but you work with things that could actually kill you — things like caustic chemicals, infectious agents, highly electrified instruments, and angry PIs.

CREDIT: Hal Mayforth

3. Labs are not conducive to sex.
Unless you work in a sex lab, which may or may not be a real thing, it’s unlikely you can convince anyone to crawl under your lab bench with you (“Just ignore the discarded pipette tips, baby”) and, as protein biophysicists say, put their zinc fingers in your leucine zipper. But hey, prove me wrong, people.

2. You have to dress like a scientist.
When I worked at an amusement park, I had to wear a purple polo shirt tucked into khaki shorts with giant white sneakers, so I suppose things could be worse. But some of our (scientists’) uniform choices are pretty unflattering. Disposable shoe covers look like you stepped in two shower caps. Safety goggles trap humidity as though you’re cultivating a rainforest on your face. And white lab coats with collars and lapels make men look like nerds and women look like men who look like nerds.

1. You can feel time creeping inexorably toward your own death.
If you think I’m being melodramatic, you were obviously never a grad student or postdoc. As a grad student or postdoc, you spend longer than you’ve planned working on something less interesting than you’d believed, all while earning less money than you assumed reasonable with an endpoint that’s less tangible and less probable than you thought possible.

If this was the kind of article with a “Comments” section, you’d scroll there and see people berating the spoiled scientist for complaining about his work when there are far worse jobs in the world. You’d also see anonymous nastiness, blatant ignorance, and a rant about Ron Paul.

Luckily, there is no “Comments” section (thanks, Science!), so I can preemptively tell you that yes, I know there are worse jobs than “scientist” — “baby thrower,” for example, or “cow exploder.” But this is Science, so if you want to read about the top 10 worst aspects of being a cow exploder, go borrow a copy of Cow Exploder Digest. And wash your hands after reading it.

And yes, I know that there are great aspects of working in a lab as well. You get to work with your hands. You experience the beauty of a well-designed experiment. You can even ask questions about the universe and, occasionally, answer them. But since these last points were neither in list format nor preceded by an overreaching superlative, I’ll understand if you’ve already stopped reading.

Adam Ruben, Ph.D., is a practicing scientist and the author of Surviving Your Stupid, Stupid Decision to Go to Grad School.

The role of trust in social decision-making (FAPESP)

December 8, 2011

By Mônica Pileggi

A study conducted at Mackenzie and published in The Journal of Neuroscience indicates that the brain does not perceive unfairness from friends in economic decision-making situations (Wikimedia)

Agência FAPESP – In economic decision-making situations, friendship is one of the variables that modulate our brain, rendering human beings unable to feel wronged. This is one of the conclusions of a study carried out at the Social and Cognitive Neuroscience Laboratory of Universidade Presbiteriana Mackenzie (UPM) and published in The Journal of Neuroscience.

The work, led by Professor Paulo Sérgio Boggio, research coordinator of UPM's Center for Biological and Health Sciences, was carried out during the master's project "Preliminary study on cognitive potentials in a social decision-making task" by psychologist Camila Campanhã, who is now pursuing her doctorate at UPM, both supported by FAPESP fellowships.

According to Campanhã, the study aimed to investigate the role of trust in social decision-making and its neurobiological bases. To that end, she drew on game theory, the branch of applied mathematics that studies strategic situations in which players choose among different actions in an attempt to improve their payoff.

Initially developed as a tool for understanding economic behavior and later used even to define nuclear strategies, game theory is today applied in many academic fields. It became a prominent branch of mathematics especially after the 1944 publication of The Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern.

Campanhã – whose study was conducted in collaboration with researchers Ludovico Minati, of the Istituto Neurologico "Carlo Besta" (Italy), and Felipe Fregni, of Harvard University (United States) – explains that the experiment used the Ultimatum Game, a game employed in neuroeconomics and by researchers of social behavior.

Played by participants aged 18 to 25, the game was divided into two blocks. In the first, the computer delivered fair and unfair economic proposals from friends (who were in different rooms). In the second, the proposals came from members of the laboratory, strangers to the participants.

The proposals were classified as fair (50:50), moderately unfair (70:30) and very unfair (80:20 and 90:10). "Participants received the same number of fair and unfair proposals, from both the friend and the stranger, delivered by the computer. We recorded all of the participants' electroencephalographic activity during the experiment," Campanhã told Agência FAPESP.

In this type of experiment, if the person accepts the proposal, both players receive the agreed amounts. If she rejects it, neither receives anything. "Behaviorally, we observed that people rejected unfair proposals from the stranger far more often than those offered by the friend – in which the friend would come out ahead. Moreover, these people rated their friends as fairer than the strangers," she noted.
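The payoff rule of the Ultimatum Game described above is simple to state precisely. The sketch below is purely illustrative (the function name and stake are hypothetical, not the study's actual software): the proposer keeps a share of a fixed stake, the responder gets the rest, and a rejection wipes out both payoffs.

```python
def ultimatum_payoff(total, proposer_share, accepted):
    """Return (proposer, responder) payoffs for one Ultimatum Game round.

    proposer_share is the fraction the proposer keeps, e.g. 0.8 for an
    80:20 split. If the responder rejects, neither player gets anything.
    """
    if not accepted:
        return (0.0, 0.0)
    return (total * proposer_share, total * (1 - proposer_share))

# The four split types used in the experiment: fair, moderately unfair,
# and the two very unfair offers, on a hypothetical stake of 10.
for share in (0.5, 0.7, 0.8, 0.9):
    print(share, ultimatum_payoff(10.0, share, accepted=True))

# A rejected 90:10 offer costs both players everything:
print(ultimatum_payoff(10.0, 0.9, accepted=False))  # (0.0, 0.0)
```

The punitive structure is what makes the reported behavior interesting: rejecting an unfair offer from a stranger is costly for the responder too, yet participants did it far more often than with friends.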

The study revealed a positive inversion in neuroelectrical activity for proposals from friends. "The expectation was that the signals would be negative when unfair proposals came from the friend. The participants, however, did not perceive that unfairness," Campanhã said.

According to her, the positive polarity inversion is related to the satisfaction of receiving something good and fair, a reward above expectations. In that case, dopamine is released. With the negative signal there is a violation of expectation and the substance is inhibited, generating anger.

"When we ran the analysis to identify the brain area activated at that moment, we observed that the electrical signal appeared in the anterior medial prefrontal cortex. This is an area related to the ability to imagine and try to understand what another person is thinking and feeling," she said.

"It does not mean that people do not process unfairness, but that the process is different when you trust someone. It is as if there were no need to try to understand what is going on with the other person or what she is feeling," she said.

The article "Responding to Unfair Offers Made by a Friend: Neuroelectrical Activity Changes in the Anterior Medial Prefrontal Cortex" (doi:10.1523/JNEUROSCI.1253-11.2011), by Camila Campanhã and others, can be read by subscribers to The Journal of Neuroscience at www.jneurosci.org/content/31/43/15569.full.pdf+html?sid=94d0a3e8-79b9-47a8-89d8-24dcf41750e7.

Human brains unlikely to evolve into a ‘supermind’ as price to pay would be too high (University of Warwick)

University of Warwick

Human minds have hit an evolutionary “sweet spot” and – unlike computers – cannot continually get smarter without trade-offs elsewhere, according to research by the University of Warwick.

Researchers asked the question why we are not more intelligent than we are given the adaptive evolutionary process. Their conclusions show that you can have too much of a good thing when it comes to mental performance.

The evidence suggests that for every gain in cognitive functions, for example better memory, increased attention or improved intelligence, there is a price to pay elsewhere – meaning a highly-evolved “supermind” is the stuff of science fiction.

University of Warwick psychology researcher Thomas Hills and Ralph Hertwig of the University of Basel looked at a range of studies, including research into the use of drugs like Ritalin, which help with attention, studies of people with autism, and a study of the Ashkenazi Jewish population.

For instance, individuals with enhanced cognitive abilities – such as savants, people with photographic memories, and even members of genetically segregated populations with above-average IQ – often suffer from related disorders, such as autism, debilitating synaesthesia and neural disorders linked with enhanced brain growth.

Similarly, drugs like Ritalin only help people with lower attention spans, whereas people who don’t have trouble focusing can actually perform worse when they take attention-enhancing drugs.

Dr Hills said: “These kinds of studies suggest there is an upper limit to how much people can or should improve their mental functions like attention, memory or intelligence.

“Take a complex task like driving, where the mind needs to be dynamically focused, attending to the right things such as the road ahead and other road users – which are changing all the time.

“If you enhance your ability to focus too much, and end up over-focusing on specific details, like the driver trying to hide in your blind spot, then you may fail to see another driver suddenly veering into your lane from the other direction.

“Or if you drink coffee to make yourself more alert, the trade-off is that it is likely to increase your anxiety levels and impair your fine motor control. There are always trade-offs.

“In other words, there is a ‘sweet spot’ in terms of enhancing our mental abilities – if you go beyond that spot – just like in the fairy-tales – you have to pay the price.”

The research, entitled ‘Why Aren’t We Smarter Already: Evolutionary Trade-Offs and Cognitive Enhancements,’ is published in Current Directions in Psychological Science, a journal of the Association for Psychological Science.

The experiment in the age of its technical irreproducibility (Revista Piauí)

JC e-mail 4402, December 9, 2011.

Unease is haunting some areas of science. Although the possibility of results being replicated by independent groups is one of the pillars of modern science, studies in a growing number of fields are characterized by the impossibility or impracticality of reproduction.

The impasse affects both relatively new, computer-dependent disciplines – such as genomics and proteomics – and long-established areas, such as field biology. Last week, the journal Science devoted a series of five articles to the question, discussing the problem and proposing possible solutions.

Fields that depend on computational tools for data collection and analysis are among those facing the challenges of replication most dramatically, for a simple reason: not every laboratory has the equipment needed to redo those experiments. "It would require an extraordinary amount of resources to independently replicate the Sloan Digital Sky Survey," notes biostatistician Roger Peng in one of the articles in the series, referring to an ambitious sky-mapping project that has already obtained three-dimensional images of nearly a million galaxies.

The problem recurs in emerging fields of molecular biology, such as genomics, proteomics, metabolomics and other disciplines sharing the same suffix, in which researchers handle vast amounts of data that can only be analyzed with powerful computational tools. The difficulty of reproducing such studies can cause serious damage, as shown by an example cited by John Ioannidis and Muin Khoury. They recalled the case of a study according to which specific gene signatures could be used to predict the efficacy of chemotherapy against certain types of cancer. The study's conclusions prompted clinical trials of the markers in question, but the trials were halted once it became clear that the study's results could not be replicated.

But access to technology is not the only thing limiting the reproducibility of studies. Even research with laboratory animals can pose serious reproduction difficulties. In one of the Science articles, two specialists in primate cognition explain that, in their field, the conclusions of experiments with laboratory animals can hardly be extrapolated to wild animals, or even to animals in other laboratories. "Different captive populations may have had different experiences relevant to a particular cognitive task," they explain.

In most cases, transparency is the best recipe for making studies easier to reproduce. In research involving the observation of wild animal behavior, for example, scientists can help their peers by making public the field records obtained with video cameras or satellite tracking. In another article in the series, Michael Ryan, of the University of Texas at Austin, proposes that researchers submitting a study for publication be required to send the raw data collected in the research as well.

Transparency is also the key to replicability in studies that rely on computational tools. In the -omics sciences, much of the data generated is already deposited in publicly accessible repositories. But that does not eliminate the difficulty of replicating results, as John Ioannidis and Muin Khoury point out: "it is a challenge to verify that the complete data and protocols have in fact been deposited, that the files can be accessed, and that the results are replicable," they observe.

The two authors believe that research funding agencies have an important role to play in making data accessible. They suggest that these agencies offer bonuses to researchers who make the raw data from their studies available, and apply penalties to groups that fail to make accessible the information needed to replicate a study.

An important role also falls to the journals that publish scientific papers. Roger Peng suggests that journals require researchers submitting articles involving computational tools to provide not only the data used in the analysis but also the source code of the programs used to process it. In his article for Science, he said he has been encouraging data transparency at the journal Biostatistics, on whose editorial board he serves. Whenever the authors allow it, the journal publishes online the code and data used in its articles, which receive a rating indicating the transparency of the data.

In any case, it is worth remembering that concern for transparency and replicability must not replace rigor in data collection and analysis. As Peng stressed, "the fact that an analysis is reproducible does not guarantee the quality, correctness, or validity of the published results."
(Revista Piauí – 7/12)

CO2 may not warm the planet as much as thought (New Scientist)

19:00 24 November 2011 by Michael Marshall

The climate may be less sensitive to carbon dioxide than we thought – and temperature rises this century could be smaller than expected. That’s the surprise result of a new analysis of the last ice age. However, the finding comes from considering just one climate model, and unless it can be replicated using other models, researchers are dubious that it is genuine.

As more greenhouse gases enter the atmosphere, more heat is trapped and temperatures go up – but by how much? The best estimates say that if the amount of carbon dioxide in the atmosphere doubles, temperatures will rise by 3 °C. This is the “climate sensitivity”.

But the 3 °C figure is only an estimate. In 2007, the Intergovernmental Panel on Climate Change (IPCC) said the climate sensitivity could be anywhere between 2 and 4.5 °C. That means the temperature rise from a given release of carbon dioxide is still uncertain.
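Climate sensitivity is conventionally expressed per doubling of CO2 because the warming response is roughly logarithmic in concentration. As a rough illustration of what the sensitivity numbers in this article imply (the logarithmic approximation and the pre-industrial 280 ppm baseline are assumptions of this sketch, not stated in the article):

```python
import math

def equilibrium_warming(sensitivity_per_doubling, c0_ppm, c_ppm):
    """Approximate equilibrium warming (deg C) for a CO2 change c0 -> c,
    using the standard logarithmic rule: warming scales with log2(C/C0),
    so one doubling yields exactly the sensitivity value."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

# IPCC best estimate: 3 deg C per doubling (e.g. 280 ppm -> 560 ppm).
print(equilibrium_warming(3.0, 280, 560))  # 3.0
# Schmittner's lower estimate for the same doubling:
print(equilibrium_warming(2.4, 280, 560))  # 2.4
```

The same ratio of concentrations always gives the same warming under this rule, which is why a single per-doubling number can summarize the climate's response across scenarios.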

There have been several attempts to pin down the sensitivity. The latest comes from Andreas Schmittner of Oregon State University, Corvallis, and colleagues, who took a closer look at the Last Glacial Maximum around 20,000 years ago, when the last ice age was at its height.

Icy cold

They used previously published data to put together a detailed global map of surface temperatures. This showed that the planet was, on average, 2.2 °C cooler than today. We already know from ice cores that greenhouse gas levels in the atmosphere at the time were much lower than they are now.

Schmittner plugged the atmospheric greenhouse gas concentrations that existed during the Last Glacial Maximum into a climate model and tried to recreate the global temperature patterns. He found that he had to assume a relatively small climate sensitivity of 2.4 °C if the model was to give the best fit.

If climate sensitivity really is so low, global warming this century will be at the lower end of the IPCC’s estimates. Assuming we keep burning fossil fuels heavily, the IPCC estimates that temperatures will rise about 4 °C by 2100, compared with 1980 to 1999. Schmittner’s study suggests the warming would be closer to their minimum estimate for the “heavy burning” scenario, which is 2.4 °C.

Sensitive models

Past climates can help us work out the true climate sensitivity, says Gavin Schmidt of the NASA Goddard Institute for Space Studies in New York City. But he says the results of Schmittner’s study aren’t strong enough to change his mind about the climate sensitivity. “I don’t expect this to impact consensus estimates,” he says.

In particular, the model that Schmittner used in his analysis underestimates the cooling in Antarctica and the mid-latitudes. “The model estimate of the cooling during the Last Glacial Maximum is a clear underestimate,” Schmidt says. “A different model would give a cooler Last Glacial Maximum, and thus a larger sensitivity.”

Schmittner agrees it is too early to draw firm conclusions. Individual climate models all have their own quirks, so he wants to try the experiment with several models to find out if others repeat the result.

Even if the climate sensitivity really is as low as 2.4 °C, Schmittner says that doesn’t mean we are safe from climate change. The Last Glacial Maximum was only 2.2 °C cooler than today, yet there were huge ice sheets, plant life was different, and sea levels were 120 metres lower.

“Very small changes in temperature cause huge changes in certain regions,” Schmittner says. So even if we get a smaller temperature rise than we expected, the knock-on effects would still be severe.

Journal reference: Science, DOI: 10.1126/science.1203513

Science panel: Get ready for extreme weather (AP)

November 18, 2011|Seth Borenstein, AP Science Writer

Maarten van Aalst, leading climate specialist for the Red Cross and Red Crescent, speaks about how climate change will affect people and assets during the presentation of the Intergovernmental Panel on Climate Change (IPCC) report at a press conference at the European headquarters of the United Nations in Geneva, Switzerland, in this April 11, 2007 file photo. Top international climate scientists and disaster experts meeting in Africa had a sharp message Friday, Nov. 18, 2011, for the world's political leaders: Get ready for more dangerous and unprecedented extreme weather caused by global warming. (AP Photo/Keystone, Salvatore Di Nolfi, File)

Think of the Texas drought, floods in Thailand and Russia’s devastating heat waves as coming attractions in a warming world. That’s the warning from top international climate scientists and disaster experts after meeting in Africa.

The panel said the world needs to get ready for more dangerous and “unprecedented extreme weather’’ caused by global warming. These experts fear that without preparedness, crazy weather extremes may overwhelm some locations, making some places unlivable.

The Nobel Prize-winning Intergovernmental Panel on Climate Change issued a special report on global warming and extreme weather Friday after meeting in Kampala, Uganda. This is the first time the group of scientists has focused on the dangers of extreme weather events such as heat waves, floods, droughts and storms. Those are more dangerous than gradual increases in the world’s average temperature.

For example, the report predicts that heat waves that are now once-in-a-generation events will become hotter and happen once every five years by mid-century and every other year by the end of the century. And in some places, such as most of Latin America, Africa and a good chunk of Asia, they will likely become yearly bakings.

And the very heavy rainstorms that usually happen once every 20 years will happen far more frequently, the report said. In most areas of the U.S. and Canada, they are likely to occur three times as often by the turn of the century, if fossil fuel use continues at current levels. In Southeast Asia, where flooding has been dramatic, it is likely to happen about four times as often as now, the report predicts.

One scientist points to this year’s drought and string of 100 degree days in Texas and Oklahoma, which set an all-time record for hottest month for any U.S. state this summer.

“I think of it as a wake-up call,’’ said one of the study’s authors, David Easterling, head of global climate applications for the U.S. National Oceanic and Atmospheric Administration. “The likelihood of that occurring in the future is going to be much greater.’’

The report said world leaders have to prepare better for weather extremes.

“We need to be worried,’’ said one of the study’s lead authors, Maarten van Aalst, director of the International Red Cross/Red Crescent Climate Centre in the Netherlands. “And our response needs to anticipate disasters and reduce risk before they happen rather than wait until after they happen and clean up afterward. … Risk has already increased dramatically.’’

New climate emails leaked ahead of talks (CBS)

November 22, 2011 2:15 PM

The Climatic Research Unit at the University of East Anglia in Norwich, England. (AP)  

LONDON – The British university whose leaked emails caused a global climate science controversy in 2009 says it has discovered a potentially much larger data breach.

University of East Anglia spokesman Simon Dunford said that while academics didn’t have the chance yet to examine the roughly 5,000 emails apparently dumped into the public domain Tuesday, a small sample examined by the university “appears to be genuine.”

The university said in a statement that the emails did not appear to be the result of a new hack or leak. Instead, the statement said that the emails appeared to have been stolen two years ago and held back until now “to cause maximum disruption” to the imminent U.N. climate talks next week in Durban, South Africa.

If that is confirmed, the timing and nature of the leak would follow the pattern set by the so-called “Climategate” emails, which caught prominent scientists stonewalling critics and discussing ways to keep opponents’ research out of peer-reviewed journals.

Those hostile to mainstream climate science claimed the exchanges proved that the threat of global warming was being hyped, and their publication helped destabilize the failed U.N. climate talks in Copenhagen, Denmark, which followed several weeks later.

Although several reviews have since vindicated the researchers’ science, some of their practices – in particular efforts to hide data from critics – have come under strong criticism.

The content of the new batch of emails couldn’t be immediately verified – The Associated Press has not yet been able to secure a copy – but climate skeptic websites carried what they said were excerpts.

Although their context couldn’t be determined, the excerpts appeared to show climate scientists talking in conspiratorial tones about ways to promote their agenda and freeze out those they disagree with. There are several mentions of “the cause” and discussions of ways to shield emails from freedom of information requests.

Penn State University Prof. Michael Mann – a prominent player in the earlier controversy whose name also appears in the latest leak – described the latest leak as “a truly pathetic episode,” blaming agents of the fossil fuel industry for “smear, innuendo, criminal hacking of websites, and leaking out-of-context snippets of personal emails.”

He said the real story in the emails was “an attempt to dig out 2-year-old turkey from Thanksgiving ’09. That’s how desperate climate change deniers have become.”

Bob Ward, with the London School of Economics’ Grantham Research Institute on Climate Change, said in an email that he wasn’t surprised by the leak.

“The selective presentation of old email messages is clearly designed to mislead the public and politicians about the strength of the evidence for man-made climate change,” he said. “But the fact remains that there is very strong evidence that most the indisputable warming of the Earth over the past half century is due to the burning of fossil fuels and other human activities.”

The source of the latest leaked emails was unclear. The perpetrator of the original hack has yet to be unmasked, although British police have said their investigation is still active.


From Shore to Forest, Projecting Effects of Climate Change (N.Y. Times)

By 

While the long-term outlook for grape-growers in the Finger Lakes region is favorable, it is less than optimal for skiers and other winter sports enthusiasts in the Adirondacks. Fir and spruce trees are expected to die out in the Catskills, and New York City’s backup drinking water supply may well be contaminated as a result of seawater making its way farther up the Hudson River.

These possibilities — modeled deep into this century — are detailed in a new assessment of the impact that climate change will have in New York State. The 600-page report, published on Wednesday, was commissioned by the New York State Energy Research and Development Authority, a public-benefit corporation, and is a result of three years of work by scientists at state academic institutions, including Columbia and Cornell Universities and the City University of New York.

Its authors say it is the most detailed study that looks at how changes brought about by a warming Earth — from rising temperatures to more precipitation and global sea level rise — will affect the economy, the ecology and even the social fabric of the state.

Cynthia Rosenzweig, a senior research scientist at Columbia’s Earth Institute, said the report was much broader in scope than earlier efforts by New York City that tried to evaluate how best to prepare for climate change.

“New York City’s report focuses on how climate change will affect critical structures” like bridges and sewage systems, she said. “This report also looks at public health, agriculture, transportation and economics.”

The authors drew on results from global climate models and then created projections for variables like rainfall and temperatures for seven regions across the state. Then they tried to assess how those alterations would play out in specific terms. They also developed adaptation recommendations for different economic sectors.

If carbon emissions continue to increase at their current pace, for example, temperatures are expected to rise across the state by 3 degrees Fahrenheit by the 2020s and by as much as 9 degrees by the 2080s. That would have profound effects on agriculture across the state, the report found. For example, none of the varieties of apples currently grown in New York orchards would be viable. Dairy farms would be less productive as cows faced heat stress. And the state’s forests would be transformed; spruce-fir forests and alpine tundra would disappear as invasive species like kudzu, an aggressive weed, gained more ground.

If the Greenland and West Antarctic ice sheets melt, as the report says could happen, the sea level could rise by as much as 55 inches, which means that beach communities would frequently be inundated by flooding.

“In 2020, nearly 96,000 people in the Long Beach area alone may be at risk from sea-level rise,” the report said, referring to just one oceanfront community on the South Shore of Long Island. “By 2080, that number may rise to more than 114,500 people. The value of property at risk in the Long Beach area under this scenario ranges from about $6.4 billion in 2020 to about $7.2 billion in 2080.”

The report found that the effects of climate change would fall disproportionately on the poor and the disabled.

In coastal areas in New York City and along rivers in upstate New York, it said, there is a large concentration of low-income housing that would be in the path of flooding.

Art DeGaetano, a professor of earth and atmospheric sciences at Cornell, said that the report’s findings need not be interpreted as totally devastating.

“It would be all bad if you wanted a static New York, with the same species of bird and the same crops,” he said, “but there will be opportunities as well. We expect, for example, that New York State will remain water-rich and we may be able to capitalize when other parts of the country are having severe drought.”

The next step, the authors said, is for them to meet with state agencies and try to work with them to carry out some of the report’s recommendations for coping with climate change.

One would be to get the state to routinely incorporate projections of increased sea levels and heavy downpours when building big infrastructure projects. The authors also suggested protecting and nurturing natural barriers to sea-level rise, like coastal wetlands, and changing building codes for things like roof strength and foundation depth in areas that would be hit hardest by storms.

“If there is one thing we learned from Hurricane Irene,” Dr. Rosenzweig said, referring to the tropical storm that pummeled the state this past summer, “it is that we have a lot more we could be doing to prepare.”

Rajendra Pachauri: “Science has been left aside at the COP” (O Estado de São Paulo)

JC e-mail 4398, December 5, 2011.

If science were at the center of the climate-change debate, action could not be postponed, says Indian scientist Rajendra Pachauri, chairman of the Intergovernmental Panel on Climate Change (IPCC).

Indian scientist Rajendra Pachauri, 71, chairman of the Intergovernmental Panel on Climate Change (IPCC), has been following the 17th UN Climate Conference, COP-17, with frustration. The researcher, who gave this interview to Estado in a small VIP room at the Durban convention center, believes that science, and the warnings issued by scientists, are not at the center of the climate negotiations.

For him, securing a second phase of the Kyoto Protocol at this meeting is not necessarily essential, but progress must be made regardless of which agreement is chosen. “I would like there to be a way of making climate science a more central part of the discussions in the negotiations. Because at least then you could say that action cannot be postponed for long. And taking action can actually be attractive, rather than expensive,” he says. The meeting runs until Friday.

Despite the IPCC’s warnings and the recent special report on extreme events showing the impacts of climate change, progress in the negotiations is very slow. How do you assess this situation?
In the negotiations taking place here, we cannot lose sight of the science of climate change. You correctly mentioned that we recently published a special report on extreme events and disasters and on how we can advance adaptation (preparation for those events). I would like to see a discussion much more focused on these issues and on what the global community can do to deal with these impacts.

Do you think it is important to focus more on adaptation?
I think we need to deal with both adaptation and mitigation (cuts in greenhouse-gas emissions). Because we will not be able to adapt to every impact. We can adapt to some situations, but beyond a certain point it becomes very difficult and expensive to do so. So we need to look at mitigation as well. This year we presented a report on renewable energy that clearly shows it is possible to use much more renewable energy and that, with more research into its development, costs can fall. What I am saying is that I would like there to be a way of making climate science a more central part of the discussions in the negotiations. Because at least then you could say that action cannot be postponed for long. And that taking action can actually be attractive, rather than expensive.

So science is not at the center of the discussion today?
It does not seem to be. I am not directly involved in the negotiations, but I follow them, and I do not see science at the center of the debate.

People often say that the IPCC’s scientists were radical and pessimistic. But could new studies show, within two years, that the situation is even more dangerous than predicted?
I don’t know; we are still working on that report. In the special report on extreme events and disasters, we pointed out the areas where we do not yet have much evidence, as well as those where the evidence clearly shows that heat waves will increase, along with extreme precipitation events and sea-level rise, which is a threat to coastal areas. We presented all of this information with great care, in a robust way. No one can say that anyone within the IPCC is radical.

On the contrary, I was going to ask whether the scientists were not being too cautious in the 2007 report.
We have much more evidence today, much more published research on climate change. And the IPCC works by assessing published research (it does not conduct its own). We now have far more studies than in the years leading up to the IPCC’s Fourth Assessment Report, from 2007. We are certainly in a much better position now. Of course, in some parts of the world there are large gaps; we do not have studies everywhere, and that is especially true in the most vulnerable countries.

The discussion in Durban has focused heavily on the Kyoto Protocol. In your opinion, is it important to have a second Kyoto commitment period? Or could another kind of agreement work?
It is very hard to say; there is an enormous range of options that could be accepted. But I would like to see progress, whatever direction is taken, with Kyoto or something else. And, again, if there were a focus on science, perhaps people would see the urgency of acting and decisions would be made more quickly.

Do you believe that emerging countries such as China, India and Brazil should do more, since they are major emitters?
I could not say that, but I would remind you that, under the climate convention, responsibilities are common but differentiated (industrialized countries, the largest historical emitters, bear the greatest responsibilities). And that is why these negotiations take place. Looking back over the 20 years since the climate convention was created, the world has not done enough. And emissions are still rising. So I am not at all sure that what we have had so far has really been very effective. Perhaps what we need now is something more effective, something that meets the objective of preventing anthropogenic (human) interference in the climate system, which is the central goal of the UN climate convention.

In 2009, shortly before COP-15, there was the episode that became known as Climategate, when scientists’ emails were exposed. Are you afraid of hackers or wiretaps?
All of that is criminal, and a person cannot let fear stop them from doing what is expected of them. We have to carry on with our work, and that is what we are trying to do.

But do you receive threats?
Yes, but I would rather not talk about them.

What did the IPCC learn from the Himalaya error?
First, let me put that error in perspective. We had a 3,000-page report and thousands of data points. And the single piece of information in which we made the error, the claim that the Himalayan glaciers would disappear by 2035, was not in the technical summary, the summary for policymakers or the synthesis report. It appeared only in the main report, which is essentially scientific and is not aimed at policymakers. So in no way were we trying to draw policymakers’ attention to that erroneous figure. Frankly, we did not know about the error. We now have stronger procedures, more steps to follow and a correction protocol. All of this will help us handle a situation like that much better in the future.

Do you think that making the work more bureaucratic, with more reviews and corrections, could drive scientists away from the IPCC?
Our institution has a responsibility to society. So if we do not have a system in which an error can be corrected, then clearly there is a deficit. It is our responsibility to pursue a system in which errors, once they surface, can be investigated and then corrected. And we did not have that in the past.

And do scientists still want to be associated with the IPCC? Is it important for their careers?
Absolutely. I don’t know about their careers, but certainly for the sense of professional pride. For the IPCC’s Fifth Assessment Report we had a record number of nominations, around 3,000, from which we selected 831. That was at least 50 percent more than for the Fourth Assessment Report, and it shows that the scientific community is enthusiastic about working with the IPCC.

Do you think the economic crisis is affecting the negotiations and governments’ actions?
I think so. But that is why I believe the primacy of science must be maintained. Let’s face it: the economic crisis should be resolved in two, three, four years, something like that. But the problem of climate change is here for good. So we cannot let ourselves be blinded by short-term considerations.

What do you expect from Rio+20?
It is hard to predict what the meeting will be like. But I hope it marks a turning point in our way of thinking and in our attitudes. It is time to look at the long-term implications of what we are doing and to make some decisions. I hope Rio+20 marks a change in the way the human race thinks.

UN secretary says actions respond to science: The executive secretary of the UN climate convention, Christiana Figueres, denied to Estado that science does not occupy a central position in the negotiations. “If science did not say we have a problem, we would not be here. The convention exists precisely in response to science, and it is always attentive to science’s progress.” According to her, the convention will follow the IPCC’s Fifth Assessment Report, to be published in 2013 and 2014.
(O Estado de São Paulo – 4/12)

US will not air climate change episode of Frozen Planet (New Statesman)

Posted by Samira Shackle – 17 November 2011 13:38

BBC defends decision to give world TV channels the option of dropping the final episode of David Attenborough’s series.

The final episode of David Attenborough’s Frozen Planet will not be aired in the US. Photograph: Getty Images

An episode of David Attenborough’s Frozen Planet series that looks at climate change will not be aired in the US, where many are sceptical about global warming.

Seven episodes of the multi-million-pound nature documentary series will be aired in Britain. However, the series has been sold to 30 world TV networks as a package of only six episodes. These networks then have the option of buying the seventh “companion” episode — which explores the effect man is having on the natural world — as well as behind the scenes footage.

Of the 30 broadcasters that bought the six-episode series, ten, including the US broadcaster, have declined to use the climate change episode, “On Thin Ice”.

In America, the series is being aired by the Discovery channel, which insists that the final episode has been dropped because of a “scheduling issue”.

Regardless of their reasoning, environmental campaigners have criticised the BBC’s decision to market the episode separately as “unhelpful”. And it has caused controversy across the board. The Telegraph‘s headline (“BBC drops Frozen Planet’s climate change episode to sell show better abroad”) sums up how the news has been received.

However, the BBC have defended the decision, claiming that it is more to do with a difference in style in this episode than its content. Caroline Torrance, BBC Worldwide’s Director of Programme Investment, wrote in a blog that the first six episodes “have a clear story arc charting a year in our polar regions”, adding:

Although it is filmed by the same team and to the same production standard, this programme is necessarily different in style.

Having a presenter in vision requires many broadcasters to have the programme dubbed, ultimately giving some audiences a very different experience.

Audiences are currently enjoying incredible footage of the natural world; it would be a shame for them to leave without a sense of the danger it faces.

Corporations spending billions to exert ‘undue influence’ to prevent global climate action: report (Canada.com)

BY MIKE DE SOUZA, POSTMEDIA NEWS NOVEMBER 23, 2011

Oilsands file photo. Photograph by: Bruce Edwards, The Journal, File, Edmonton Journal

A handful of multinational corporations are “exerting undue influence” on the political process in Canada, the U.S. and other key nations to delay international action on climate change, alleges a new report released Tuesday by Greenpeace International.

The report documents a series of alleged lobbying and marketing efforts led by major corporations and industry associations, representing oil and gas companies as well as other major sources of pollution in Canada, the U.S., Europe and South Africa, which is hosting an international climate-change summit that begins next Monday.

South of Canada’s borders, industry stakeholders are investing about $3.5 billion per year to lobby the U.S. government on a variety of issues, as well as financing American politicians who “deny” scientific evidence linking human activity to dangerous changes in the atmosphere that contribute to global warming, estimates the report, titled: Who’s holding us back? How carbon intensive industry is preventing effective climate legislation.

“Carbon-intensive corporations and their networks of trade associations are blocking policies that aim to transition our societies into green, sustainable, low risk economies,” said the report, authored by Greenpeace staff from around the world, based on national lobbying registries and other public records from government and industry.

“These polluting corporations often exert their influence behind the scenes, employing a variety of techniques, including using trade associations and think-tanks as front groups; confusing the public through climate denial or advertising campaigns; making corporate political donations; as well as making use of the ‘revolving door’ between public servants and carbon-intensive corporations.”

The report raises questions about the activities of energy companies including Shell, Koch Industries and Eskom, as well as BASF, a chemical products company; BHP Billiton, a mining company; and ArcelorMittal, a steel company created from a merger that followed the takeover of Canadian-based Dofasco by Europe-based Arcelor.

Most nations at the upcoming international summit in Durban, South Africa, have publicly said they hope to extend targets to reduce pollution under the Kyoto Protocol, the world’s only legally-binding treaty on global warming. But Canada, along with Japan and Russia, has openly indicated that it plans to walk away from the agreement which set targets for developed nations between 2008 and 2012 as a first step toward stabilizing greenhouse gas emissions in the atmosphere.

“Canada goes to Durban with a number of countries sharing the same objectives and that is to put Kyoto behind us and to encourage all nations and all major emitting countries to embrace a new agreement to reduce greenhouse gas in a material way,” Environment Minister Peter Kent said Tuesday in the House of Commons in response to questions from NDP environment critic Megan Leslie.

Representatives of the Canadian Association of Petroleum Producers, one of the lobby groups singled out in the report, have said the association supports balanced climate and energy policies that allow all energy sources to grow to meet rising demand in the decades to come. In the meantime, the association says its member companies are already adapting to new policies and pollution taxes in jurisdictions such as Alberta and British Columbia, while investing in new technologies to prepare for stronger standards in the future.

Natural deposits in Western Canada, also known as the oilsands, are believed to contain one of the largest reserves of oil in the world, but they require large amounts of energy, land and water to extract the fuel from the ground, with an annual global warming footprint that has almost tripled since 1990. The annual greenhouse gas emissions from this sector are now greater than those of all cars on Canadian roads and almost as much as the pollution from all light-duty trucks or sport utility vehicles driven in Canada.

The Canadian lobby group has opposed policies in jurisdictions such as the U.S. and the European Union that would discourage consumption of fuel derived from the oilsands or other sources that have a heavier footprint than conventional sources of oil.

The report also says the federal and Alberta governments have been partners in a taxpayer-funded “advocacy strategy” led by Canada’s Foreign Affairs Department to fight international climate-change policies and “promote the interests of oil companies.”

Prime Minister Stephen Harper’s government and its Liberal predecessors have repeatedly pledged to regulate pollution from the industry without following through on their commitments. Kent also promised to introduce a plan to tackle emissions from the oilsands sector this year, but later retreated on the commitment.

“The reason that Canada has actually made it in here (the report), is because the Harper government has acted with and on behalf of tarsands companies to undermine international action on climate change,” said Greenpeace Canada climate and energy campaigner Keith Stewart. “When we look at this globally, if we’re serious about avoiding climate catastrophe, we can’t afford to let the Harper government and the tarsands industry grow the markets of dirty oil at the expense of cleaner alternatives.”

The report highlighted a pattern of industry lobby groups and chambers of commerce running advertising campaigns against any proposals to tackle climate change by warning people in the general public that their respective countries were acting alone and would kill jobs by adopting measures to reduce pollution. It also noted that some companies, which claim to defend action on climate change, are actively supporting industry associations that are seeking to undermine progress on the issue.

The Greenpeace report also coincides with the mysterious release on Tuesday of emails from a British-based climate research unit that was at the heart of controversy prior to a 2009 climate change summit when the stolen correspondence was used by climate skeptics to allege an international conspiracy by scientists to mislead the planet about the consequences of rising greenhouse gas emissions.

A series of independent inquiries have dismissed the conspiracy theories and cleared the scientists involved of any wrongdoing, but those responsible for stealing the emails were never caught.

SUMMARY OF THE 34TH SESSION OF THE INTERGOVERNMENTAL PANEL ON CLIMATE CHANGE (Earth Negotiations Bulletin)

Volume 12 Number 522 – Monday, 21 November 2011

The 34th session of the Intergovernmental Panel on Climate Change (IPCC) was held from 18-19 November 2011 in Kampala, Uganda. The session was attended by more than two hundred participants, including representatives from governments, the United Nations, and intergovernmental and observer organizations. Participants focused primarily on the workstreams resulting from the consideration of the InterAcademy Council (IAC) Review of the IPCC processes and procedures, namely those on: procedures, conflict of interest policy, and communications strategy.

The Panel adopted the revised Procedures for the Preparation, Review, Acceptance, Adoption, Approval and Publication of IPCC Reports, as well as the Implementation Procedures and Disclosure Form for the Conflict of Interest Policy. The Panel also formally accepted the Summary for Policy Makers (SPM) of the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX), approved by WGs I and II at their joint meeting from 14-17 November 2011. Delegates also addressed issues such as the programme and budget, matters related to other international bodies, and progress reports.

A BRIEF HISTORY OF THE IPCC

The IPCC was established in 1988 by the World Meteorological Organization (WMO) and the UN Environment Programme (UNEP). Its purpose is to assess scientific, technical and socio-economic information relevant to understanding the risks associated with human-induced climate change, its potential impacts, and options for adaptation and mitigation. The IPCC does not undertake new research, nor does it monitor climate-related data, but it conducts assessments on the basis of published and peer-reviewed scientific and technical literature.

The IPCC has three Working Groups (WGs): WGI addresses the scientific aspects of the climate system and climate change; WGII addresses the vulnerability of socio-economic and natural systems to climate change, impacts of climate change and adaptation options; and WGIII addresses options for limiting greenhouse gas emissions and mitigating climate change. Each WG has two Co-Chairs and six Vice-Chairs, except WGIII, which for the Fifth Assessment cycle has three Co-Chairs. The Co-Chairs guide the WGs in fulfilling the mandates given to them by the Panel and are assisted in this task by Technical Support Units (TSUs).

The IPCC also has a Task Force on National Greenhouse Gas Inventories (TFI). TFI oversees the IPCC National Greenhouse Gas Inventories Programme, which aims to develop and refine an internationally agreed methodology and software for the calculation and reporting of national greenhouse gas emissions and removals, and to encourage the use of this methodology by parties to the United Nations Framework Convention on Climate Change (UNFCCC). The Task Group on Data and Scenario Support for Impact and Climate Analysis (TGICA) is an entity set up to address WG needs for data, especially WGII and WGIII. The TGICA facilitates distribution and application of climate change related data and scenarios, and oversees a Data Distribution Centre, which provides data sets, scenarios of climate change and other environmental and socio-economic conditions, and other materials.

The IPCC Bureau is elected by the Panel for the duration of the preparation of an IPCC assessment report (approximately six years). Its role is to assist the IPCC Chair in planning, coordinating and monitoring the work of the IPCC. The Bureau is composed of climate change experts representing all regions. Currently, the Bureau comprises 31 members: the Chair of the IPCC, the Co-Chairs of the three WGs and the Bureau of the TFI (TFB), the IPCC Vice-Chairs, and the Vice-Chairs of the three WGs. The IPCC Secretariat is located in Geneva, Switzerland, and is hosted by the WMO.

IPCC PRODUCTS: Since its inception, the IPCC has prepared a series of comprehensive assessments, special reports and technical papers that provide scientific information on climate change to the international community and are subject to extensive review by experts and governments.

The IPCC has so far undertaken four comprehensive assessments of climate change, each credited with playing a key role in advancing negotiations under the UNFCCC: the First Assessment Report was completed in 1990; the Second Assessment Report in 1995; the Third Assessment Report in 2001; and the Fourth Assessment Report (AR4) in 2007. At its 28th session in 2008, the IPCC decided to undertake a Fifth Assessment Report (AR5) to be completed in 2014.

The latest Assessment Reports are structured into three volumes, one for each WG. Each volume comprises an SPM, a Technical Summary and an underlying assessment report. All assessment sections of the reports undergo a thorough review process, which takes place in three stages: a first review by experts; a second review by experts and governments; and a third review by governments. Each SPM is approved line-by-line by the respective WG. The Assessment Report also includes a Synthesis Report (SYR), highlighting the most relevant aspects of the three WG reports, and an SPM of the SYR, which is approved line-by-line by the Panel. More than 450 lead authors, 800 contributing authors, 2500 expert reviewers and 130 governments participated in the elaboration of the AR4.

In addition to the comprehensive assessments, the IPCC produces special reports, methodology reports and technical papers, focusing on specific issues related to climate change. Special reports prepared by the IPCC include: Aviation and the Global Atmosphere (1999); Land Use, Land-use Change and Forestry (2000); Methodological and Technical Issues in Technology Transfer (2000); Safeguarding the Ozone Layer and the Global Climate System (2005); Carbon Dioxide Capture and Storage (2005); Renewable Energy Sources and Climate Change Mitigation (SRREN) (2011); and, most recently, the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX) (2011). Technical papers have been prepared on Climate Change and Biodiversity (2002) and on Climate Change and Water (2008), among others.

The IPCC also produces methodology reports or guidelines to assist countries in reporting on greenhouse gases. The IPCC Guidelines for National Greenhouse Gas Inventories were first released in 1994 and a revised set was completed in 1996. Additional Good Practice Guidance reports were approved by the Panel in 2000 and 2003. The latest version, the IPCC Guidelines on National Greenhouse Gas Inventories, was approved by the Panel in 2006.

For all this work and its efforts to “build up and disseminate greater knowledge about manmade climate change, and to lay the foundations that are needed to counteract such change,” the IPCC was awarded the Nobel Peace Prize, jointly with former US Vice President Al Gore, in December 2007.

IPCC-28: This session was held from 9-10 April 2008, in Budapest, Hungary, with discussions centering on the future of the IPCC, including key aspects of its work programme such as WG structure, main type and timing of future reports, and the future structure of the IPCC Bureau and the TFB. At this session, the IPCC agreed to prepare the AR5 and to retain the current structure of its WGs. In order to enable significant use of new scenarios in the AR5, the Panel requested the Bureau to ensure delivery of the WGI report by early 2013 and completion of the other WG reports and the SYR at the earliest feasible date in 2014. The Panel also agreed to prepare the SRREN Report, to be completed by 2010. Earth Negotiations Bulletin coverage of IPCC-28 can be found at: http://www.iisd.ca/climate/ipcc28

IPCC-29: This session, which commemorated the IPCC’s 20th anniversary, was held from 31 August to 4 September 2008, in Geneva, Switzerland. At this time, the Panel elected the new IPCC Bureau and the TFB, and re-elected Rajendra Pachauri (India) as IPCC Chair. The Panel also continued its discussions on the future of the IPCC and agreed to create a scholarship fund for young climate change scientists from developing countries with the funds from the Nobel Peace Prize. It also asked the Bureau to consider a scoping meeting on the SREX, which took place from 23-26 March 2009 in Oslo, Norway. Earth Negotiations Bulletin coverage of IPCC-29 can be found at: http://www.iisd.ca/climate/ipcc29

IPCC-30: This session was held from 21-23 April 2009 in Antalya, Turkey. At the meeting, the Panel focused mainly on the near-term future of the IPCC and provided guidance for an AR5 scoping meeting, which was held in Venice, Italy, from 13-17 July 2009. The Panel also gathered climate change experts to propose the chapter outlines of WG contributions to the AR5. Earth Negotiations Bulletin coverage of IPCC-30 can be found at: http://www.iisd.ca/climate/ipcc30

IPCC-31: This session was held from 26-29 October 2009 in Bali, Indonesia. Discussions focused on approval of the proposed AR5 chapter outlines developed by participants at the Venice scoping meeting. The Panel also considered progress on the implementation of decisions taken at IPCC 30 regarding the involvement of scientists from developing countries and countries with economies in transition, use of electronic technologies, and the longer-term future of the IPCC. Earth Negotiations Bulletin coverage of IPCC 31 can be found at: http://www.iisd.ca/climate/ipcc31

INTERACADEMY COUNCIL REVIEW: In response to public criticism of the IPCC related to inaccuracies in the AR4 and the Panel’s response, as well as questions about the integrity of some of its members, UN Secretary-General Ban Ki-moon and IPCC Chair Rajendra Pachauri requested the IAC to conduct an independent review of the IPCC processes and procedures and to present recommendations to strengthen the IPCC and ensure the on-going quality of its reports. The IAC presented its results in a report in August 2010. The IAC Review makes recommendations regarding: management structure; a communications strategy, including a plan to respond to crises; transparency, including criteria for selecting participants and the type of scientific and technical information to be assessed; and consistency in how the WGs characterize uncertainty.

IPCC-32: This session, held from 11-14 October 2010 in Busan, Republic of Korea, addressed the recommendations of the IAC Review. The Panel adopted a number of decisions in response to the IAC Review, including on the treatment of grey literature and uncertainty, and on a process to address errors in previous reports. To address recommendations that required further examination, the Panel established task groups on processes and procedures, communications, conflict of interest policy, and management and governance. The Panel also accepted a revised outline for the AR5 SYR. Earth Negotiations Bulletin coverage of IPCC-32 can be found at: http://www.iisd.ca/climate/ipcc32

SRREN: The eleventh session of WGIII met from 5-8 May 2011 in Abu Dhabi, United Arab Emirates, and approved the Special Report on Renewable Energy Sources and Climate Change Mitigation (SRREN) and its SPM. Discussions focused, among others, on chapters addressing sustainable development, biomass and policy. Key findings of the SRREN include that the technical potential for renewable energies is substantially higher than projected future energy demand, and that renewable energies play a crucial role in all mitigation scenarios.

IPCC-33: The session, held from 10-13 May 2011 in Abu Dhabi, United Arab Emirates, focused primarily on follow-up actions to the IAC Review of the IPCC processes and procedures. The Panel decided to establish an Executive Committee, adopted a Conflict of Interest Policy, and introduced several changes to the rules of procedure. The Panel also endorsed the actions of WGIII in relation to SRREN and its SPM and considered progress on the preparation of the AR5. Earth Negotiations Bulletin coverage of IPCC 33 can be found at: http://www.iisd.ca/vol12/enb12500e.html

SREX: The first joint session of IPCC WGs I and II, which took place from 14-17 November 2011 in Kampala, Uganda, accepted the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX) and approved its SPM. The SREX addressed the interaction of climatic, environmental and human factors leading to adverse impacts of climate extremes and disasters, options for managing the risks posed by impacts and disasters, and the important role that non-climatic factors play in determining impacts.

IPCC-34 REPORT

IPCC Chair Rajendra Pachauri opened the 34th session of the Intergovernmental Panel on Climate Change on Friday, 18 November 2011, highlighting ongoing work related to the Fifth Assessment Report (AR5) and progress in the implementation of the InterAcademy Council (IAC) recommendations. He also referred to the communications strategy and the need to ensure policy relevance and reach out to policymakers. Pachauri said it was critically important that the results of the Special Report on Renewable Energy Sources and Climate Change Mitigation (SRREN) and the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX) be presented to the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) in Durban, South Africa. He emphasized the significance of the meeting being held in Africa, given the findings related to climate change impacts and development challenges in the region, and thanked Uganda for hosting the meeting and Norway for its support.

Norwegian Ambassador Thorbjørn Gaustadsæther highlighted that the SREX is an important tool for understanding, taking action, and making decisions on managing the risks of extreme events and disasters. He noted that extreme weather events and their negative impacts are apparent everywhere, including in Uganda, where fishermen on Lake Victoria experience reduced catches, as well as in his native Norway, which experiences dramatic flooding, shrinking Arctic ice and other events. He said the SREX would be presented to governments at the Durban UNFCCC meeting and would provide a good basis for them to take action. He thanked the Ugandan government for its hospitality and said Norway was pleased to have contributed to the organization of the meeting.

Peter Gilruth, on behalf of UNEP Executive Director Achim Steiner, stressed the potential of the SREX, including as a foundation on which the disaster risk reduction and the climate change communities can build stronger bridges, and as a basis for environment and development work. He noted various UNEP initiatives and assessment reports, including the Programme of Research on Climate Change Vulnerability, Impacts and Adaptation, the fifth Global Environmental Outlook and the Emissions Gap Assessment, and invited delegates to participate in the “Eye on Earth” summit in December to build partnerships on knowledge sharing.

Florin Vladu, on behalf of Christiana Figueres, Executive Secretary of the UNFCCC, updated the plenary on developments in the negotiating process, highlighting the achievements of the Cancun Agreements in establishing an institutional infrastructure, but noting a failure to address the future of the Kyoto Protocol and a mitigation framework. Vladu said that in Durban countries face a challenge to find a viable way forward, but expressed hope that the conference will help build confidence in post-2012 climate finance through clarity on long-term finance and making the Green Climate Fund operational. Vladu highlighted that the UNFCCC process has benefited from an active research dialogue with the IPCC, most recently in the form of a presentation on the SRREN at the Subsidiary Body for Scientific and Technological Advice (SBSTA) session in June 2011. He also noted the special role of the IPCC in the UNFCCC review, scheduled to commence in 2013, of the adequacy of the goal of limiting the average global temperature increase to below 2 degrees Celsius and of overall progress towards achieving this goal. On SREX, he said the report would contribute to the work of SBSTA, as well as to the Adaptation Framework and the work programme on loss and damage, once those become operational.

Noting that this has been a transformative year for the IPCC, Jeremiah Lengoasa, on behalf of World Meteorological Organization (WMO) Secretary-General Michel Jarraud, reaffirmed support for the work of the Panel and emphasized the importance of the IPCC’s work and procedures remaining relevant and timely. He welcomed the AR5 preparations moving ahead as scheduled and stressed that the AR5 will provide a strong basis for decision-making, including in relation to water resources, agriculture and food security. He also highlighted the role of the WMO Global Framework for Climate Services, to be launched in the near future, to further assist in decision-making.

Maria Mutagamba, Minister for Water and Environment, Uganda, expressed warm greetings from the people of Uganda and welcomed delegates to the country traditionally known as the Pearl of Africa. She said that it is with great pride that Uganda continues to participate actively in the work of the IPCC and hosts this meeting, and thanked Norway, which co-funded the session. She said that Uganda has already started experiencing extreme weather events attributed to climate change such as severe droughts, floods and increased frequency of landslides. Highlighting the inevitability of climate change, she noted that her country has adaptation policies in place. On mitigation, she underlined Uganda’s early efforts under the Clean Development Mechanism. She further noted the need to strengthen national meteorological and hydrological services in developing countries and thus expressed support for the WMO Global Framework for Climate Services. She also suggested the IPCC continue to consider the role of indigenous knowledge in areas where peer-reviewed literature is unavailable or insufficient as well as issues of technology transfer to developing countries and dissemination of information.

The Panel then observed a minute of silence to mark the untimely passing of Mama Konate, UNFCCC SBSTA Chair and IPCC colleague.

APPROVAL OF THE DRAFT REPORT OF THE 33RD SESSION

The draft report of IPCC-33 (IPCC-XXXIV/Doc. 2, Rev.1) was adopted on Friday morning with a minor editorial amendment. Belgium noted the lack of reference in the meeting minutes to the Expert Meeting on Geoengineering and the participation of media representatives at that meeting.

SPECIAL REPORT ON EXTREME EVENTS AND DISASTERS

This issue (IPCC-XXXIV/Doc. 21) was taken up by the plenary on Friday morning. The IPCC plenary formally accepted the actions taken at the Joint Session of Working Groups I and II on the SREX, including approving its Summary for Policymakers (SPM). Underscoring the importance and usefulness of the SREX, Austria said that, among others, this landmark report introduces terminology that can be understood by both the risk management and climate change communities, identifies a range of practices and options to reduce risk, and provides clarity on the most vulnerable sectors, groups and areas, making it of tremendous use for taking appropriate action.

PREPARATION OF THE FIFTH ASSESSMENT REPORT (AR5)

The item (IPCC-XXXIV/Doc. 5) was presented to the plenary on Friday afternoon. Chair Pachauri recalled that the Panel had issued a clear mandate to start very early with the AR5 Synthesis Report (SYR), and Leo Meyer, Head of the SYR Technical Support Unit (TSU), reported on process and management issues related to the SYR (IPCC-XXXIV/Doc. 5). Meyer noted, inter alia: the inclusion of the IPCC Vice-Chairs on the SYR writing team since they have responsibilities related to cross-cutting issues; the possibility of a workshop on UNFCCC Article 2, which could feed into the UNFCCC review of the adequacy of the Convention’s ultimate goal; and the suggestion to reduce the time of eight weeks allowed for government comments on the final draft of the SPM to six weeks given the compressed timeline of the SYR.

On the time frame, the US suggested, and the Panel agreed, to allow seven weeks for government comments instead of the six weeks proposed.

With regard to a possible workshop on UNFCCC Article 2, Chair Pachauri suggested inviting general comments by governments. Emphasizing the importance of the IPCC retaining distance from the policy process, the US, supported by New Zealand, Canada, Saudi Arabia and others, opposed the suggestion. Saudi Arabia underscored that the issue of Article 2 is very sensitive. The Panel agreed to have the Bureau consider the matter at its next meeting.

REVIEW OF THE IPCC PROCESSES AND PROCEDURES

CONFLICT OF INTEREST POLICY: This issue (IPCC-XXXIV/Doc. 8, Rev. 1) was first addressed in the plenary on Friday and then in several meetings of a contact group co-chaired by Andrej Kranjc (Slovenia) and Jongikhaya Witi (South Africa), with Samuel Duffett (UK) as Rapporteur. The workstream on the Conflict of Interest (COI) Policy arose in response to the recommendations made in the IAC Review to develop and adopt a rigorous COI Policy. At IPCC-33 delegates adopted a COI Policy and extended the mandate of the Task Group on COI in order to develop proposals for annexes to the COI Policy covering Implementation Procedures and the Disclosure Form.

Contact group discussions focused on the draft Implementation Procedures prepared by the Task Group. During the group’s first meeting, Co-Chair Kranjc noted that the Task Group had held four teleconferences between sessions and that the WGs already have experience applying the COI Policy on an interim basis. Rapporteur Duffett then explained the proposed decision-making process on COI, noting there would be different procedures for Bureaux members and non-Bureaux members.

The discussions centered on several issues, including: which body determines whether an individual has a COI; the role of the COI Expert Advisory Group; which body is responsible for the final decision in cases of COI; cases of tolerance of COI for non-Bureaux members; and principles for considering COI issues.

On a body to determine whether an individual has a COI, the proposal of the Task Group was to form a special committee comprised of representatives from each of the six WMO regional groups. Some participants noted that implementation of COI policies is a relatively simple and technical procedure and in most cases there is no COI, so it would be an additional burden to establish a new committee and conduct elections for its members. In this regard, they suggested making use of existing bodies and assigning this function to the Executive Committee. They also suggested that the Executive Committee members would be the ones most interested in maintaining the integrity of the IPCC. Others expressed concern about Bureaux members who are part of the Executive Committee making decisions on their own COI. A compromise was reached on establishing a COI Committee composed of voting members of the Executive Committee and representatives of WMO and UNEP, with a recusal clause.

Delegates also developed principles for considering COI issues, introducing those in relation to exploring options for resolution of COI and an appeals procedure. The group added a provision requiring members of bodies involved in considering COI issues to recuse themselves from a discussion on their own COI.

The Task Group proposed that the Expert Advisory Group, which would be comprised of three representatives from WMO and UNEP, review the COI forms of Bureaux nominees. However, some expressed concern about this approach, and a change was introduced so that the COI Committee consults the Expert Advisory Group when it deems this necessary.

Further discussion took place on which body would be responsible for a final decision on COI. An opinion was expressed that all final decisions should be made in plenary; however, others raised concerns about maintaining the confidentiality of personal information in that case. The contact group elaborated on an appeals procedure, assigning a function to the IPCC Bureau to review a COI determination on request by the individual in question.

On COI in relation to non-Bureaux members, several supported some flexibility in this regard, since experts in some areas are few and often affiliated with industries or organizations. Delegates developed the relevant procedures on the tolerance of COI in such cases.

In the final plenary, the Panel adopted the Implementation Procedures and Disclosure Form for the COI Policy with minor editorial corrections. Chair Pachauri said COI was clearly one of the trickiest and most complex issues to address in relation to the IAC Review.

The US expressed its satisfaction with an “excellent” outcome on COI, in particular regarding the creation of a body that will implement the COI Policy effectively and very soon, composed of those with a strong interest in ensuring the integrity of its outcomes.

Canada noted that the contact group discussions were exceedingly positive and that the Implementation Procedures for the COI Policy will provide an effective process to promote transparency. The Netherlands underlined the enormous importance of the documents on COI for the transparency and integrity of the Panel, and its acceptance by the outside world. Thanking all members of the Task Group, Australia congratulated the plenary on a “groundbreaking” COI mechanism for many international organizations, both in substance and in the procedure of how it was developed.

Secretary Christ asked the plenary how the set of documents on COI should be integrated into IPCC regulations and suggested a paragraph be added that states these documents constitute an appendix to the Principles Governing the IPCC Work. To this, the US replied that more consideration is needed before the documents are elevated to the level of principles and suggested leaving them as standalone documents. The Panel agreed to the suggestion.

Final Decision: In its decision, the Panel, inter alia:

adopts the COI Implementation Procedures and decides that the Procedures will apply to individuals who are subject to the COI Policy;
decides to establish a COI Committee comprising all elected members of the Executive Committee and two additional members with appropriate legal expertise from UNEP and WMO, appointed by those organizations;
decides to establish an Expert Advisory Group on COI and invites the Secretary-General of WMO and the Executive Director of UNEP to select members of the COI Expert Advisory Group and to facilitate the establishment of the COI committee as soon as possible;
notes that the WG and Task Force Bureaux have adopted interim arrangements for dealing with COI issues and that those arrangements are broadly consistent with the COI Policy;
decides that, to ensure a smooth transition, the existing interim arrangements will continue to operate with respect to individuals who are not Bureaux members until the Executive Committee decides that the Implementation Procedures apply to those individuals;
requests IPCC and TFI Bureaux members to submit a COI Form to the Secretariat within three months;
decides to receive a report on the operation of the COI Expert Advisory Group and the COI Committee within twelve months of their establishment and to review their operations, as appropriate, within twelve months after the next Bureaux election(s); and
notes that the COI Committee will develop its own methods of working and will apply those on an interim basis pending approval by the Panel, and decides that the COI Committee should submit its methods of working to the Panel within twelve months of its establishment.
Implementation Procedures: The Procedures address the following:

The overall purpose of the Implementation Procedures is to ensure that COIs are identified, communicated to the relevant parties and managed so as to avoid any adverse impact on IPCC balance, products and processes, and also to protect the individual, the IPCC and the public interest.
In their scope, the Implementation Procedures apply to all COIs and all individuals defined in the COI Policy, and compliance with the COI Policy and the Procedures is mandatory.
The Implementation Procedures further set out the review process on COI for IPCC and Task Force Bureaux members prior to and after their appointment. According to this process, the COI Disclosure Forms for all nominees should be submitted to the Secretariat to be reviewed by a COI Committee. The COI Committee may request advice from the Expert Advisory Group on COI. If the COI Committee determines that a nominee has a COI that cannot be resolved, the individual will not be eligible for election to the Bureau.
The Implementation Procedures also outline the review process for Coordinating Lead Authors, Lead Authors, Review Editors and TSUs prior to and after their appointment. In this case, Disclosure Forms are submitted to the relevant TSUs and reviewed by the WG or Task Force Bureaux. The document defines exceptional circumstances in which a COI in relation to non-Bureaux members may be tolerated, that is, when an individual can provide a unique contribution and when the COI can be managed. Such cases should be disclosed. The document also outlines the process to deal with a COI after the appointment of non-Bureaux members, including updating information, review and an appeal procedure.
The Implementation Procedures set out principles for considering COI issues that are applied to all bodies involved in advising on and deciding COI issues. In this regard, they require those bodies to consult the relevant individual regarding potential COIs and explore the resolution options as well as provide for an appeal procedure. The document also requires members of the bodies involved in consideration of COI issues to recuse themselves when being a subject of consideration.
The Implementation Procedures further contain provisions on processing and storage of information to ensure confidentiality of submitted information.
The document further sets out the composition and functions of the COI Committee and Expert Advisory Group on COI.
Annex B to the Implementation Procedures also contains a COI Disclosure Form.
PROCEDURES: This issue (IPCC-XXXIV/Doc. 9, Add. 1) was first introduced in the plenary on Friday and then taken up by a contact group co-chaired by Eduardo Calvo (Peru) and Øyvind Christophersen (Norway), with Arthur Petersen (Netherlands) as Rapporteur. Work centered on the finalization of revisions to the Appendix A to the Principles Governing IPCC Work: Procedures for the Preparation, Review, Acceptance, Adoption, Approval and Publication of IPCC Reports, which started at IPCC-32. The Panel adopted the revised Procedures Appendix in plenary on Saturday, completing the work of the Task Group on Procedures.

Discussions in the contact group centered on the production and treatment of guidance material, the selection of participants to IPCC workshops and expert meetings, matters related to the transparency, quality and efficiency of the review process, anonymous expert review, and SPM approval sessions.

On guidance material, Belgium and others called for stating that guidance material needs to be taken into account in the preparation of the reports, in addition to stating what guidance material is, while others cautioned against excessively normative language. The group agreed to leave the text as is.

On the selection of participants to IPCC workshops and expert meetings, the group addressed text related to the distinction between these two types of meetings.

On matters related to the transparency, quality and efficiency of the review process, the group considered the Revised Guidance Note on the Role of Review Editors (IPCC-XXXIV/Doc. 9, Add.1) prepared by the WG and TFI Bureaux. The group also addressed the current practice of expanding the number of Review Editors per chapter. After some discussion, the group agreed that there was a need to limit the number of Review Editors to four per chapter.

On text related to open invitations for expert reviewers, recommendations were made for WG/TFB Co-Chairs to circulate Second Order Draft Reports, in addition to First Order Draft Reports, for review. In relation to inviting as wide a group of experts as possible, Review Editors were added to the list of potentially nominated experts. Text was also added on notifying Government Focal Points when this process starts.

On anonymous expert review, the group discussed the need to ensure appropriate flexibility and agreed to add text clarifying that the procedures do not require the WGs and the TFI to use either anonymous or named expert reviews. In order to document past experience with anonymous expert reviews by WGIII and the TFI during the AR4, the group agreed to include the Note by the Task Group on Procedures on IPCC Anonymous Expert Review: Past experiences and arguments in favor or against (Appendix 3 of IPCC-XXXIV/Doc. 9) in an annex to the Report of IPCC-34.

On the process for the SPM approval, the group addressed text on the process for sending government comments to the Second Order Draft prior to the plenary approval session of the SPM, bringing the procedures in line with current practice.

During the final plenary, Austria noted that, although important progress was made, there is a need to further strengthen the Procedures, in particular related to the calibrated uncertainty language of assessments, to increase transparency and traceability of the decisions of authors so these can be understood in the future. He also proposed further addressing the management and working rules for the writing teams so they are the same across WGs. With regard to calibrated language, New Zealand drew attention to the existing Guidance Paper on Uncertainties and cautioned against having the Panel decide on this, stressing that this should be the province of the WGs.

The European Union (EU) asked for clarification on whether participating organizations are also considered in the round of comments by governments for SPM approval. Co-Chair Christophersen responded that this was not brought up or considered by the group. The EU noted that it would be useful to introduce this in the future given the EU’s particular character. Australia proposed, and the Panel agreed, to record the EU’s concern in the minutes of the meeting along with Austria’s suggestion.

Final Decision: The decision on Procedures addresses the following:

On the IPCC guidance material, the Panel decides that guidance material is a category of IPCC supporting material aimed to guide and assist in the preparation of IPCC reports and Technical Papers. The Panel also clarifies who is responsible and who may commission guidance material.
On selection of participants to IPCC Workshops and Expert Meetings, the Panel elaborates on the distinction between these two types of meetings, including their composition, and establishes that the WG/TFI Bureaux or the IPCC Chair will report to the IPCC Bureau and Panel on the process of selection of participants, including a description of how the selection criteria have been applied.
On matters related to transparency, quality and efficiency of the review process, the IPCC welcomes the revised Guidance Note on Review Editors and finds that the recommendations of the IAC on the Review Editors have been taken adequately into account. The Panel also encourages the implementation of this revised Guidance Note in the AR5 and invites the WG Co-Chairs to monitor progress in their WG progress reports. In addition, the Panel decides that to provide a balanced and complete assessment of current information, each WG/TFI Bureau should normally select two to four Review Editors per chapter and per technical summary of each Report. Furthermore, it decides that the WG/TFI Bureaux shall seek the participation of reviewers encompassing the range of scientific, technical and socio-economic views, expertise, and geographical representation, and shall actively undertake to promote and invite as wide a range of experts as possible.
On anonymous expert review, the Panel decides: not to amend the IPCC Procedures; not to preclude a different approach in the future; and to include the Note by the Task Group on Procedures on IPCC Anonymous Expert Review: Past experiences and arguments in favor or against (Appendix 3 of IPCC-XXXIV/Doc. 9) in an annex to the Report of IPCC-34.
On the process for the SPM approval, the Panel specifies the process for governments submitting written comments prior to the plenary approval session.
GOVERNANCE AND MANAGEMENT: This item (IPCC-XXXIV/Doc. 19) was taken up in the opening plenary on Friday. IPCC Chair Pachauri explained that both Co-Chairs of the Task Group on Governance and Management, David Warrilow (UK) and Taha Zatari (Saudi Arabia), were unable to come to Kampala, and that Task Group Co-Chair Warrilow suggested postponing the consideration of the matter until IPCC-35 and proposed holding IPCC-35 in the middle of 2012 rather than in the second half of the year. The UK explained that this would provide for a prompt response to the IAC recommendations and allow moving forward with the AR5. The UK also proposed that if holding an earlier session is not possible, two sessions could be held next year instead of one. Several countries highlighted that an earlier meeting should not coincide with preparatory meetings for the United Nations Conference on Sustainable Development (Rio+20) or the Conference itself.

Delegates agreed to postpone the consideration of the item until IPCC-35.

COMMUNICATIONS STRATEGY: This item (IPCC-XXXIV/Doc. 20) was addressed in plenary on Friday. Secretary Christ recalled that IPCC-33 agreed on guidance on a communications strategy and requested the Secretariat to elaborate on the strategy according to that guidance. She noted delays with hiring a senior communications specialist who will not be on board for several months and in this context explained that the Secretariat asked its long-term consultant, Charlie Methven, to help prepare the draft communications strategy in order to respond to the plenary’s request.

Methven then elaborated on the main points of the proposed strategy. Highlighting the unique challenges the IPCC faces, he underlined that the future communications system should be a resource rather than a typical corporate structure. At the same time, he said, it should provide a central communication function and a stronger link between various elements of the IPCC, including the WGs and their TSUs. Noting the already existing ad hoc support on communications across WGs, Methven said these practices should be incorporated to make for a more accountable and coherent structure. He also mentioned that the proposed strategy is achievable within the current level of funding.

Chair Pachauri then requested guidance from the plenary on major pillars of the draft strategy.

Many, including New Zealand, the US, Austria and Japan, expressed deep concern about the delay in hiring a senior communications specialist, who should be involved in the development of the strategy. Chair Pachauri explained that the hiring process is conducted according to WMO procedures, but that an individual had been selected and discussions were now focused on a compensation package. He noted that this person cannot start immediately after accepting the offer, and that the selected candidate is not yet sufficiently familiar with the IPCC process to actively contribute to its communications strategy.

Referring to the unique nature of the IPCC, the US highlighted the important role of WG Co-Chairs in communicating relevant products and said that the proposed communications structure should not be independent from the WGs. He highlighted in this regard that a senior communications specialist should be facilitative in nature and expressed concern that the Executive Committee had had no interaction with candidates for this role. Pachauri explained it was difficult to engage all members of the Executive Committee and that some of them were involved in developing the draft communications strategy.

Austria suggested that the Panel prepare a letter to WMO highlighting the urgency of hiring a communications person for the IPCC. He also suggested there should be a role for governments in the communications strategy, especially when it comes to regional matters. Switzerland underlined the importance of scientific integrity in the communication of the IPCC’s work, which often means “sticking literally to what has been said.” Australia proposed that the strategy should be forward-looking and contain a clear set of communications objectives: what to communicate, to whom and how. Several delegates suggested the document be forwarded to the full Executive Committee and Bureau for discussion.

Pachauri concluded that the draft communications strategy would now be discussed by a small group comprising representatives of the WGs, TFI, Secretariat and consultant Methven before being forwarded to the Executive Committee, Bureau and eventually the plenary.

In the final plenary on Saturday, Belgium recalled its proposal to re-establish a Task Force on Outreach and Communications Strategy, noting that such a Task Force had existed but disappeared when Pachauri became Chair, and to collect written comments by governments to advance the issue. Chair Pachauri supported the proposal and suggested Belgium submit it in written form. In response to a request for clarification by IPCC Vice-Chair Jean-Pascal van Ypersele, Chair Pachauri confirmed agreement at the Executive Committee meeting to have one of the IPCC Vice-Chairs involved in the group in charge of formulating the communications strategy.

The UK proposed, and the Panel agreed, to circulate the new draft communications strategy for comments and revision before the next session. Chair Pachauri said the Executive Committee will come up with a timetable to do so.

MATTERS RELATED TO UNFCCC AND OTHER INTERNATIONAL BODIES

During the opening plenary session, Chair Pachauri informed the Panel that, in contrast to all previous occasions when the IPCC had addressed the UNFCCC COP in plenary, he had now been asked to present only at SBSTA in Durban. He emphasized that this was an issue of institutions, not of personalities. Many countries expressed their disappointment and underscored the importance of conveying the IPCC’s findings to the COP directly, possibly also at the high-level segment. South Africa noted the concerns expressed on the participation of the IPCC at Durban and assured that the matter would receive proper attention from the incoming COP Presidency.

A drafting group prepared a letter to the UNFCCC, which was distributed to the Panel for approval. The letter, addressed to the UNFCCC Executive Secretary, expressed the Panel’s disappointment and noted the inappropriateness of the decision, underscoring the strategic importance of having the IPCC address the UNFCCC at the COP level as has been the case since the first COP. The letter called for conveying the message to the current and upcoming COP Presidencies. The US, Saudi Arabia and New Zealand called for reflecting on the wisdom of this mode of communication and proposed Chair Pachauri speak again informally to the UNFCCC Executive Secretary on this matter.

On Saturday morning, Chair Pachauri informed the Panel that, after further communication, the UNFCCC Executive Secretary had written to say that she had consulted with the South African delegation and that, although the opening session of UNFCCC COP 17 will be more of a ceremonial nature, the IPCC would be invited to address the COP on Wednesday, 30 November, when it takes up substantive matters.

RULES OF PROCEDURE FOR THE ELECTION OF THE IPCC BUREAU AND ANY TASK FORCE BUREAU

In plenary on Saturday, Secretary Christ invited the Panel to provide guidance on how provisions arising from the review of IPCC processes and procedures at IPCC-33 and 34 are to be reflected in the revision to Appendix C to the Principles Governing IPCC Work: Rules of Procedure for the Election of the IPCC Bureau and Any Task Force Bureau (IPCC-XXXIV/Doc. 7). New Zealand, with Malaysia and Australia, noted that there was no representative from Region V (South-West Pacific) on the WGIII Bureau, and that the revised text leaves open the possibility that someone from Region V is not on the WGIII Bureau. Australia also highlighted that Region V does not have representation on the Executive Committee and said that these issues should be a high priority for IPCC-36. Secretary Christ said that the Secretariat would distribute a text to governments taking into consideration suggestions from IPCC-33 and 34, and would make this a high priority agenda item for IPCC-36.

IPCC PROGRAMME AND BUDGET AND FINANCIAL PROCEDURES FOR THE IPCC

During Friday’s opening plenary session, Secretary Christ gave an overview of issues related to the IPCC Trust Fund Programme and Budget (IPCC-XXXIV/Doc. 3, Rev.1) and the adoption of the revised “Appendix B to the Principles Governing IPCC Work: Financial procedures for the IPCC” (IPCC-XXXIV/Doc. 4, Corr. 1). She noted the need to address the greater cost of the publication and translation of the SRREN and an additional expert meeting on wetlands by TGICA, and urged resolution on the revised Appendix B in order to allow auditing of IPCC accounts.

The Financial Task Team, co-chaired by IPCC Vice-Chair Ismail A.R. El Gizouli (Sudan) and Nicolas Beriot (France), met to address these issues, convening twice on Friday. On Saturday morning, Co-Chair Beriot presented the deliberations of the Task Team to plenary, noting that the meetings had been well attended. He highlighted changes made to Appendix B, including the addition of a paragraph on the Financial Task Team and the revision of a paragraph that grants authority to the Secretariat to adjust allocations in the event that the IPCC Trust Fund is less than the approved budget. On Appendix B, the WMO and EU queried the implications of the IPCC Trust Fund being administered under International Public Sector Accounting Standards. Secretary Christ clarified that the text was drafted with the WMO legal counsel, and expressed hope that in negotiating future agreements with the EU the various financial requirements will be reconciled.

Co-Chair Beriot highlighted two other Financial Task Team recommendations to the Panel: simplifying the language on procedural matters in the revised Appendix B no later than IPCC-37, and allowing greater flexibility in financing travel arrangements for experts or members of the Bureau from developing countries. The UK and Austria recommended adding a second plenary session next year in order to have enough time to respond to the IAC Review; however, after further discussion, the Panel agreed that a four-day plenary session would be preferable to two two-day plenary sessions because of both time and resource constraints. New Zealand also suggested that teleconferences could be used for preparation meetings prior to the next IPCC session.

Final Decision: In its decision, the Panel, inter alia:

approves the modified 2011 budget with respect to cost-related increases in the translation and publication of the SRREN;
approves the modified 2012 budget, which includes cost-related increases in the preparation of the 2013 IPCC Guidelines on Wetlands;
approves the revised “Appendix B to the Principles Governing IPCC Work: Financial Procedures for the IPCC” (IPCC-XXXIV/Doc. 4, Corr.1) with modifications, which include adding the Financial Task Team and granting authority to the Secretariat to make adjustments to allocations if there is a budget shortfall;
requests the Secretariat simplify language in the revised Appendix B document to improve clarity and readability no later than IPCC-37;
notes the forecast budget for 2013 and the indicative budgets for 2014 and 2015;
urges governments from developed countries to continue providing financial support for travel of experts to IPCC meetings;
requests that countries maintain their contributions in 2011 and 2012, and invites governments that may be able to do so to increase their level of contributions to the IPCC Trust Fund, or to contribute if they have not yet done so; and
endorses the expression of concern regarding the imposition of travel plans and arrangements on some experts or members of the Bureau from developing countries, with little regard for the particular traveler's constraints and commitments, and requests that this concern be relayed to the WMO Secretary-General.

PROGRESS REPORTS

AR5, PROGRESS REPORTS OF WGs I, II AND III: The WG Co-Chairs presented on progress since IPCC-33. WGII Co-Chair Vicente Barros (Argentina) highlighted a range of on-going expert, regional expert and lead author meetings, and Head of WGII TSU Kristie Ebi discussed the draft chapter writing schedule (IPCC-XXXIV/Doc. 10).

Head of WGIII TSU Jan Minx highlighted a range of expert and lead author meetings, and noted changes to the WGIII AR5 schedule and the writing process, which include a review of cross-chapter consistency and a policy to remove inactive authors (IPCC-XXXIV/Doc. 18, Rev.1).

WGI Co-Chair Thomas Stocker discussed a variety of expert meetings, including a Joint Expert Meeting in Lima, Peru, on Geoengineering in June 2011; a second WGI Lead Author meeting held in Brest, France in July 2011, which engaged primarily with cross-chapter issues; and a third Lead Author WGI meeting to be held in Marrakech, Morocco in April 2012. Stocker noted that on 16 December 2011 the First Order Draft of the WGI contribution to the AR5 will become available for an eight-week expert review (IPCC-XXXIV/Doc. 14).

TASK GROUP ON DATA AND SCENARIO SUPPORT FOR IMPACT AND CLIMATE ANALYSIS (TGICA): Due to the absence of TGICA representatives at the meeting, Chair Pachauri referred the plenary to the report of the Task Group (IPCC-XXXIV/Doc. 13).

TASK FORCE ON NATIONAL GREENHOUSE GAS INVENTORIES: TFB Co-Chair Thelma Krug (Brazil) reviewed progress on the 2013 Supplement to the 2006 IPCC Guidelines for National Greenhouse Gas Inventories: Wetlands (2013 Wetlands Supplement) work programme (IPCC-XXXIV/Doc. 12), and noted that a recent Lead Author meeting in Japan identified the scope and coverage of each chapter and addressed several cross-cutting and interacting issues. A Zero Order Draft is expected to be ready for the first science meeting next year. Co-Chair Krug also highlighted ongoing expert meetings and the success of an open symposium hosted in Japan on 22 August 2011, which aimed to explain the purpose and achievement of the TFI to the public.

SRREN: Head of WGIII TSU Jan Minx introduced this issue (IPCC-XXXIV/Doc. 17), noting the outreach activities and publication process timeline.

CROSS-CUTTING THEMES: IPCC Vice-Chair Hoesung Lee (Republic of Korea) discussed the coordination of cross-cutting themes for the AR5 SYR, highlighting that a questionnaire has been prepared and will be sent to the WGs to gain input into how the IPCC Vice-Chairs should best facilitate this process.

IPCC SCHOLARSHIP PROGRAMME: Secretary Christ updated the plenary on progress with the IPCC Scholarship Programme (IPCC-XXXIV/Doc. 16), noting that a total of nine students and researchers from developing countries had been awarded scholarships for the period 2011-2012. She said these included a postgraduate student from Uganda, Jamiat Nanteza, who would be working on climate-related disaster management issues. Secretary Christ stressed that the Secretariat does not have sufficient capacity to continue fundraising activities, as there are no specific funds allocated for that work. She said the Secretariat has been in contact with the UN Foundation, which can conduct fundraising in the US, but noted there would be charges involved.

Chair Pachauri underlined that the Programme had been launched with great success, highlighting the many applications from the least developed countries, and said guidance is needed from the plenary on how to keep the Programme going. He said that, given the number of applications, it would be desirable to award at least 40 to 50 scholarships. The US expressed caution regarding this suggestion, as it might require a big commitment from the IPCC leadership and Secretariat, noted that this might also influence how the IPCC is perceived as an assessment body, and recalled that when the Programme was launched there was no expectation it would become a major workstream. Belgium expressed interest in the opinion of the Programme's Board of Trustees.

Chair Pachauri suggested that this matter be discussed at the Bureau meeting, which would prepare a paper, to be presented at the next IPCC session, setting out options for the further direction of the Programme and ways to reduce the workload burden on the Secretariat.

TIME AND PLACE OF THE NEXT SESSION

Croatia presented its offer to host the next session in Dubrovnik or elsewhere on the Adriatic Coast at a time to be determined.

Recalling the untimely death of SBSTA Chair Mama Konate, IPCC Vice-Chair van Ypersele called for always scheduling a break between any WG or approval session and a plenary session held back-to-back, so as to respect, insofar as possible, participants' health and wellbeing.

OTHER BUSINESS AND CLOSING OF THE SESSION

Secretary Christ presented on the outcome of the 16th WMO Congress as it related to the IPCC. She also noted that WMO had not yet decided on the request by IPCC-32 that WMO not convert its in-cash contribution into an in-kind contribution.

Also, Secretary Christ drew attention to a notification from UN Headquarters that the Republic of South Sudan was admitted as a new Member State by the UN General Assembly on 14 July 2011, and that the official name of the Libyan Arab Jamahiriya had been changed to Libya (IPCC-XXXIV/INF.2). The Panel agreed to reflect these changes in the necessary amendments. South Sudan has therefore become a new member of the IPCC, bringing the total of its members to 195 countries.

In his final remarks, Chair Pachauri thanked the government and people of Uganda for their hospitality and excellent organization of the meeting. The session closed at 4:45 pm with a dance performance celebrating Africa by Francis Hayes, conference officer, and local organizers.

A BRIEF ANALYSIS OF IPCC-34

THE CHALLENGE OF CHANGE

It was just a little over a year ago, in October 2010 in Busan, Republic of Korea, when Sir Peter Williams, Vice-President of The Royal Society, UK, presented the major findings and recommendations of the InterAcademy Council (IAC) review of the IPCC processes and procedures. The review was called for by UN Secretary-General Ban Ki-moon and IPCC Chair Rajendra Pachauri to address major criticisms of the IPCC’s work as a result of the discovery of a small number of serious factual errors in the Fourth Assessment Report, allegations of conflicts of interest among those involved in the assessment, and failure to respond adequately to these charges. The IAC report contained recommendations on reforming IPCC’s management and governance, communications strategy, and processes and procedures.

Since then, the IPCC has been busy addressing these recommendations, enacting changes that it hopes will make it more robust and better able to weather the intense public scrutiny and attacks by climate change skeptics. At the same time, the IPCC has had to focus on its work on the Fifth Assessment Report (AR5), the cornerstone of its activities. With the IPCC midway through the AR5 cycle, these changes stand to have an impact on the report. This is therefore a useful moment to begin assessing how far the decisions taken so far have led to substantive change in the IPCC. This brief analysis addresses that question.

IMPLEMENTING CHANGE

IPCC-34 came at a time when the most difficult decisions in response to the IAC review had already been taken or were well advanced. A variety of organizational, procedural, governance and policy changes were made prior to the Kampala meeting. These include the establishment of an Executive Committee to provide management oversight and address emerging issues on behalf of the Panel between sessions; limiting the terms of office for key Bureau positions; the development of a conflict of interest policy; and increasing transparency in its procedures, including clarifying the selection of participants at expert meetings, authors and others. Other critical issues that have been tackled include a clear policy for correcting errors, strengthening of the review process, and improved guidance for authors, including on the evaluation of evidence and the consistent treatment of uncertainty.

This session in Kampala concentrated on completing revisions to the Procedures for the IPCC reports. As a result, the Panel finalized its work on the production and treatment of guidance material, the selection of participants to IPCC workshops and expert meetings, matters related to the transparency, quality and efficiency of the review process, anonymous expert review, and approval sessions for Summaries for Policy Makers.

Perhaps most notably, at this session the IPCC agreed on the Implementation Procedures for the Conflict of Interest Policy, which had been developed at IPCC-33. The agreement represented a source of much satisfaction among participants, who felt that the decision taken here allows for prompt implementation and adequate oversight by those who are most interested in maintaining the integrity of the IPCC, namely the Panel's Executive Committee. Importantly, implementation of the new comprehensive Conflict of Interest Policy will contribute to increased transparency of the IPCC process, just what the Panel needs to ensure the credibility of its findings.

To the dismay of many, however, the development and implementation of a comprehensive communications strategy is still incomplete. The IPCC has long acknowledged that its outreach and communication are critically deficient, and attempts to address this have been made in the past, such as the first IPCC communications strategy in 2005-2006, which included the recruitment of a communications officer. The IAC review reinforced this criticism, finding that communication was a major weakness, and recommended the development of a communications strategy, including guidelines on who should speak on behalf of the IPCC. More than a year later, however, the IPCC still has no strategy in place and has not appointed a senior communications officer. In Kampala, the draft communications strategy was met with wide discontent. Many felt a senior communications professional should have been involved in preparing the strategy; others were concerned that the draft had not been discussed by the Executive Committee prior to its presentation before the IPCC. With both the strategy and the appointment delayed, the lack of progress on communications elicited much frustration among participants in Kampala and across the broader climate change community, and remains a critical gap in the IPCC's response to the IAC review.

ASSESSING THE QUALITY OF CHANGE

Although it is too early to judge the transformational extent of the changes introduced in the IPCC as a result of the IAC review, it is useful to note some signs of the effects of these changes.

The most evident and welcome changes relate to increased transparency in the IPCC processes and procedures. There is more transparency and consistency over different stages of the assessment process, including the preparation, review, and endorsement of IPCC reports. There is a policy in place to address real or potential conflict of interest among all participants. There is even a better understanding of how the Panel is run, including its management structure, and roles and responsibilities. All these are critically important.

Changes affecting the quality of management and governance are, however, more difficult to see and assess. Having good rules is the start, but adherence and practice is what makes a difference. The fact that the Executive Committee was not consulted or involved in the recruitment of the senior communications professional came as a surprise to many.

One question was how the changes resulting from the IAC review would affect progress on the AR5. In many ways, the IAC review came at a convenient time for the IPCC, which had just completed the Fourth Assessment Report and had the bulk of its work concentrated in the Working Groups (WGs) as they initiated the AR5. In fact, many of the changes implemented had already been initiated by the WGs, including a conflict of interest policy, guidance on the treatment of uncertainties and other procedural guidance. Even the Executive Committee is a formalization of the previous Executive Team. As to deliverables, the approval within six months of two timely Special Reports, on Renewable Energy Sources and Climate Change Mitigation and on Managing the Risks of Extreme Events and Disasters to Advance Adaptation (SREX), is evidence that the IAC review has not distracted the IPCC from its core business.

As one participant noted, the IAC review was not meant to elicit a revolution but an evolution. The significance of the IPCC reforms will only become apparent as new challenges arise. Whether the reforms the IPCC has already undertaken will actually make the Panel stronger in the face of increased public scrutiny remains to be seen.

Unfortunately, the lack of a comprehensive communications strategy stands in the way of making the Panel's reforms and its work evident to the outside world. Communication of the complex science of climate extremes and impacts presented in the SREX could already have benefited from such a strategy. That is why most participants see rapid progress on a communications strategy as vital to the successful implementation of the IPCC changes. While progress on the AR5 is going well, the impact of the IPCC's findings, and consequently its relevance, will be significantly influenced by how they are communicated to the outside world.

UPCOMING MEETINGS

Joint 9th Meeting of the Vienna Convention COP and 23rd Montreal Protocol MOP: The 23rd session of the Meeting of the Parties to the Montreal Protocol on Substances that Deplete the Ozone Layer (MOP 23) and the ninth meeting of the Conference of the Parties to the Vienna Convention for the Protection of the Ozone Layer (COP 9) are taking place in Bali. dates: 21-25 November 2011 location: Bali, Indonesia contact: Ozone Secretariat phone: +254-20-762-3851 fax: +254-20-762-4691 email: ozoneinfo@unep.org www: http://ozone.unep.org

UNFCCC COP 17 and COP/MOP 7: The 17th session of the UNFCCC Conference of the Parties (COP 17) and the 7th session of the Meeting of the Parties (MOP 7) to the Kyoto Protocol will take place in Durban, South Africa. The 35th session of the Subsidiary Body for Implementation (SBI), the 35th session of the Subsidiary Body for Scientific and Technological Advice (SBSTA), the Ad Hoc Working Group on Further Commitments for Annex I Parties under the Kyoto Protocol (AWG-KP), and the Ad Hoc Working Group on Long-term Cooperative Action under the Convention (AWG-LCA) will also meet. dates: 28 November – 9 December 2011 location: Durban, South Africa contact: UNFCCC Secretariat phone: +49-228-815-1000 fax: +49-228-815-1999 email: secretariat@unfccc.int www: http://unfccc.int/ and http://www.cop17durban.com

Eye on Earth Summit: The Eye on Earth Summit: Pursuing a Vision is being organized under the theme "Dynamic system to keep the world environmental situation under review." This event will launch the global environmental information network (EIN) strengthening initiative and address major policy and technical issues. dates: 12-15 December 2011 location: Abu Dhabi, United Arab Emirates contact: Marije Heurter, Eye on Earth Event Coordinator phone: +971-2-693-4516 email: Marije.heurter@ead.ae or Eoecommunity@ead.ae www: http://www.eyeonearthsummit.org/

Fifth World Future Energy Summit: The fifth World Future Energy Summit will concentrate on energy innovation in policy implementation, technology development, finance and investment approaches, and existing and upcoming projects. The Summit will seek to set the scene for future energy discussions in 2012, with leading international speakers from government, industry, academia and finance sharing insights, expertise and cutting-edge advances in technology. dates: 16-19 January 2012 location: Abu Dhabi, United Arab Emirates contact: Naji El Haddad phone: +971-2-409-0499 email: naji.haddad@reedexpo.ae www: http://www.worldfutureenergysummit.com/

IPCC WGIII AR5 Second Expert Meeting on Scenarios: Scenarios have a key role in the WGIII contribution to the AR5 as an integrative element. Authors from all relevant chapters will meet to coordinate and integrate the scenario activities across chapters. dates: 17-18 March 2012 location: Wellington, New Zealand contact: IPCC Secretariat phone: +41-22-730-8208 fax: +41-22-730-8025 email: IPCC-Sec@wmo.int www: http://www.ipcc.ch/

UN Conference on Sustainable Development: The UNCSD (or Rio+20) will mark the 20th anniversary of the UN Conference on Environment and Development, which convened in Rio de Janeiro, Brazil. dates: 20-22 June 2012 location: Rio de Janeiro, Brazil contact: UNCSD Secretariat email: uncsd2012@un.org www: http://www.uncsd2012.org/

IPCC WGIII AR5 Expert Meeting for Businesses and NGOs: Based on the positive experience gained during the SRREN, WGIII will organize an Expert Meeting for Businesses and NGOs. The meeting aims to gather structured input from these communities for consideration by the AR5 authors, and will take place during the Expert Review Period (22 June – 20 August 2012). date: to be determined location: to be determined contact: IPCC Secretariat phone: +41-22-730-8208 fax: +41-22-730-8025 email: IPCC-Sec@wmo.int www: http://www.ipcc.ch/

IPCC 35th Session: The 35th session of the IPCC will consider pending issues arising from the consideration of the IAC Review of the IPCC processes and procedures, namely those on governance and management, and communications strategy. dates: to be determined location: Croatia contact: IPCC Secretariat phone: +41-22-730-8208 fax: +41-22-730-8025 email: IPCC-Sec@wmo.int www: http://www.ipcc.ch/

Roadmap for a global climate agreement (Correio Braziliense)

JC e-mail 4393, 28 November 2011.

By Connie Hedegaard

When ministers and negotiators from around the world gather in Durban, South Africa, starting today for the UN Climate Conference, it will be a decisive moment for advancing the international fight against climate change.

Some will ask: could we not wait a little and deal with the climate problem after we have resolved the debt crisis in Europe, once growth has resumed? The answer is no. The floods in Thailand and the droughts in Texas and the Horn of Africa are only some of the most recent warnings that the climate problem has lost none of its urgency, because climate change is getting worse. The recent World Energy Outlook report from the International Energy Agency (IEA) was yet another alarm signal: time is running out, and the bill will multiply frighteningly if we do not act now.

So what can we achieve in Durban? Media commentary leaves the impression that there is only one measure of success: getting developed countries to sign up to a second commitment period of the Kyoto Protocol after the first ends in 2012.

Let us be clear: the EU supports the Kyoto Protocol. We have based our legislation on its principles; we are the region of the world with the most ambitious target under Kyoto, and we are meeting it. In fact, we are on track to exceed our target.

But the Kyoto Protocol rests on a sharp distinction between developed and developing countries and requires action only from the former. Has the evolution of the world economy over the past two decades not increasingly blurred that distinction?

Consider Singapore and South Korea. They are strong export economies, with competitive industries and impressive rankings in the Human Development Index published by the UN Development Programme. Yet under the Kyoto Protocol they are listed as developing countries. Or consider a dynamic emerging economy such as Brazil. It has thriving industries, immense natural resources and a per capita income notably higher than those of Bulgaria or Romania, for example.

Pollution patterns are likewise calling into question the distinction between developed and developing countries. According to the IEA, the current increase in CO2 pollution is driven mainly by coal-dependent emerging economies, and that trend will only intensify. By 2035, 90% of the growth in energy demand will come from countries outside the OECD. China's energy-related emissions, for example, have tripled since 1990, making it the world's largest emitter. On average, a Chinese citizen today emits more than, say, a Portuguese, a Swede or a Hungarian. The world therefore simply cannot fight climate change effectively without the involvement of China and other emerging economies.

Another problem is that the United States did not sign up to Kyoto, and never will, while Japan, Russia and Canada have said clearly that they do not intend to join a second commitment period. In short, this means that if the European Union signed up to a second Kyoto period, together with a few other developed economies, it would cover at most 16% of global emissions, whereas the first Kyoto period covered about a third. How could that be called a victory for the climate? In other words, that approach has no chance of keeping the temperature increase below 2°C (3.6°F), which the international community has recognized must be our common goal.

To have a chance of reaching that goal, what we really need is a framework for global action by all major economies, in both the developed and the developing world: a framework that truly reflects the world of the 21st century, in which all commitments carry the same legal weight. The European Union is open to a second Kyoto period, on condition that Kyoto's environmental integrity is improved and that Durban approves a clear roadmap and timetable for concluding that framework in the next few years and applying it by 2020 at the latest.

It is my hope that all countries will show the will and political leadership needed to launch such a process in Durban. In Copenhagen, leaders pledged to stay below 2°C. The time has come for them to prove those were not empty words.

Connie Hedegaard is the European Commissioner for Climate Action.