Tag archive: Previsão (Forecast)

Conservatives’ Trust in Science at All-Time Low (Slate/L.A.Times)

A new study suggests a growing partisan divide as science plays an increasing role in policy debates. Posted Thursday, March 29, 2012, at 1:29 PM ET

A new report suggests the number of conservatives who trust science is at an all-time low. Photo by Aude Guerrucci-Pool/Getty Images.

This may explain some of the rhetoric we’ve been hearing in GOP stump speeches of late: The number of conservatives who say they have a “great deal” of trust in science has fallen to 35 percent, down 28 points from the mid-1970s, according to a new academic paper.

The study, which was published Thursday in the American Sociological Review, found that liberal and moderate attitudes toward the topic have remained mostly unchanged since national pollsters first began posing the question in 1974, back when roughly half of all liberals and conservatives expressed significant trust in science.

The peer-reviewed research paper explains: “These results are quite profound because they imply that conservative discontent with science was not attributable to the uneducated but to rising distrust among educated conservatives.”

The man behind the study, UNC Chapel Hill’s Gordon Gauchat, says the change comes as conservatives have rebelled against the so-called “elite.”

“It kind of began with the loss of Barry Goldwater and the construction of Fox News and all these [conservative] think tanks. The perception among conservatives is that they’re at a disadvantage, a minority,” Gauchat explained in an interview with U.S. News. “It’s not surprising that the conservative subculture would challenge what’s viewed as the dominant knowledge production groups in society—science and the media.”

The sociologist suggested that the shift is also likely tied to science’s changing role in the national dialogue. In the middle of the 20th century, science was tied closely with NASA and the Department of Defense, but now it more frequently comes up when the conversation shifts to the environment and government regulations.

“Science has become autonomous from the government—it develops knowledge that helps regulate policy, and in the case of the EPA, it develops policy,” he said. “Science is charged with what religion used to be charged with—answering questions about who we are and what we came from, what the world is about. We’re using it in American society to weigh in on political debates, and people are coming down on a specific side.”

You can read more of the interview at U.S. News, a more detailed recap of the study at the Los Angeles Times, or check out the full paper here.

Conservatives’ trust in science has declined sharply

Since 1974, when conservatives had the highest trust in science, their confidence has dropped precipitously, an American Sociological Review study concludes.

By John Hoeffel – Los Angeles Times, March 29, 2012
As the Republican presidential race has shown, the conservatives who dominate the primaries are deeply skeptical of science — making Newt Gingrich, for one, regret he ever settled onto a couch with Nancy Pelosi to chat about global warming.

A study released Thursday in the American Sociological Review concludes that trust in science among conservatives and frequent churchgoers has declined precipitously since 1974, when a national survey first asked people how much confidence they had in the scientific community. At that time, conservatives had the highest level of trust in scientists.

Confidence in scientists has declined the most among the most educated conservatives, the peer-reviewed research paper found, concluding: “These results are quite profound because they imply that conservative discontent with science was not attributable to the uneducated but to rising distrust among educated conservatives.”

“That’s a surprising finding,” said the report’s author, Gordon Gauchat, in an interview. He has a doctorate in sociology and is a postdoctoral fellow at the University of North Carolina at Chapel Hill.

To highlight the dramatic impact conservative views of science have had on public opinion, Gauchat pointed to results from Gallup, which found in 2010 that just 30% of conservatives believed the Earth was warming as a result of greenhouse gases, versus 50% two years earlier. In contrast, the poll showed almost no change in the opinion of liberals, with 74% believing in global warming in 2010 versus 72% in 2008.

Gauchat suggested that the most educated conservatives are most acquainted with views that question the credibility of scientists and their conclusions. “I think those people are most fluent with the conservative ideology,” he said. “They have stronger ideological dispositions than people who are less educated.”

Chris Mooney, who wrote “The Republican War on Science,” which Gauchat cites, agreed. “If you think of the reasons behind this as nature versus nurture, all this would be nurture, that it was the product of the conservative movement,” he said. “I think being educated is a proxy for people paying attention to politics, and when they do, they tune in to Fox News and blogs.”

Gauchat also noted the conservative movement had expanded substantially in power and influence, particularly during the presidencies of Ronald Reagan and George W. Bush, creating an extensive apparatus of think tanks and media outlets. “There’s a whole enterprise,” he said.

Science has also increasingly come under fire, Gauchat said, because its cultural authority and its impact on government have grown. For years, he said, the role science played was mostly behind the scenes, creating better military equipment and sending rockets into space.

But with the emergence of the Environmental Protection Agency, for example, scientists began to play a crucial and visible role in developing regulations.

Jim DiPeso, policy director of Republicans for Environmental Protection, has been trying to move his party to the center on issues such as climate change, but he said many Republicans were wary of science because they believed it was “serving the agenda of the regulatory state.”

“There has been more and more resistance to accepting scientific conclusions,” he said. “There is concern about what those conclusions could lead to in terms of bigger government and more onerous regulation.”

The study also found that Americans with moderate political views have long been the most distrustful of scientists, but that conservatives are now likely to outstrip them.

Moderates are typically less educated than either liberals or conservatives, Gauchat said. “These folks are just generally alienated from science,” he said, describing them as the “least engaged and least knowledgeable about basic scientific facts.”

The study was based on results from the General Social Survey, administered between 1974 and 2010 by the National Opinion Research Center at the University of Chicago.

Gauchat, who has been studying public attitudes toward science for about eight years, has applied for a National Science Foundation grant to investigate why trust in science has waned. He plans to ask a battery of questions, including some focused on scientific controversies, such as those over vaccines and genetically modified foods, to try to understand what makes conservatives and moderates so distrustful.

“It’s not one simple thing,” he said.

john.hoeffel@latimes.com

Neela Banerjee in the Washington bureau contributed to this report.

Why The Future Is Better Than You Think (Reason.com)

Sharif Christopher Matar | March 15, 2012

Can a Masai Warrior in Africa today communicate better than Ronald Reagan could? If he’s on a cell phone, Peter Diamandis says he can.

Peter Diamandis is the founder and chairman of the X Prize Foundation, which offers big cash prizes “to bring about radical breakthroughs for the benefit of humanity.” Reason’s Tim Cavanaugh sat down to talk with Peter about his new book Abundance and why he thinks we live in an “incredible time,” even if no one realizes it. Peter thinks that powerful human forces, combined with technological advancements, are transforming the world for the better.

“The challenge is that the rate of innovation is so fast…” Peter says, “the government can’t keep up with it.” If the government tries to play “catch up” with regulations and policy, the technology will just go overseas. In certain areas, he says, “food, water, housing, health, education is getting better and better.” Peter “hopes we are not going to be in a situation where entrenched interests are preventing the consumer from having better health care.”

Filmed by Sharif Matar and Tracy Oppenheimer. Edited by Sharif Matar

Americans Listening to Politicians, Not Climate Scientists (Ars Technica/Wired)

By Scott K. Johnson, Ars Technica
February 27, 2012

US public opinion about climate change has been riding a roller coaster over the past decade. After signs of growing acceptance and emphasis around 2006 and 2007, a precipitous decline brought us back to where we started, with fully a quarter of the public not even thinking that the planet has warmed up. It’s not shocking that concerns about climate change would take a back seat to the economic recession, but that doesn’t explain why some are skeptical that global warming is even real.

Since economic turmoil does not extend to past temperature measurements, it seems clear that public acceptance of the data depends at least partly on something other than the data itself. So the natural question is: what’s driving public opinion? Why the big shifts? The answer to that question may hold the key to the US’ response to the changing climate.

A recent study published in Climatic Change evaluates the impact of several potential opinion drivers: extreme weather events, public access to scientific information, media coverage, advocacy efforts, and the influence of political leaders. These are compared to a compilation of 74 surveys performed by six different organizations. The polls took place between 2002 and 2010, and provide a total of 84,000 responses. The researchers used all the questions that asked respondents to rate their concern about climate change to calculate a “climate change threat index” that could be tracked through time.

For extreme weather events, the researchers used NOAA’s Climate Extremes Index, which includes things like unusually high temperatures and precipitation events, as well as severe droughts. To evaluate public access to scientific information, they tracked the number of climate change papers published in Science, major assessments like the 2007 IPCC report, and climate change articles published in popular science magazines.

Similarly, media coverage was tracked with a simple count of stories appearing on broadcast evening news shows and in several leading periodicals. Advocacy was measured using a number of “major environmental” and “conservative magazines.” In addition, they captured the influence of Al Gore’s An Inconvenient Truth (a favorite target of climate contrarians) using the number of times it was mentioned in the New York Times.

Finally, they counted up congressional press releases, hearings, and votes on bills related to climate change. For comparison, they also looked at the influence of unemployment, GDP, oil prices, and the number of deaths associated with the wars in Iraq and Afghanistan.

The researchers compared each time series to their climate change threat index. They found no statistically significant correlation with extreme weather events, papers in Science (hardly shocking — when was the last time you found Science in the waiting room at the dentist’s?), or oil prices. There was a minor correlation with major scientific assessments.

While articles in popular science magazines and advocacy efforts (especially An Inconvenient Truth) appeared to have an effect, the impact of news media coverage turned out to stem from its transmission of statements by political leaders, what the researchers refer to as “elite cues.” That’s where the meat of this story lies: those elite cues were the most significant driver of public opinion, followed by economic factors.

The researchers note that around the time when public acceptance of climate change reached its peak, political bipartisanship on the subject also hit a high point. Republican Senator and (then) presidential candidate John McCain was pushing for climate legislation, and current presidential candidate Newt Gingrich filmed a commercial together with an unlikely partner — Democratic Congresswoman Nancy Pelosi — urging action.

And then things changed. The economy went pear-shaped and Republican rhetoric shifted into attack mode on climate science. Gingrich’s commercial with Pelosi offers one example — opposing candidates in the presidential race have used its mere existence as a weapon against him, and Gingrich has tried to distance himself, calling it “the dumbest thing I’ve done in the last four years.”

Flipping this around, it suggests that serious action on climate change depends on a healthy economy and bipartisan agreement among politicians. If that leaves you pondering a future connection between global warming legislation and icy conditions in hell, the cooperation in 2007 indicates it isn’t totally unthinkable.

In addition, recent polling has shown that acceptance of climate change is, once again, climbing among those who identify as moderate Republicans. It’s unclear how to interpret that in terms of this study’s conclusions. Is economic optimism having an impact, have Republican presidential candidates alienated moderates in the party, or is something totally different responsible?

While it’s certainly not surprising, it’s discouraging to see how little effect scientific outreach efforts and reports have had on public opinion. Even on simple questions like “Is there solid evidence that the Earth has warmed?” — it’s politicians that are driving public opinion, not scientists or the data they produce.

Image: Hurricane Ike in 2008. (NOAA)

The Inside Story on Climate Scientists Under Siege (Wired/The Guardian)

By Suzanne Goldenberg, The Guardian
February 17, 2012 |

It is almost possible to dismiss Michael Mann’s account of a vast conspiracy by the fossil fuel industry to harass scientists and befuddle the public. His story of that campaign, and his own journey from naive computer geek to battle-hardened climate ninja, seems overwrought, maybe even paranoid.

But now comes the unauthorized release of documents showing how a libertarian thinktank, the Heartland Institute, which has in the past been supported by Exxon, spent millions on lavish conferences attacking scientists and concocting projects to counter science teaching for kindergarteners.

Mann’s story of what he calls the climate wars, the fight by powerful entrenched interests to undermine and twist the science meant to guide government policy, starts to seem pretty much on the money. He’s telling it in a book out on March 6, The Hockey Stick and the Climate Wars: Dispatches From the Front Lines.

“They see scientists like me who are trying to communicate the potential dangers of continued fossil fuel burning to the public as a threat. That means we are subject to attacks, some of them quite personal, some of them dishonest,” Mann said in an interview conducted in and around State College, home of Pennsylvania State University, where he is a professor.

It’s a brilliantly sunny day, and the light snowfall of the evening before is rapidly melting.

Mann, who seems fairly relaxed, has just spoken to a full-capacity, and uniformly respectful and supportive crowd at the university.

It’s hard to square the surroundings with the description in the book of how an entire academic discipline has been made to feel under siege, but Mann insists that it is a given.

“It is now part of the job description if you are going to be a scientist working in a socially relevant area like human-caused climate change,” he said.

He should know. For most of his professional life he has been at the center of those wars, thanks to a paper he published with colleagues in the late 1990s showing a sharp upward movement in global temperatures in the last half of the 20th century. The graph became known as the “hockey stick”.

If the graph was the stick, then its publication made Mann the puck. Though other prominent scientists, such as Nasa’s James Hansen and more recently Texas Tech University’s Katharine Hayhoe, have also been targeted by contrarian bloggers and thinktanks demanding their institutions turn over their email record, it’s Mann who’s been the favorite target.

He has been regularly vilified on Fox News and contrarian blogs, and by Republican members of Congress. The attorney general of Virginia has been fighting in the courts to get access to Mann’s email from his earlier work at the University of Virginia. And then there is the high volume of hate mail, the threats to him and his family.

“A day doesn’t go by when I don’t have to fend off some attack, some specious criticism or personal attack,” he said. “Literally a day doesn’t go by where I don’t have to deal with some of the nastiness that comes out of a campaign that tries to discredit me, and thereby in the view of our detractors to discredit the entire science of climate change.”

By now he and other climate scientists have been in the trenches longer than the U.S. army has been in Afghanistan.

And Mann has proved a willing combatant. He has not gone so far as Hansen, who has been arrested at the White House protesting against tar sands oil and in West Virginia protesting against coal mining. But he spends a significant part of his working life now blogging and tweeting in his efforts to engage with the public – and fending off attacks.

On the eve of his talk at Penn State, a coal industry lobby group calling itself the Common Sense Movement/Secure Energy for America put up a Facebook page demanding the university disinvite their own professor from speaking, and denouncing Mann as a “disgraced academic” pursuing a radical environmental agenda. The university refused. Common Sense now appears to have dismantled the Facebook page.

But Mann’s attackers were merely regrouping. A hostile blogger published a link to Mann’s Amazon page, and his opponents swung into action, denouncing the book as a “fairy tale” and climate change as “the greatest scam in human history.”

It was not the life Mann envisaged when he began work on his post-graduate degree at Yale. All Mann knew then was that he wanted to work on big problems that resonated outside academia. At heart, he said, he was like one of the amiable nerds on the television show Big Bang Theory.

“At that time I wanted nothing more than just to bury my head in my computer and study data and write papers and write programs,” he said. “That is the way I was raised. That is the culture I came from.”

What happened instead was that the “hockey stick” graph, because it so clearly represented what had happened to the climate over the course of hundreds of years, itself became a proxy in the climate wars. (Mann’s reconstruction of temperatures over the last millennium itself used proxy records from tree rings and coral).

“I think because the hockey stick became an icon, it’s been subject to the fiercest of attacks really in the whole science of climate change,” he said.

The U.N.’s Intergovernmental Panel on Climate Change produced a poster-sized graph for the launch of its climate change report in 2001.

Those opposed to action on climate change began accusing Mann of overlooking important data or even manipulating the records. None of the allegations were ever found to have substance. The hockey stick would eventually be confirmed by more than 10 other studies.

Mann, like other scientists, was just not equipped to deal with the media barrage. “It took the scientific community some time I think to realize that the scientific community is in a street fight with climate change deniers and they are not playing by the rules of engagement of science. The scientific community needed some time to wake up to that.”

By 2005, when Hurricane Katrina drew Americans’ attention to the connection between climate change and coastal flooding, scientists were getting better at making their case to the public. George Bush, whose White House in 2003 deleted Mann’s hockey stick graph from an environmental report, began talking about the need for biofuels. Then Barack Obama was elected on a promise to save a planet in peril.

But as Mann lays out in the book, the campaign to discredit climate change continued to operate, largely below the radar until November 2009 when a huge cache of email from the University of East Anglia’s Climatic Research Unit was released online without authorization.

Right-wing media and bloggers used the emails to discredit an entire body of climate science. They got an extra boost when an embarrassing error about melting of Himalayan glaciers appeared in the U.N.’s IPCC report.

Mann now admits the climate community took far too long to realize the extent of the public relations debacle. Aside from the glacier error, the science remained sound. But Mann said now: “There may have been an overdue amount of complacency among many in the scientific community.”

Mann, who had been at the center of so many debates in America, was at the heart of the East Anglia emails battle too.

Though he has been cleared of any wrongdoing, Mann does not always come off well in those highly selective exchanges of email released by the hackers. In some of the correspondence with fellow scientists, he is abrupt, dismissive of some critics. In our time at State College, he mentions more than once how climate scientists are a “cantankerous” bunch. He has zero patience, for example, for the polite label “climate skeptic” for the network of bloggers and talking heads who try to discredit climate change.

“When it comes to climate change, true skepticism is two-sided. One-sided skepticism is no skepticism at all,” he said. “I will call people who deny the science deniers … I guess I won’t be deterred by the fact that they don’t like the use of that term and no doubt that just endears me to them further.”

“It’s frustrating of course because a lot of us would like to get past this nonsensical debate and on to the real debate to be had about what to do,” he said.

But he said there are compensations in the support he gets from the public. He moves over to his computer to show off a web page: I ❤ climate scientists. He’s one of three featured scientists. “It only takes one thoughtful email of support to offset a thousand thoughtless attacks,” Mann said.

And although there are bad days, he still seems to believe he is on the winning side.

Across America, this is the third successive year of weird weather. The U.S. department of agriculture has just revised its plant hardiness map, reflecting warming trends. That is going to reinforce scientists’ efforts to cut through the disinformation campaign, Mann said.

“I think increasingly the campaign to deny the reality of climate change is going to come up against that brick wall of the evidence being so plain to people whether they are hunters, fishermen, gardeners,” he said.

And if that doesn’t work then Mann is going to fight to convince them.

“Whether I like it or not I am out there on the battlefield,” he said. But he believes the experiences of the last decade have made him, and other scientists, far better fighters.

“Those of us who have had to go through this are battle-hardened and hopefully the better for it,” he said. “I think you are now going to see the scientific community almost uniformly fighting back against this assault on science. I don’t know what’s going to happen in the future, but I do know that my fellow scientists and I are very ready to engage in this battle.”

Video: James West, The Climate Desk

Original story at The Guardian.

Newly Discovered Space Rock Is Headed Toward Earth, Estimated Time of Arrival 2040 (POPSCI.com)

The UN is figuring out how to ward off a potential collision

By Clay Dillow
Posted 02.27.2012 at 1:34 pm

Earth, and the Near-Earth Objects that Threaten It ESA – P.Carril

All eyes are on the asteroid Apophis, but a new threat–just 460 feet wide–dominated the conversation at a recent meeting of the UN Action Team on near-Earth objects (NEOs). Known as 2011 AG5, the asteroid could well be on a collision course with Earth in 2040, and some are already calling on scientists to figure out how to deflect it.

Discovered early last year, 2011 AG5 is still somewhat of a mystery to astronomers, as they have a pretty good idea how big it is but have only been able to observe it for roughly half an orbit. That makes it difficult to project the object’s path over time–and to verify whether it may be a threat in 2040. Ideally, researchers would like to observe at least two full orbits before making projections about an NEO’s path, but that hasn’t stopped several in the astronomy community from fixing odds on an impact in 2040.

Specifically, those odds are currently at 1 in 625 for an impact on Feb. 5, 2040. But like most odds, these are fluid. From 2013 to 2016, the asteroid will be observable from the ground, and that will give NEO watchers a better idea of its orbit and future trajectory. If those observations don’t vastly diminish the odds of an impact, there should still be time to do something about it before its 2023 keyhole pass.

Like Apophis, which may or may not impact Earth in 2036, 2011 AG5 has a keyhole–a region in space near Earth through which it would travel if indeed it is going to impact us on its next pass. It will make its keyhole pass on its approach near Earth in February 2023, when it comes within just 0.02 astronomical units of Earth (that’s roughly 1.86 million miles). NASA’s Jet Propulsion Lab estimates 2011 AG5’s keyhole is about 62 miles wide–not big at all by astronomical standards, but bigger than Apophis’s.
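The figures quoted above are easy to sanity-check. A quick sketch, assuming the standard conversion of roughly 92.96 million miles per astronomical unit:

```python
# Sanity-check of the distances and odds quoted in the article.
# Assumes 1 AU ~ 92.956 million miles (standard conversion factor).
MILES_PER_AU = 92.956e6

keyhole_pass_au = 0.02                 # 2011 AG5's closest approach, Feb. 2023
miles = keyhole_pass_au * MILES_PER_AU
print(f"0.02 AU ~ {miles / 1e6:.2f} million miles")   # ~1.86 million miles

impact_odds = 1 / 625                  # quoted odds of a Feb. 5, 2040 impact
print(f"1 in 625 = a {impact_odds:.2%} chance of impact")  # 0.16%
```

Put that way, the quoted odds amount to a fraction of a percent, which is why further ground observations in 2013-2016 are expected to either sharpen or dismiss the threat.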

If 2011 AG5 does look like it is going to pass through that keyhole after the 2013-2016 observations, scientists will have a few years to figure out how to alter its orbit and push it outside of the keyhole in 2023, thus averting disaster 17 years later. Such a deflection mission could be good practice. Apophis will make a run at its keyhole in 2029.

 

O planeta doente (culturaebarbarie.org)

por Guy Debord

A “poluição” está hoje na moda, exatamente da mesma maneira que a revolução: ela se apodera de toda a vida da sociedade e é representada ilusoriamente no espetáculo. Ela é tagarelice tediosa numa pletora de escritos e de discursos errôneos e mistificadores, e, nos fatos, ela pega todo mundo pelo pescoço. Ela se expõe em todo lugar enquanto ideologia e ganha terreno enquanto processo real. Esses dois movimentos antagônicos, o estágio supremo da produção mercantil e o projeto de sua negação total, igualmente ricos de contradições em simesmos, crescem em conjunto. São os dois lados pelos quais se manifesta um mesmo momento histórico há muito tempo esperado e freqüentemente previsto sob figuras parciais inadequadas: a impossibilidade da continuação do funcionamento do capitalismo.

A época que tem todos os meios técnicos de alterar as condições de vida na Terra é igualmente a época que, pelo mesmo desenvolvimento técnico e científico separado, dispõe de todos os meios de controle e de previsão matematicamente indubitável para medir com exatidão antecipada para onde conduz — e em que data — o crescimento automático das forças produtivas alienadas da sociedade de classes: isto é, para medir a degradação rápida das condições de sobrevida, no sentido o mais geral e o mais trivial do termo.

Enquanto imbecis passadistas ainda dissertam sobre, e contra, uma crítica estética de tudo isso, e crêem mostrar-se lúcidos e modernos por se mostrarem esposados com seu século, proclamando que a auto-estrada ou Sarcelles têm sua beleza que se deveria preferir ao desconforto dos “pitorescos” bairros antigos ou ainda fazendo observar gravemente que o conjunto da população come melhor, a despeito das nostalgias da boa cozinha, já o problema da degradação da totalidade do ambiente natural e humano deixou completamente de se colocar no plano da pretensa qualidade antiga, estética ou outra, para se tornar radicalmente o próprio problema da possibilidade material de existência do mundo que persegue um tal movimento. A impossibilidade está de fato já perfeitamente demonstrada por todo o conhecimento científico separado, que discute somente sua data de vencimento; e os paliativos que, se fossem aplicados firmemente, a poderiam regular superficialmente. Uma tal ciência apenas pode acompanhar em direção à destruição o mundo que a produziu e que a mantém; mas ela é obrigada a fazê-lo com os olhos abertos. Ela mostra assim, num nível caricatural, a inutilidade do conhecimento sem uso.

Mede-se e se extrapola com uma precisão excelente o aumento rápido da poluição química da atmosfera respirável, da água dos rios, dos lagos e até mesmo dos oceanos; e o aumento irreversível da radioatividade acumulada pelo desenvolvimento pacífico da energia nuclear, dos efeitos do barulho, da invasão do espaço por produtos de materiais plásticos que podem exigir uma eternidade de depósito universal, da natalidade louca, da falsificação insensata dos alimentos, da lepra urbanística que se estende sempre mais no lugar do que antes foram a cidade e o campo; assim como as doenças mentais — aí compreendidas as fobias neuróticas e as alucinações que não poderiam deixar de se multiplicar bem cedo sobre o tema da própria poluição, da qual se mostra em todo lugar a imagem alarmante — e do suicídio, cujas taxas de expansão se entrecruzam já exatamente com as de edificação de um tal ambiente (para não falar dos efeitos da guerra atômica ou bacteriológica, cujos meios estão posicionados como a espada de Dâmocles, mas permanecem evidentemente evitáveis).

Logo, se a amplitude e a própria realidade dos “terrores do Ano Mil” são ainda um assunto controverso entre os historiadores, o terror do Ano Dois Mil é tão patente quanto bem fundado; ele é desde o presente uma certeza científica. Contudo, o que se passa não é em si mesmo nada novo: é somente o fim necessário do antigo processo. Uma sociedade cada vez mais doente, mas cada vez mais poderosa, recriou em todo lugar concretamente o mundo como ambiente e décorde sua doença, enquanto planeta doente. Uma sociedade que não se tornou ainda homogênea e que não é mais determinada por si mesma, mas cada vez maispor uma parte dela mesma que lhe é superior, desenvolveu um movimento de dominação da natureza que contudo não se dominou a si mesmo. O capitalismo finalmente trouxe a prova, por seu próprio movimento, de que ele não pode mais desenvolver as forças produtivas; e isso não quantitativamente, como muitos acreditaram compreender, mas qualitativamente.

Contudo, para o pensamento burguês, metodologicamente, somente o quantitativo é o sério, o mensurável, o efetivo; e o qualitativo é somente a incerta decoração subjetiva ou artística do verdadeiro real estimado em seu verdadeiro peso. Ao contrário, para o pensamento dialético, portanto, para a história e para o proletariado, o qualitativo é a dimensão a mais decisiva do desenvolvimento real. Eis aí o que o capitalismo e nós terminamos por demonstrar.

Os senhores da sociedade são obrigados agora a falar da poluição, tanto para combatê-la (pois eles vivem, apesar de tudo, no mesmo planeta que nós; é este o único sentido ao qual se pode admitir que o desenvolvimento do capitalismo realizou efetivamente uma certa fusão das classes) e para a dissimular, pois a simples verdade dos danos e dos riscos presentes basta para constituir um imenso fator de revolta, uma exigência materialista dos explorados, tão inteiramente vital quanto o foi a luta dos proletários do século XIX pela possibilidade de comer. Após o fracasso fundamental de todos os reformismos do passado — que aspiram todos eles à solução definitiva do problema das classes —, um novo reformismo se desenha, que obedece às mesmas necessidades que os precedentes: lubrificar a máquina e abrir novas oportunidades de lucros às empresas de ponta. O setor mais moderno da indústria se lança nos diferentes paliativos da poluição, como em um novo nicho de mercado, tanto mais rentável quanto mais uma boa parte do capital monopolizado pelo Estado nele está a empregar e a manobrar. Mas se este novo reformismo tem de antemão a garantia de seu fracasso, exatamente pelas mesmas razões que os reformismos passados, ele guarda em face deles a radical diferença de que não tem mais tempo diante de si.

The development of production has up to now been entirely verified as the fulfillment of political economy: the development of poverty, which has invaded and spoiled the very milieu of life. The society in which the producers kill themselves with labor, and may only contemplate its result, lets them plainly see, and breathe, the general result of alienated labor as a result of death. In the society of the overdeveloped economy everything has entered the sphere of economic goods, even spring water and the air of the cities; which is to say that everything has become the economic evil, that "complete negation of man" which now reaches its perfect material conclusion. The conflict between the modern productive forces and the relations of production, bourgeois or bureaucratic, of capitalist society has entered its final phase. The production of non-life has pursued its linear and cumulative process ever further; having now crossed a final threshold in its progress, it directly produces death.

The ultimate function, avowed and essential, of today's developed economy, throughout a world ruled by the commodity-labor that assures all power to its masters, is the production of jobs. We are far from the "progressive" ideas of the previous century [the nineteenth] about a possible reduction of human labor through the scientific and technical multiplication of productivity, which was supposed to assure, ever more easily, the satisfaction of needs previously recognized by all as real, without any fundamental alteration in the very quality of the goods made available. It is now in order to produce jobs, even in a countryside emptied of peasants, that is, in order to use human labor as alienated labor, as wage labor, that everything else is done; and it is therefore for this that the foundations of the life of the species, at present more fragile even than the thought of a Kennedy or a Brezhnev, are stupidly threatened.

The old ocean is in itself indifferent to pollution; but history is not. History can be saved only by the abolition of commodity-labor. And never has historical consciousness had so urgent a need to master its world, for the enemy at its door is no longer illusion but its own death.

When the poor masters of a society whose deplorable conclusion we now witness, far worse than all the condemnations the most radical of the utopians once hurled at it, must at present recognize that our environment has become social, that the management of everything has become a directly political matter, down to the grass of the fields and the possibility of drinking, down to the possibility of sleeping without too many sleeping pills or of bathing without coming out in allergies, then it must also be seen that the old specialized politics has to acknowledge that it is completely finished.

It is finished in the supreme form of its voluntarism: the totalitarian bureaucratic power of the so-called socialist regimes, for the bureaucrats in power have shown themselves incapable of managing even the preceding stage of the capitalist economy. If they pollute much less (the United States alone produces 50 percent of the world's pollution), it is because they are much poorer. They can only, as China does for example, by committing in bulk a disproportionate share of their accounting of misery, buy their share of the prestige pollution of poor powers: a few discoveries and refinements in the techniques of thermonuclear war, or more exactly of its menacing spectacle. So much poverty, material and mental, sustained by so much terrorism, condemns the bureaucracies in power. And what condemns the most modernized bourgeois power is the unbearable result of so much riches effectively befouled. The so-called democratic management of capitalism, in whatever country, offers only its elections-cum-dismissals which, as has always been seen, never changed anything in the whole, and very little even in the details, of a class society that imagined it could last indefinitely. They change nothing more at the very moment when management itself goes mad and pretends to want, in order to settle certain secondary though urgent problems, some vague directives from an alienated and cretinized electorate (the U.S.A., Italy, England, France). Specialized observers have always noted, without troubling to explain it, the fact that the voter never changes his "opinion": this is precisely because he is a voter, one who assumes for a brief instant the abstract role designed precisely to prevent him from being himself and from changing (the mechanism has been demonstrated hundreds of times, by demystified political analysis as much as by the explanations of revolutionary psychoanalysis).
The voter does not change even when the world changes ever more precipitously around him, and, as a voter, he would not change even on the eve of the end of the world. Every representative system is essentially conservative, even though the conditions of existence of capitalist society have never been able to be conserved: they modify themselves without interruption, and ever faster, but the decision (which is in the end always the decision to let the very process of capitalist production run free) is left entirely to the specialists of publicity, whether they stand alone in the contest or compete with others who will do the same thing, and who moreover announce it openly. Yet the man who votes "freely" for the Gaullists or for the P.C.F., just like the man who votes under compulsion and constraint for a Gomulka, is capable of showing what he really is the following week by taking part in a wildcat strike or an insurrection.

The self-proclaimed "struggle against pollution," in its statist and legalistic aspect, will at first create new specializations, ministerial services, posts, bureaucratic promotion. And its effectiveness will be entirely in keeping with such means. It can become a real will only by transforming the present productive system at its very roots. And it can be firmly applied only from the moment when all its decisions, taken democratically and in full knowledge of the facts by the producers, are at every instant controlled and executed by the producers themselves (ships, for example, will infallibly go on dumping their oil into the sea so long as they are not placed under the authority of real sailors' soviets). In order to decide and execute all this, the producers must become adult: they must all seize power.

The scientific optimism of the nineteenth century collapsed on three essential points. First, the pretension of guaranteeing revolution as the happy resolution of existing conflicts (this was the Hegelian-leftist and Marxist illusion, the one least noticed in the bourgeois intelligentsia, but the richest and, in the end, the least illusory). Second, the coherent vision of the universe, or even simply of matter. Third, the euphoric and linear feeling for the development of the productive forces. If we master the first point, we will have resolved the third; and we will later know how to make the second our occupation and our game. It is not the symptoms that must be treated, but the disease itself. Today fear is everywhere; we will escape it only by trusting in our own strength, in our capacity to destroy every existing alienation and every image of the power that has escaped us. By remitting everything, except ourselves, to the sole power of workers' councils, possessing and reconstructing at every instant the totality of the world; that is, to true rationality, to a new legitimacy.

In matters of the "natural" and built environment, of birth rates, of biology, of production, of "madness" and so forth, the choice will not be between festival and misfortune but, consciously and at every crossroads, between a thousand happy or disastrous possibilities, relatively correctable, on the one hand, and nothingness on the other. The terrible choices of the near future leave only this alternative: total democracy or total bureaucracy. Those who doubt total democracy must strive to prove it for themselves, giving it the chance to prove itself in action; otherwise they have only to buy their tomb on the installment plan, for "authority: we have seen it at work, and its works condemn it" (Joseph Déjacque).

"Revolution or death": this slogan is no longer the lyrical expression of rebellious consciousness; it is the last word of the scientific thought of our century [the twentieth]. It applies to the perils facing the species as much as to the impossibility of adherence for individuals. In this society, where suicide advances as everyone knows, the specialists had to acknowledge, with a certain chagrin, that it fell to almost nothing in May 1968. That spring also won, without exactly taking it by storm, a fine sky, because a few cars were burned and all the others lacked the fuel to pollute. When it rains, when there are clouds over Paris, never forget that it is the government's fault. Alienated industrial production makes the rain. Revolution makes the fine weather.

Written in 1971 by Guy Debord for publication in issue no. 13 of the journal Internationale Situationniste, this article remained unpublished until recently, when it appeared, together with two other texts by the same author, in La Planète malade (Paris: Gallimard, 2004, pp. 77-94). The Portuguese translation of "O planeta doente" reproduced here first appeared at http://juralibertaire.over-blog.com/article-13908597.html, translated by Emiliano Aquino (http://emilianoaquino.blogspot.com/).

Source: http://culturaebarbarie.org/sopro/arquivo/planetadoente.html

19 Climate Games that Could Change the Future (Climate Interactive Blog)

March 9, 2012 – 10:13 a.m.

The prevalence of games in our culture provides an opportunity to increase understanding of our global challenges. In 2008 the Pew Research Center estimated that over half of American adults played video games, and that 80% of young Americans did. The vast majority of these games serve purely to entertain. There is, however, a growing number of games that aim to make a difference. They range from games that show players the complexity of creating adequate aid packages and delivering them to places in need, to games that require people to get out and improve their communities in order to do well in the game.

Looking at the climate change challenge, there are a number of games and interactive tools that broaden our understanding of the dynamics involved. Climate Interactive, for one, has led the development of the role-playing game World Climate, which simulates the UN climate change negotiations and is being adopted everywhere from middle school classrooms to executive management courses. Many are recognizing the power of games, and everyone from government agencies to NGOs to a group of teenagers is trying to launch a game to help address climate change. Below are some of the climate and sustainability-related games we've found. Let us know if you've found others.

Computer Games:

1. Climate Challenge: The player acts as a European leader who must make decisions for their nation to reduce CO2 emissions, but must also keep in mind public and international approval, energy, food, and financial needs.

2. Fate of the World: A PC game that challenges players to solve the crises facing the Earth from natural disasters and climate change to political uprisings and international relations.

3. CEO2: A game that puts players at the head of a company in one of four industries. The player must then make decisions to reduce the CO2 and maintain (and increase) the company’s value.

4. VGas: Users build a house and select the best furnishing and lifestyle choices to have the lowest carbon footprint.

5. CO2FX: A multi-player educational game, designed for students in high school, which explores the relationship of climate change to economic, political, and science policy decisions.

6. “Operation: Climate Control” Game: A multi-player computer game where the player’s role is to decide on local environmental policy for Europe through the 21st century.

7. My2050: An interactive game for determining a scenario in which the UK reduces its CO2 emissions to 20% of 1990 levels by 2050. The user can make adjustments in sectors from energy to transit.

8. Plan it Green: Gamers act as city planners, revitalizing their town into a greener one through energy retrofits, clean-energy jobs, and green building.

9. Logicity: A game that challenges players to reduce their carbon footprints by making decisions in a virtual city.

10. Electrocity: A game designed for school children in New Zealand to plan a city that balances the needs of energy, development, and the environment.

11. Climate Culture: A virtual social networking game based on players’ actual carbon footprints and lifestyle choices. Players compete to earn badges and awards for their decisions.

12. World Without Oil: An alternate reality game that was played out on blogs and other social media platforms for 32 weeks in 2007 by thousands of players to simulate what might happen if there was an oil crisis and oil became inaccessible. Participants wrote blogs and made videos about their experience as if it was real.

13. SimCity 5 (coming 2013): With over 20 years of experience and millions of players the SimCity series has captured imaginations by putting players in control of developing cities. Recently announced, SimCity 5 will add among other things the need to face sustainability challenges like climate change, limited natural resources, and urban walkability.

Role-playing Games:

14. World Climate Exercise: A role-playing game for groups that simulates the UN climate change negotiations, dividing the group into regional and national negotiating teams that must reach a treaty limiting warming to 2 degrees or less.

15. “Stabilization Wedge” Game: A game to show participants the different ways to cut carbon emissions, through the concept of wedges.

Board Games:

16. Climate Catan: Building on the widely popular board game Settlers of Catan, this version adds oil as a resource that spurs development but, if overused, triggers a climate-related disaster that can ruin that development.

17. Climate-Poker: A card game in which the aim is to assemble the largest climate conference in order to address climate change.

18. Keep Cool (Gambling with the Climate): Players take on the roles of national political leaders trying to address climate change; they must make decisions about the type of growth they pursue while balancing the desires of lobby groups and the challenges of natural disasters.

19. Polar Eclipse Game: A game in which players navigate a series of decisions to chart a path to a future that avoids the worst temperature rise.

Lessons from Gaming for Climate Wonks and Leaders — Video

Games can help us ensure that climate and energy analysis gets used to make a difference. Last week at the Climate Prediction Applications Science Workshop in Miami, Climate Interactive co-director Drew Jones gave a keynote presentation to an audience of climate analysts, many of whom are working to communicate the massive amount of climate data to the public.

In Drew’s speech below, he draws out the key things that we are learning from games, like Angry Birds, Farmville, World of Warcraft, and the existing efforts to integrate climate change into games. Also included in this presentation, but left out of the video, was a condensed version of the World Climate Exercise, a game that Climate Interactive has developed to help people explore the complex dynamics encountered at the international climate change negotiations.

New report reveals how corporations undermine science with fake bloggers and bribes (io9)

BY ANNALEE NEWITZ

MAR 9, 2012 2:22 PM

You've probably heard about how the tobacco industry tried to suppress scientific evidence that smoking causes cancer by publishing shady research, bribing politicians, and pressuring researchers. But you may not have realized that tobacco's dirty tricks are just the tip of the iceberg. In a disturbing new report on corporate corruption of the sciences published by the Union of Concerned Scientists, you'll learn how Monsanto hired a public relations team to invent fake people who harassed a scientific journal online, how Coca-Cola offered bribes to suppress evidence that soft drinks harm kids' teeth, and more. Here are some of the most egregious recent examples of corruption from this must-read report.

The report is a meaty assessment of corporate corruption in science that stretches back to incidents with Big Tobacco in the 1960s, up through contemporary examples. Here are just a few of those.

One way that corporations prevent negative information about their products from getting out is by harassing scientists and the journals that publish them. Here’s how Monsanto did it:

Dr. Ignacio Chapela of the University of California–Berkeley and graduate student David Quist published an article in Nature showing that DNA from genetically modified corn was contaminating native Mexican corn. The research spurred immediate backlash. Nature received a number of letters to the editor, including several comments on the Internet from "Mary Murphy" and "Andura Smetacek" accusing the scientists of bias. The backlash prompted Nature to publish an editorial agreeing that the report should not have been published. However, investigators eventually discovered that the comments from Murphy and Smetacek originated with The Bivings Group, a public relations firm that specializes in online communications and had worked for Monsanto. Mary Murphy and Andura Smetacek were found to be fictional names.

Corporations also form front organizations to hide their efforts to undermine science. That’s what happened when producers of unhealthy food got together to cast doubt on the FDA’s recommended health guidelines:

The Center for Consumer Freedom is a nonprofit that targets dietary guidelines recommended by the FDA, other government agencies, medical associations, and consumer advocacy organizations. The center has run ads and owns a website accusing government agencies of overregulation, and has published articles claiming that warnings about high salt intake and other dietary guidelines are based on inadequate science. The center was founded with a $600,000 grant from Philip Morris, but has also received funding from Cargill, National Steak and Poultry, Monsanto, Coca-Cola, and Sutter Home Winery.

Sometimes corporations just go for it and buy off legitimate organizations, as Coca-Cola appears to have done when it paid dentists to stop saying kids shouldn't drink Coke:

In 2003, the American Academy of Pediatric Dentistry accepted a $1 million donation from Coca-Cola. That year, the group claimed that “scientific evidence is certainly not clear on the exact role that soft drinks play in terms of children’s oral disease.” The statement directly contradicted the group’s previous stance that “consumption of sugars in any beverage can be a significant factor…that contributes to the initiation and progression of dental caries.”

Corporations can also unduly influence federal agencies, as ReGen did when they wanted their device approved for trials by the FDA, despite serious medical problems:

ReGen Biologics attempted to gain FDA approval for clinical trials of Menaflex, a device it developed to replace knee cartilage. After an FDA panel rejected the device, the company enlisted four members of Congress from its home state of New Jersey to influence the evaluation process. In December 2007, Senator Frank Lautenberg, Senator Robert Menendez, and Representative Steve Rothman wrote to FDA Commissioner Andrew von Eschenbach asking him to personally look into Menaflex. Soon thereafter, the commissioner met with ReGen executives and heeded the company’s advice to have Dr. Daniel Shultz, head of the FDA’s medical devices division, oversee a new review. The FDA fast-tracked and approved the product despite serious concerns from the scientific community.

If bribery doesn’t work, you can always censor negative results, the way pharmaceutical company Boots did:

Boots commissioned Dr. Betty Dong, a scientist at the University of California–San Francisco, to test the effects of Synthroid, a replacement for thyroid hormone. Boots hoped to reveal that despite its high price, Synthroid was more effective than similar drugs. The company closely monitored the research, and when Dong found that the drug was no more effective than its competitors, instructed her not to publish the results. When she refused to comply, Boots threatened to sue. The company relented only after several years, during which consumers continued to pay for the costly product.

You can also try “refuting” scientific results with bad evidence, the way the formaldehyde industry did:

To counter a study that found that formaldehyde caused cancer in rats, a formaldehyde company commissioned its own study. That study, which found no association between the chemical and cancer, exposed only one-third the number of rats to formaldehyde for half as long as the original study. A formaldehyde association quickly publicized the results and argued before the Consumer Product Safety Commission (CPSC) that they indicated "no chronic health effects from exposure to the level of formaldehyde normally encountered in the home."

And then, if you’re Pfizer, you can just generate as much favorable research as you like to bolster sales of a drug, despite your discovery that the drug increases risk of suicide:

From 1998 to 2007, Pfizer discreetly facilitated the publication of 15 case studies, six case reports, and nine letters to the editor to boost off-label use of Neurontin, a drug prescribed to treat seizures in people who have epilepsy and nerve pain. The number of patients taking the drug rose from 430,000 to 6 million, making it one of Pfizer’s most profitable products. An investigation found that Pfizer had failed to publish negative results, selectively reported outcomes, and excluded specific patients from analysis. [Most importantly] Pfizer failed to note that the drug increased the risk of suicide.

Read the full report here, which includes sources for these stories, as well as an extensive section devoted to reforming scientific practices. There are ways we can avoid this kind of corruption, and they involve everything from federal reforms to corporate transparency.

[via Union of Concerned Scientists]

Science, Journalism, and the Hype Cycle: My piece in tomorrow’s Wall Street Journal (Discover Magazine)

I think one of the biggest struggles a science writer faces is how to accurately describe the promise of new research. If we start promising that a preliminary experiment is going to lead to a cure for cancer, we are treating our readers cruelly–especially the readers who have cancer. On the other hand, scoffing at everything is not a sensible alternative, because sometimes preliminary experiments really do lead to great advances. In the 1950s, scientists discovered that bacteria can slice up virus DNA to avoid getting sick. That discovery led, some 30 years later, to biotechnology–to an industry that enabled, among other things, bacteria to produce human insulin.

This challenge was very much on my mind as I recently read two books, which I review in tomorrow's Wall Street Journal. One is on gene therapy, a treatment that inspired wild expectations in the 1990s, then crashed, and is now coming back. The other is on epigenetics, which seems to me to be in the early stages of the hype cycle. You can read the essay in full here. [see post below]

March 9th, 2012 5:33 PM by Carl Zimmer

Hope, Hype and Genetic Breakthroughs (Wall Street Journal)

By CARL ZIMMER

I talk to scientists for a living, and one of my most memorable conversations took place a couple of years ago with an engineer who put electrodes in bird brains. The electrodes were implanted into the song-generating region of the brain, and he could control them with a wireless remote. When he pressed a button, a bird singing in a cage across the lab would fall silent. Press again, and it would resume its song.

I could instantly see a future in which this technology brought happiness to millions of people. Imagine a girl blind from birth. You could implant a future version of these wireless electrodes in the back of her brain and then feed it images from a video camera.

As a journalist, I tried to get the engineer to explore what seemed to me to be the inevitable benefits of his research. To his great credit, he wouldn’t. He wasn’t even sure his design would ever see the inside of a human skull. There were just too many ways for it to go wrong. He wanted to be very sure that I understood that and that I wouldn’t claim otherwise. “False hope,” he warned me, “is a sinful thing.”

Stephen Voss. Gene therapy allowed this once-blind dog to see again.

Over the past two centuries, medical research has yielded some awesome treatments: smallpox wiped out with vaccines, deadly bacteria thwarted by antibiotics, face transplants. But when we look back across history, we forget the many years of failure and struggle behind each of these advances.

This foreshortened view distorts our expectations for research taking place today. We want to believe that every successful experiment means that another grand victory is weeks away. Big stories appear in the press about the next big thing. And then, as the years pass, the next big thing often fails to materialize. We are left with false hope, and the next big thing gets a reputation as the next big lie.

In 1995, a business analyst named Jackie Fenn captured this intellectual whiplash in a simple graph. Again and again, she had seen new advances burst on the scene and generate ridiculous excitement. Eventually they would reach what she dubbed the Peak of Inflated Expectations. Unable to satisfy their promise fast enough, many of them plunged into the Trough of Disillusionment. Their fall didn’t necessarily mean that these technologies were failures. The successful ones slowly emerged again and climbed the Slope of Enlightenment.

When Ms. Fenn drew the Hype Cycle, she had in mind dot-com-bubble technologies like cellphones and broadband. Yet it’s a good model for medical advances too. I could point to many examples of the medical hype cycle, but it’s hard to think of a better one than the subject of Ricki Lewis’s well-researched new book, “The Forever Fix”: gene therapy.

The concept of gene therapy is beguilingly simple. Many devastating disorders are the result of mutant genes. The disease phenylketonuria, for example, is caused by a mutation to a gene involved in breaking down a molecule called phenylalanine. The phenylalanine builds up in the bloodstream, causing brain damage. One solution is to eat a low-phenylalanine diet for your entire life. A much more appealing alternative would be to somehow fix the broken gene, restoring a person’s metabolism to normal.

In “The Forever Fix,” Ms. Lewis chronicles gene therapy’s climb toward the Peak of Inflated Expectations over the course of the 1990s. A geneticist and the author of a widely used textbook, she demonstrates a mastery of the history, even if her narrative sometimes meanders and becomes burdened by clichés. She explains how scientists learned how to identify the particular genes behind genetic disorders. They figured out how to load genes into viruses and then to use those viruses to insert the genes into human cells.

Stephen Voss. Alisha Bacoccini is tested on her ability to read letters, at UPenn Hospital, in Philadelphia, PA on Monday, June 23, 2008. Bacoccini is undergoing an experimental gene therapy trial to improve her sight.

By 1999, scientists had enjoyed some promising successes treating people—removing white blood cells from leukemia patients, for example, inserting working genes, and then returning the cells to their bodies. Gene therapy seemed as if it was on the verge of becoming standard medical practice. “Within the next decade, there will be an exponential increase in the use of gene therapy,” Helen M. Blau, the then-director of the gene-therapy technology program at Stanford University, told Business Week.

Within a few weeks of Ms. Blau’s promise, however, gene therapy started falling straight into the Trough. An 18-year-old man named Jesse Gelsinger who suffered from a metabolic disorder had enrolled in a gene-therapy trial. University of Pennsylvania scientists loaded a virus with a working version of an enzyme he needed and injected it into his body. The virus triggered an overwhelming reaction from his immune system and within four days Gelsinger was dead.

Gene therapy nearly came to a halt after his death. An investigation revealed errors and oversights in the design of Gelsinger’s trial. The breathless articles disappeared. Fortunately, research did not stop altogether. Scientists developed new ways of delivering genes without triggering fatal side effects. And they directed their efforts at one part of the body in particular: the eye. The eye is so delicate that inflammation could destroy it. As a result, it has evolved physical barriers that keep the body’s regular immune cells out, as well as a separate battalion of immune cells that are more cautious in their handling of infection.

It occurred to a number of gene-therapy researchers that they could try to treat genetic vision disorders with a very low risk of triggering horrendous side effects of the sort that had claimed Gelsinger’s life. If they injected genes into the eye, they would be unlikely to produce a devastating immune reaction, and any harmful effects would not be able to spread to the rest of the body.

Their hunch paid off. In 2009 scientists reported their first success with gene therapy for a congenital disorder. They treated a rare form of blindness known as Leber’s congenital amaurosis. Children who were once blind can now see.

As “The Forever Fix” shows, gene therapy is now starting its climb up the Slope of Enlightenment. Hundreds of clinical trials are under way to see if gene therapy can treat other diseases, both in and beyond the eye. It still costs a million dollars a patient, but that cost is likely to fall. It’s not yet clear how many other diseases gene therapy will help or how much it will help them, but it is clearly not a false hope.

Gene therapy produced so much excitement because it appealed to the popular idea that genes are software for our bodies. The metaphor only goes so far, though. DNA does not float in isolation. It is intricately wound around spool-like proteins called histones. It is studded with caps made of carbon and hydrogen atoms, known as methyl groups. This coiling and capping of DNA allows individual genes to be turned on and off during our lifetimes.

The study of this extra layer of control on our genes is known as epigenetics. In “The Epigenetics Revolution,” molecular biologist Nessa Carey offers an enlightening introduction to what scientists have learned in the past decade about those caps and coils. While she delves into a fair amount of biological detail, she writes clearly and compellingly. As Ms. Carey explains, we depend for our very existence as functioning humans on epigenetics. We begin life as blobs of undifferentiated cells, but epigenetic changes allow some cells to become neurons, others muscle cells and so on.

Epigenetics also plays an important role in many diseases. In cancer cells, genes that are normally only active in embryos can reawaken after decades of slumber. A number of brain disorders, such as autism and schizophrenia, appear to involve the faulty epigenetic programming of genes in neurons.

Scientists got their first inklings about epigenetics decades ago, but in the past few years the field has become hot. In 2008 the National Institutes of Health pledged $190 million to map the epigenetic “marks” on the human genome. New biotech start-ups are trying to carry epigenetic discoveries into the doctor’s office. The FDA has approved cancer drugs that alter the pattern of caps on tumor-cell DNA. Some studies on mice hint that it may be possible to treat depression by taking a pill that adjusts the coils of DNA in neurons.

People seem to be getting giddy about the power of epigenetics in the same way they got giddy about gene therapy in the 1990s. No longer, the thinking goes, is our destiny written in our DNA: it can be completely overwritten with epigenetics. The excitement is moving far ahead of what the science warrants—or can ever deliver. Last June, an article on the Huffington Post eagerly seized on epigenetics, woefully mangling two biological facts: one, that experiences can alter the epigenetic patterns in the brain; and two, that sometimes epigenetic patterns can be passed down from parents to offspring. The article made a ridiculous leap to claim that we can use meditation to change our own brains and the brains of our children—and thereby alter the course of evolution: "We can jump-start evolution and leverage it on our own terms. We can literally rewire our brains toward greater compassion and cooperation." You couldn't ask for a better sign that epigenetics is climbing the Peak of Inflated Expectations at top speed.

The title “The Epigenetics Revolution” unfortunately adds to this unmoored excitement, but in Ms. Carey’s defense, the book itself is careful and measured. Still, epigenetics will probably be plunging soon into the Trough of Disillusionment. It will take years to see whether we can really improve our health with epigenetics or whether this hope will prove to be a false one.

The Forever Fix

By Ricki Lewis
St. Martin's, 323 pages, $25.99

The Epigenetics Revolution

By Nessa Carey
Columbia, 339 pages, $26.95

—Mr. Zimmer's books include "A Planet of Viruses" and "Evolution: Making Sense of Life," co-authored with Doug Emlen, to be published in July.

Nature journal criticizes Canadian ‘muzzling’ (CBC News)

Time for Canadian government to set its scientists free, magazine says

The Canadian Press

Posted: Mar 2, 2012 7:08 AM ET

Last Updated: Mar 2, 2012 12:54 PM ET

One of the world's leading scientific journals is criticizing the Harper government for 'muzzling' federal scientists.

One of the world's leading scientific journals is accusing the Harper government of preventing its scientists from speaking publicly about their research.

The journal, Nature, says in an editorial in this week’s issue that it’s time for the Canadian government to set its scientists free.

Nature says Canada is headed in the wrong direction in not letting its scientists speak out freely. It notes that Canada and the United States have undergone role reversals in the past six years.

It says the U.S. has adopted more open practices since the end of George W. Bush’s presidency, while Canada has gone in the opposite direction.

Nature says policy directives on government communications released through access to information requests reveal the Harper government has little understanding of the importance of the free flow of scientific knowledge.

Two weeks ago, the Canadian Science Writers’ Association, the World Federation of Science Journalists and several other groups sent an open letter to Harper, calling on him to unmuzzle federal scientists.

The letter cited a couple of high-profile examples, including one last fall when Environment Canada barred Dr. David Tarasick from speaking to journalists about his ozone layer research when it was published in Nature.

Could Many Universities Follow Borders Bookstores Into Oblivion? (The Chronicle of Higher Education)

March 7, 2012, 7:44 pm
By Marc Parry

Atlanta — Higher education’s spin on the Silicon Valley garage. That was the vision laid out in September, when the Georgia Institute of Technology announced a new lab for disruptive ideas, the Center for 21st Century Universities. During a visit to Atlanta last week, I checked in to see how things were going, sitting down with Richard A. DeMillo, the center’s director and Georgia Tech’s former dean of computing, and Paul M.A. Baker, the center’s associate director. We talked about challenges and opportunities facing colleges at a time of economic pain and technological change—among them the chance that many universities might follow Borders Bookstores into oblivion.

Q. You recently wrote that universities are “bystanders” at the revolution happening around them, even as they think they’re at the center of it. How so?

Mr. DeMillo: It’s the same idea as the news industry. Local newspapers survived most of the last century on profits from classified ads. And what happened? Craigslist drove profits out of classified ads for local newspapers. If you think that it’s all revolving around you, and you’re going to be able to impose your value system on this train that’s leaving the station, that’s going to lead you to one set of decisions. Think of Carnegie Mellon, with its “Four Courses, Millions of Users” idea [which became the Open Learning Initiative], or Yale with the humanities courses, thinking that what the market really wants is universal access to these four courses at the highest quality. And really what the market is doing is something completely different. The higher-education market is reinventing what a university is, what a course is, what a student is, what the value is. I don’t know why anyone would think that the online revolution is about reproducing the classroom experience.

Q. So what is the revolution about?

Mr. DeMillo: You don’t know where events are going to take higher education. But if you want to be an important institution 20 years from now, you have to position yourself so that you can adapt to whatever those technology changes are. Whenever you have this kind of technological change, where there’s a large incumbency, the incumbents are inherently at a disadvantage. And we’re the incumbents.

Q. What are some of the most important changes happening now?

Mr. DeMillo: What you’re seeing, for example, is technology enabling a single master teacher to reach students on an individualized basis on a scale that is unprecedented. So when Sebastian Thrun offers his Intro to Robotics course and gets 150,000 students—that’s a big deal.

Why is it a big deal? Well, because people who want to learn robotics want to learn from the master. And there's something about the medium that he uses that makes that connection intimate. It's not the same kind of connection that you get by pointing a camera at the front of the room and letting someone write on a whiteboard. These guys have figured out how to design a way of explaining the material that connects with people at scale. So Stanford all of a sudden becomes a place with a network of stakeholders that's several orders of magnitude larger than it was 10 years ago. Every one of those students in India who wants to connect to Stanford—connect to a mentor—now has a way to connect by bypassing their local institutions. Every institution that can't offer a robotics course now has a way of offering a robotics course.

I think what you see happening now with the massive open courses is going to fundamentally change the business models. It’s going to put the notion of value front and center. Why would I want a credential from this university? Why would I want to pay tuition to this university? It really ups the stakes.

Mr. Baker: There used to be something called Borders, you may remember. Think of Borders the bookstore, and then think of "X, Y, Z University" the bookstore. If you've got Amazon as an analogue for these massively open courses, there is still a model where people actually go into bookstores because sometimes they want to touch, or they like hanging out, or there's other value offered by that. What it means is that the university needs to rethink what it's doing, how it's doing it.

And how it innovates in a way of surviving in the face of this. If I can do the Amazon equivalent of this open course, why should I come here? Well, maybe you shouldn’t. And that’s a client that is lost.

Mr. DeMillo: All you have to do is add up the amount of money spent on courses. Just take an introduction to computer science. Add up the amount of money that’s spent nationwide on introductory programming courses. It’s a big number, I’ll bet. What is the value received for that spend? If, in fact, there’s a large student population that can be served by a higher-quality course, what’s the argument for spending all that money on 6,000 introduction to programming courses?

Q. You really think that many universities could go the way of Borders?

Mr. DeMillo: Yeah. Well, you can see it already. We lost, in this university system, four institutions this year.

Mr. Baker: The University System of Georgia merged four institutions into other ones that were geographically within 50 miles. The programs essentially were replicated. And in an environment in which you’ve got reduced resources, you can’t afford to have essentially identical programs 50 miles apart.

Q. So what sort of learning landscape do you think might emerge?

Mr. DeMillo: One thing that you might see is highly tuned curricula, students being able to select from a range of things that they want to learn and a range of mentors that they want to interact with, whether you think of it as hacking degrees or pulling assessments from a menu of different universities. What does that mean for the individual university? It means that a university has to figure out where its true value sits in that landscape.

Mr. Baker: Another thing we’re looking at is development of a value index to try to calculate, to be vulgar, the return on investment. Our idea is to try to figure out ways of determining what constitutes value for a student, based on four or five personas. So for, let’s say, a mom returning at 50 who wants an education—she’s going to value certain things differently than a 17-year-old rocket scientist coming to Tech who wants to get through in three years and knows exactly what she wants to do.

Mr. DeMillo: Jeff Selingo wrote a column about this, having one place to go to figure out the economic value of a degree from a university. It's a great idea, but why focus only on the paycheck as an economic value? There are lots of indicators of value. Do students from this university go on to graduate school in disproportionately large numbers? Do they get fellowships? Are they people who stay in their profession for a long period of time? You start to build up a picture of what students tell you, of what alumni tell you, was the value of that education. Can we pull these metrics together and then say something interesting about our institution and by extension others?

Q. What other projects is your center working on right now?

Mr. DeMillo: The Khan Academy—small bursts of knowledge that may or may not be included in a curriculum—was a really interesting idea.

Can students generate this kind of material in a way that's useful for other students? That's the genesis of our TechBurst competition [in which students create short videos that explain a single topic].

It turns out there's a lot of interest on the part of the students at Georgia Tech in teaching what they know to their peers. The interesting part of the project is the unexpected things that you get. We had a discussion yesterday about mistakes. This is student-generated stuff, so is it right? Not all the time. Which causes great angst on the part of traditionalists, because now we have a Georgia Tech TechBurst video that has errors in it. If these were instructional videos that we were marketing, that would be a very big deal. But they're not. They're the start of a thread of conversation among students. There's one on gerrymandering. So it's a political-science video, it's cutely produced, but in some sense it's not exactly right. And so what you would expect is now other students will come along and annotate that video, and say, well, that's not exactly what gerrymandering is. And you'll start to see this students-teaching-students peer-tutoring process taking place in real time.

Q. What about the massive open online course Georgia Tech will run in the fall?

Mr. DeMillo: The idea of a massive open course is something that people normally apply to introductory courses. What happens when you look at a massive open advanced seminar? A seminar room with 10,000 students, 50,000 students—what does that even mean? We've got some people here that have been blogging for quite a while about advanced topics. In fact, one of the blogs—Gödel's Lost Letter, by Professor Dick Lipton of Georgia Tech, and Ken Regan of the University at Buffalo—is about advanced computer theory, so it's a very mathematical blog. It's in the top 0.1 percent of WordPress blogs. A typical day is 5,000 to 10,000 page views. A hot day is 100,000. The question is: Can we take this blogging format and turn it into an online seminar?

Q. How would that work?

Mr. DeMillo: The blog is essentially an expression of a master teacher’s understanding of a field to people that want to learn about it. We think that there are some very simple layers that can be built under the existing blogging format that can essentially turn it into a massive open online seminar. It’s also a way of conducting scientific research. When you think about what happens in this blog, it celebrates the process of scientific discovery. I’ll just give you one example. Last year about this time some industrial scientist claimed that he had solved one of the outstanding problems in this area. In the normal course of events, the scientist would have written up the paper, would have sent it to a conference. It would have been refereed. Nine months later the paper would have been presented at the conference. People would have talked about it. It would have been written up to submit to a journal. Refereeing would have taken a couple of years for that. Well, the paper got submitted to Lipton’s blog. It just caused a flurry of activity. So thousands and thousands of scientists flocked to this paper, and essentially speeded up the refereeing of the paper, shortening the time from five years to a couple of weeks. It turns out that people came to believe that the claim was not valid, and the paper was incorrect. But what an education for future research students. You get to see the process of scientific discovery in action.

This is an interesting bookend to the idea of a massive open course. Because the people that are thinking about the massive open online courses for introductory material have a set of considerations. Students are at different levels of achievement. Assessment is very important. The credentialing process is dictated by whether or not you want credit. If you go to the other end of the curriculum, and say, well, what happens when we try to do these advanced courses at scale, credentialing is completely different. Assessment is completely different. You can’t rely on the same automation that you could in the introductory courses. Social networks become extremely important if you’re going to do this stuff at scale, because one professor can’t deal with 100,000 readers. He has to have a network of trusted people who would be able to answer questions. The anticipation is that a whole new set of problems would come up with these kinds of courses.

This conversation has been edited and shortened.

Farmers in Mozambique trying to adapt farming to climate change (PRI.org)

Published 29 January, 2012 11:15:00 Living on Earth

Rui Alberto Campira hoes the soil. He’s part of a group of farmers who received a grant from Save the Children to grow cash crops. (Photo by Rowan Moore Gerety.)

As rainfall in Mozambique becomes less predictable and less suited to subsistence farming, aid groups and the local government are trying to help some farmers change the way they farm so they are not paralyzed by a flood or a drought. But there is a lot of work to do.

Over the past two decades, Mozambique has suffered more than its fair share of weather disasters.

The east African nation has seen more devastating cyclones, droughts and floods than any country on the continent. Farmers in Mozambique have been particularly hard hit. This year alone, torrential rains in the mountains sent flood waters onto fields below, submerging tens of thousands of acres of crops.

And now, farmers are in the midst of another rainy season, which started in December.

Officials at Mozambique’s National Institute for Disaster Management have to prepare for rescue operations this time of year. Figueredo de Araujo, the institute’s information manager, said the emergency operations center is equipped with rescue boats as well as warehouses with various goods for humanitarian assistance: maize flour, tents, tarps, boots and rain coats among them.

Caia, where Mozambique's main highway crosses the Zambezi River, sits in the middle of a vast, flat floodplain that is home to nearly a million people. In 2000, the area was hit by the worst flooding in memory. The floods killed 700 people, displaced 100,000 and cost Mozambique 1.5 percent of GDP through the destruction of crops.

To Belem Monteiro, the emergency center’s director, much of Mozambique’s misfortune is a matter of geography.

“The fact that we have a problem is not news to us: given its location, Mozambique could only be vulnerable to these changes in climate,” Monteiro said.

Nearly 80 percent of Mozambican families are subsistence farmers, relying on rain-fed agriculture to produce their food. After the 2000 floods, farmers near the Zambezi River repeatedly lost their homes and crops.

“In the past, it happened every five years, now we have annual emergencies, which shows that the situation has changed,” Monteiro said.

But that has presented a major challenge for the disaster management institute, which was conceived to intervene during freak emergencies but has been forced to evolve into a permanent mission.

Some 30 miles from Caia, a resettlement zone called Tchetcha Um is home to some 5,000 families who were moved to higher ground. The organization Save the Children has partnered with the government in a program promoting livelihood resilience by helping families diversify their income sources, said Clemente Lourenço, a project officer for the group.

Farmer Rui Alberto Campira received a grant from Save the Children in 2009, which enabled him and 11 other farmers to build a 5-acre farm where they can grow crops both for consumption at home and for sale at the local market. Campira says the soil is great for cash crops.

“It’s good. Especially for tomatoes. Tomatoes, onions, cabbage, collard greens. That’s what we usually plant here. There we only plant maize. Maize and sweet potatoes,” Campira said of his former home.

The land he’s farming now will also flood during the rainy season, but the irrigation system the grant enabled him to install allows him to farm during the dry season, when cash crops would typically die.

About 55 associations like Campira's have formed in Caia district, not just growing cash crops but also trading in fish, beans and clothing, and using animal traction to plow fields. Save the Children funds about 4,500 farmers across three provinces.

Joao Novage is raising seven goats as part of another association. The grant originally bought 40 goats, which have in turn borne another 20.

“When I see that I have 12 or 13 goats, I’ll take four and sell them to buy school supplies and clothes for my children. Children are our wealth. They’ll bring a better future for us,” Novage said.

Though the projects have been wildly successful, everyone admits they serve an insignificant portion of the population at this point. It remains to be seen whether they can be expanded to make a measurable difference in the hunger and poverty in this part of east Africa.

Colombia prosecutors question ‘shaman rain payment’ (BBC)

18 January 2012 Last updated at 16:49 GMT

By Arturo Wallace
BBC Mundo, Bogota

The tournament, won by Brazil, was held across Colombia with the final in Bogota

Colombian prosecutors are investigating why organisers paid a “shaman” $2,000 (£1,400) to keep rain away from the closing ceremony of the Fifa U-20 World Cup held in the country last year.

The inquiry was launched after cost overruns totalling $1m came to light.

But the focus of their questions is a 64-year-old man who says he uses dowsing to stave off or attract rain.

The event’s organisers defended their decision to use him, noting that the final event was indeed rain-free.

The “rain-stopper” in question, Jorge Elias Gonzalez, has been dubbed a “shaman” or medicine man by the Colombian media.

A dark joke doing the rounds in the capital, Bogota, asks why the shaman was not also hired to minimise the impact of the last rainy season, which killed 477 people and affected some 2.6 million Colombians.

Yet more cynical voices have said that, given the corruption allegations involving the Bogota authorities in recent years, Mr Gonzalez should be praised as the only contractor to deliver what he promised.

The spectacular closing ceremony in Bogota’s El Campin stadium on 20 August last year remained dry – a stark contrast with the opening event in Barranquilla a month earlier that was drenched.

Ana Marta de Pizarro, the anthropologist and theatre director who was in charge of the ceremony, used this argument to defend the hiring of a rain stopper.

“Had it rained, the event would not have taken place. It didn’t rain on the ceremony, it was successful and I would use him again if I needed to,” she said.

And Ms Pizarro also said Mr Gonzalez had been hired in the past to ensure Bogota’s International Theatre Festival was rain-free.

In an interview with a local radio station on Wednesday, Mr Gonzalez said he was also hired to keep the rain away from the swearing-in ceremony of President Juan Manuel Santos.

This has, as yet, neither been confirmed nor denied by the president’s office.

Respect

Prosecutors are adamant that Mr Gonzalez’s contract will be investigated.

The procurement law requires efficiency and professionalism in all service providers paid for by public funds “and that doesn’t include shamans”, a statement from the local comptroller’s office said.

“We’ll ask him to explain in which circumstances, how and where he can stop rain,” said the deputy prosecutor, Juan Carlos Forero.

The debate has also drawn in those who want to make sure no public funds are used to pay for any sort of religious rites, and those who want the traditions of indigenous Colombians to be treated with more respect.

In a bizarre twist to the dispute, Mr Gonzalez has always insisted that he is not a shaman.

“I’m not indigenous, so don’t call me a shaman, for I don’t even know what that is. Nor am I a wizard,” he told a local newspaper several years ago.

Mr Gonzalez has said that he can stop or attract rain using dowsing, although he also prays.

Anthropologist Mauricio Pardo believes that by describing him as a shaman, the Colombian media might end up belittling an important indigenous tradition.

“And those traditions deserve to be respected. Even our constitution demands so,” he told BBC Mundo.

Rain prophets of the sertão, by Rachel de Queiroz

"Go, for example, to the northeastern sertão in the months of November and December. There, the people never take their eyes off the sky, searching for omens. Small clouds at the western horizon… small ones, of course, since it is not yet the season for the big ones; but if they gather toward the south, it means one thing; if they appear in the west, the story changes. The only thing they will not say is what that thing will be: like all the diviners of the world, they like to wrap themselves in mystery. And those innocent clouds are so white they might be made of nothing but ice and snow; no, they have nothing to do with rain, they are mere ornaments of the sky…" (p. 13)

QUEIROZ, Rachel. Existe outra saída, sim. Fortaleza: Fundação Demócrito Rocha, 2003.

Politics hindering scientists on climate change (The Seattle Times)

Sunday, December 25, 2011 – Page updated at 08:00 p.m.

By JUSTIN GILLIS
The New York Times

At the end of one of the most bizarre weather years in U.S. history, climate research stands at a crossroads.

Scientists say they could, in theory, do a much better job of answering the question “Did global warming have anything to do with it?” after extreme weather events like the drought in Texas and the floods in New England.

But for many reasons, efforts to put out prompt reports on the causes of extreme weather are essentially languishing. Chief among the difficulties that scientists face: The political environment for new climate-science initiatives has turned hostile, and with the federal budget crisis, money is tight.

And so, as the weather becomes more erratic by the year, the public is left to wonder what is going on.

When 2010 ended, it seemed as if people had lived through a startling year of weather extremes. But in the United States, if not elsewhere, 2011 surpassed it.

A typical year in this country features three or four weather disasters whose costs exceed $1 billion each. But this year, the National Oceanic and Atmospheric Administration has tallied a dozen such events, including wildfires in the Southwest, floods in multiple regions of the country and a deadly spring tornado season. And the agency has not finished counting. The final costs are certain to exceed $50 billion.

“I’ve been a meteorologist 30 years and never seen a year that comes close to matching 2011 for the number of astounding, extreme weather events,” Jeffrey Masters, a co-founder of the popular website Weather Underground, said last month. “Looking back in the historical record, which goes back to the late 1800s, I can’t find anything that compares, either.”

Many of the individual events in 2011 do have precedents in the historical record. And the nation’s climate has featured other concentrated periods of extreme weather, including severe cold snaps in the early 20th century and devastating droughts and heat waves in the Dust Bowl era of the 1930s.

But it is unusual, if not unprecedented, for so many extremes to occur in such a short span. The calamities in 2011 included wildfires that scorched millions of acres, extreme flooding in the Upper Midwest and the Mississippi River valley and heat waves that shattered records in many parts of the country. Abroad, huge floods inundated Australia, the Philippines and large parts of Southeast Asia.

A major question nowadays is whether the frequency of particular weather extremes is being affected by human-induced climate change.

Climate science already offers some insight. Researchers have proved the temperature of the Earth’s surface is rising, and they are virtually certain the human release of greenhouse gases, mainly from the burning of fossil fuels, is the major reason. For decades, they have predicted this would lead to changes in the frequency of extreme weather events, and statistics show that has begun to happen.

For instance, scientists have long expected a warming atmosphere would result in fewer extremes of low temperature and more extremes of high temperature. In fact, research shows that about two record highs are being set in the U.S. for every record low, and similar trends can be detected in other parts of the world.

Likewise, a well-understood physical law suggests a warming atmosphere should hold more moisture. Scientists have directly measured the moisture in the air and confirmed it is rising, supplying the fuel for heavier rains, snowfalls and other types of storms.

“We are changing the large-scale properties of the atmosphere — we know that beyond a shadow of a doubt,” said Benjamin D. Santer, a leading climate scientist who works at the Lawrence Livermore National Laboratory in California. “You can’t engage in this vast planetary experiment — warming the surface, warming the atmosphere, moistening the atmosphere — and have no impact on the frequency and duration of extreme events.”

But if the human contribution to heat and precipitation is clear, scientists are on shakier ground analyzing many other events.

Some questions can be answered with focused studies of a specific weather event, but these are often finished years afterward. Lately, scientists have been discussing whether they can do a better job of analyzing events within days or weeks, not years.

“It’s clear we do have the scientific tools and the statistical wherewithal to begin answering these types of questions,” Santer said.

But doing this on a regular basis would probably require new personnel spread across several research teams, along with a strong push by the federal government, which tends to be the major source of financing and direction for climate and weather research. Yet Washington, D.C., is essentially frozen on the subject of climate change.

This year, when the National Oceanic and Atmospheric Administration tried to push through a reorganization that would have provided better climate forecasts to businesses, citizens and local governments, Republicans in the House of Representatives blocked it.

The idea had originated in the Bush administration, was strongly endorsed by an outside review panel and would have cost no extra money. But the House Republicans, many of whom reject the overwhelming scientific consensus about the causes of global warming, labeled the plan an attempt by the Obama administration to start a “propaganda” arm on climate.

In an interview, Jane Lubchenco, the director of NOAA, rejected that claim and said her agency had been deluged with information requests regarding future climate risks. “It’s truly unfortunate that we are not allowed to become more effective and efficient in delivering that information,” she said.

NOAA does finance research to understand the causes of weather extremes, as do the National Science Foundation and the Department of Energy. But with the strains on the federal budget, Lubchenco said, “it’s going to be more and more challenging to devote resources to many of our research programs.”

Copyright © The Seattle Times Company

Occupy the classroom? (Valor Econômico)

JC e-mail 4408, December 19, 2011.

Article by Dani Rodrik published in today's (the 19th) Valor Econômico.

In early November, a group of students walked out of a well-known introductory economics course at Harvard, "Economics 10," taught by my colleague Greg Mankiw. Their complaint: the course propagates conservative ideology disguised as economic science and helps perpetuate social inequality.

The students are part of a growing chorus of protest against modern economics as it is taught at the world's leading academic institutions. Economics has always had its critics, of course, but the financial crisis and its aftermath have given them new ammunition, which seems to validate the old accusations about the profession's unrealistic assumptions, its reification of markets and its disregard for social concerns.

Mankiw, for his part, found the protesting students "poorly informed." Economics has no ideology, he retorted. He cited John Maynard Keynes and pointed out that economics is a method that helps people think more clearly and reach correct answers, with no predetermined political conclusions.

Indeed, although the skepticism of those who have not been immersed in years of advanced study in economics is understandable, the work students do in a typical economics doctoral course produces a bewildering variety of policy prescriptions, depending on the specific context. Some of the frameworks economists use to analyze the world favor the free market, while others do not. In fact, a good part of economic analysis is devoted to understanding how government intervention can improve economic performance. And non-economic motivations and socially cooperative behavior are an increasing part of what economists study.

Como o grande economista internacional Carlos Diaz-Alejandro, já falecido, disse certa vez, “atualmente, qualquer estudante universitário esperto, se escolher suas suposições […] cuidadosamente, pode produzir um modelo consistente, recomendando praticamente quaisquer medidas políticas às quais ele fosse favorável inicialmente”. E isso foi na década de 70! Um economista aprendiz não precisa mais ser particularmente esperto para produzir conclusões de políticas não ortodoxas.

Ainda assim, os economistas precisam aguentar acusações de que não saem das raias ideológicas, porque eles mesmos são seus piores inimigos no que se refere a aplicar suas teorias no mundo real. Em vez de comunicar todo o arsenal de perspectivas que sua disciplina oferece, eles mostram confiança excessiva em soluções em particular – frequentemente aquelas que melhor se encaixam em suas próprias ideologias.

Vejamos a crise financeira mundial. A macroeconomia e as finanças não carecem das ferramentas necessárias para entender como a crise surgiu e se desenrolou. De fato, a literatura acadêmica está repleta de modelos de bolhas financeiras, informações assimétricas, distorções dos incentivos, crises autorrealizáveis e risco sistêmico. Nos anos que levaram à crise, no entanto, muitos economistas menosprezaram as lições desses modelos em favor dos que tratavam sobre a eficiência e o poder de autocorreção dos mercados, o que, na esfera das políticas, resultou em supervisão inadequada dos mercados financeiros pelos governos.

Em meu livro “O Paradoxo da Globalização”, imagino o seguinte experimento. Consiste em que um jornalista ligue a um professor de economia e pergunte se um acordo de livre comércio com o país X ou Y seria uma boa ideia. Podemos ter quase certeza de que o economista, assim como a ampla maioria das pessoas na profissão, se mostrará empolgado em seu apoio ao livre comércio.

Em outra situação, o repórter não se identifica e diz ser um estudante no seminário universitário avançado do professor sobre teoria do comércio internacional. Ele faz a mesma pergunta: O livre comércio é bom? Duvido que a resposta será tão rápida e sucinta. Na verdade, é provável que o professor se sinta bloqueado com a pergunta. “O que você quer dizer com “bom”?”, ele perguntará. “E “bom” para quem?”

O professor, então, entrará em uma longa e cansativa exegese, que acabará culminando em uma declaração pesadamente evasiva: “Então, se a longa lista de condições que acabei de descrever for cumprida e supondo que podemos tributar os beneficiários para compensar os que saíram perdendo, um comércio mais livre tem o potencial para melhorar o bem-estar de todos.” Se estivesse em dia inspirado, o professor poderia até acrescentar que o impacto do livre comércio no índice de crescimento da economia não seria claro e dependeria de um conjunto inteiramente diferente de requisitos.

A afirmação direta e incondicional sobre os benefícios do livre comércio agora foi transformada em uma declaração adornada com todos os tipos de “se” e “mas”. Estranhamente, o conhecimento que o professor transmite de boa vontade e com grande orgulho a seus estudantes avançados é considerado impróprio (ou perigoso) para o público em geral.

O ensino das ciências econômicas no nível universitário sofre do mesmo problema. Em nosso empenho para mostrar as joias da coroa da profissão de forma imaculada – a eficiência do mercado, a mão invisível, a vantagem comparativa – nós pulamos as complicações e nuances do mundo real, tão conhecidas como são na disciplina. É como se os cursos de introdução à física presumissem um mundo sem gravidade, porque assim tudo ficaria muito mais simples.

Aplicadas apropriadamente, com uma dose saudável de senso comum, as ciências econômicas nos teriam preparado para a crise financeira e nos indicado a direção certa para consertar o que a causou. Mas a ciência econômica que precisamos é a do tipo da “sala de seminário” e não a do tipo “geral”. Precisamos das ciências econômicas que reconheçam suas limitações e saibam que a mensagem apropriada depende do contexto.

Negligenciar a diversidade de orientações intelectuais dentro de sua disciplina não torna os economistas melhores analistas do mundo real. Nem os torna mais populares. (Tradução de Sabino Ahumada)

Dani Rodrik é professor de Economia Política na Harvard University e autor de “The Globalization Paradox: Democracy and the Future of the World Economy” (o paradoxo da globalização: democracia e o futuro da economia mundial, em inglês).

Climate summit was a pathetic exercise in deceit (Globe and Mail)

Thomas Homer-Dixon
Last updated Monday, Dec. 12, 2011 10:01AM EST

It was an “emperor-has-no-clothes” moment. The 17-year-old youth delegate rose before the assembled participants at the Durban climate conference and looked them straight in the eye.

“I speak for more than half the world’s population,” declared Anjali Appadurai of Maine’s College of the Atlantic. “We are the silent majority. You’ve given us a seat in this hall, but our interests are not at the table. What does it take to get a stake in this game? Lobbyists? Corporate influence? Money?”

“You have been negotiating all of my life. In that time, you’ve failed to meet pledges, you’ve missed targets, and you’ve broken promises.”

Ms. Appadurai nailed it. There’s really only one label for the pathetic exercise we’ve just witnessed in South Africa: deceit. The whole climate-change negotiation process and the larger political discourse surrounding this horrible problem is a drawn-out and elaborate exercise in lying – to each other, to ourselves, and especially to our children. And the lies are starting to corrupt our civilization from inside out.

The climate negotiators lie to each other and the world when they claim the world can still limit the planet’s warming to two degrees Celsius above the pre-industrial average, the point at which many experts believe the risks from climate change rise sharply.

It’s a lie because we’ve already experienced 0.8 degrees warming, and we’ve got at least another 0.6 degrees on the way due to carbon already in the atmosphere. Given that global carbon dioxide emissions of about 35 billion tons each year are now growing at an average of 3 per cent a year – which means they’re doubling every 23 years – it’s virtually certain we’re going to use up the remaining 0.6 degrees of leeway. In fact, the emerging consensus among climate experts is that we’ll be lucky to limit warming to 4 degrees.
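The doubling-time figure in the paragraph above follows from the standard rule for exponential growth: at a steady annual growth rate r, a quantity doubles in ln(2)/ln(1+r) years. A quick check of the article's arithmetic for 3 per cent annual growth:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity growing at a fixed annual rate to double."""
    return math.log(2) / math.log(1 + annual_growth_rate)

years = doubling_time(0.03)  # 3% per year, as cited in the article
print(round(years, 1))  # 23.4, consistent with "doubling every 23 years"
```

At 3 per cent growth the exact figure is about 23.4 years, so the article's "every 23 years" is a fair rounding.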

India, China, and Brazil lie to their own citizens when they claim that by blocking a climate deal they’re protecting the opportunity for their economies to develop. “Am I to write a blank cheque and sign away the livelihoods and sustainability of 1.2-billion Indians?” asked India’s environment minister, Jayanthi Natarajan.

But this choice is patently false, as senior officials of these countries surely know. It’s not a choice between a climate-change deal and economic development; it’s really a choice of both or neither. If we don’t reduce carbon emissions, the impacts of climate change will eventually devastate the economies of poor countries. Repeated failures of monsoons in India and China or the desiccation of the Amazon basin in Brazil will drive a stake through these countries’ economies. Dealing with climate change is a prerequisite for prosperity this century – for all people on this planet.

The Canadian federal government lies to Canadians when it says we can still meet the government’s stated target of a 17 per cent reduction of emissions below the country’s 2005 level by 2020. Given the projected growth in oil sands output and the Conservatives’ neglect of the climate change file, nobody in the know seriously believes such a target can be achieved.

And we lie to ourselves when we tell ourselves that fixing climate change is someone else’s responsibility, or that the science is too uncertain to justify action, or that we’ll find a technology to solve the problem when it gets serious enough, or that it simply costs too much to do anything.

But most of all we lie to our kids. We tell them we’ve got the climate problem under control, while we’ve actually lost control of it completely. Worse, we tell them that we’re protecting their options for the future, while we’re actually closing down those options to protect powerful political and economic vested interests in the present.

It took a 17-year-old to tell the truth. The rest of us, supposedly adults, should be ashamed.

Thomas Homer-Dixon is the director of the Waterloo Institute for Complexity and Innovation and is the CIGI Chair of Global Systems at the Balsillie School of International Affairs in Waterloo, Ont.

Diplomatic advance, climate setback (O Globo)

JC e-mail 4403, December 12, 2011.

The buy-in of the US, China, and India is the milestone of COP-17. But CO2 cuts remain mere promises.

Almost two days later than scheduled, the United Nations climate change conference in Durban, South Africa, ended in the early hours of yesterday (the 11th) without any new legally binding agreement being signed. In the 36 hours of overtime, representatives of 194 countries agreed to extend the Kyoto Protocol until 2017 and to begin negotiations on a new global treaty that would only come into force in 2020. For analysts, the outcome is a victory for diplomacy, since, for the first time, the US, China, and India agreed to negotiate mandatory targets, but a failure from the climate standpoint. The Durban Platform is an action plan for future negotiations, but it represents a concrete delay in cutting greenhouse gases.

Scientists are practically unanimous that, for the rise in Earth's temperature to stay at the 2 °C level by the end of the century (the threshold above which climate change is considered dangerous), a new global agreement with mandatory emissions-cut targets would already have to come into force by the end of next year, when the Kyoto Protocol expires. Nearly ten years of waiting for mandatory targets ("the lost decade," as environmentalists have already dubbed it) could push the rise in global temperature into the range of 3 °C to 4 °C, with dramatic climatic consequences.

The extension of the Kyoto Protocol until 2017, in turn, is merely symbolic. With the exit of Russia, Japan, and Canada from the accord (which never had US adherence, nor obligations for developing countries), the protocol currently covers only 15% of the planet's emissions. Since, at best, the new agreement will only be implemented in 2020, it is also unclear what treaty will be in force between 2017 and 2020.

Formal negotiations begin in 2012. Even so, participants considered the agreement a great victory for diplomacy. Indeed, it was the first time that the United States, China, and India (the largest CO2 emitters) agreed to negotiate a document with mandatory emissions-cut targets; negotiations would begin as early as next year and run until 2015. Brazil, which ranks among the five largest emitters because of deforestation, had already agreed to the plan of intentions and played a crucial role in the negotiations. If all goes well, it will be the first time the world has a global agreement with legal force and the involvement of all countries.

For Environment Minister Izabella Teixeira, it was a "historic" outcome. President Dilma Rousseff, informed of the result by the minister, said she was satisfied with the outcome of the meeting and praised Brazil's participation.

"The document is extraordinary. It launches a future of international cooperation, with conditions for eventually having all countries under the same legal instrument, opening a new era in the fight against climate change," summed up Ambassador Luiz Alberto Figueiredo, chief negotiator of the Brazilian delegation.

Suzana Kahn Ribeiro, a specialist at Coppe/UFRJ and a member of the UN Intergovernmental Panel on Climate Change (IPCC), takes a different view. "If the negotiators' goal was to have some kind of agreement, not to leave a vacuum, fine, then I can consider the meeting a success. Now, if the goal was to have a solution to global warming, then the conference was a total failure. We have a legal instrument (Kyoto) with no practical value whatsoever and a purely declaratory plan of intentions for 2020," she said.

Sérgio Besserman, an economist and adviser to the city government for Rio+20, agrees with his colleague. "This is a diplomatic negotiation like many others, but the difference in this case is that we have no control over the agenda, which is dictated externally, by the climate. When the debate is about trade, for example, a delay is just a delay. But the climate is not like that; it has its own rhythm. Of course it is preferable to have a plan of intentions, that the towel has not been thrown in, but we are falling considerably behind," he declared.

For Besserman, "the inability of global governance to respond to the scientific knowledge we already have about what lies ahead is frightening." "It is worth remembering that a rise of 3 °C is 50% above what is considered the danger threshold," he noted.

Two of the world's leading environmental organizations, WWF and Greenpeace, condemned the outcome of the conference. "The world deserves a better deal than the weak Durban compromise," said Regine Günther of WWF Germany, noting that the agreement will not prevent the temperature from rising above 2 °C.

For Greenpeace, "the compromise leads not to a binding global treaty for climate protection but to a vague agreement," noting that there are not even sanctions for those who fail to comply with the plan of intentions.

For Eduardo Viola, a political scientist and professor of International Relations at the University of Brasília, the conference outcome is "disastrous" from the climate standpoint. "Everything was postponed to 2020, since this extension of Kyoto is irrelevant; it is the extension of nothing," he summed up. "The result is not historic, as those involved in the negotiations are claiming." He laments the decision to postpone measures until 2020, the notion that something is being done for the climate when science indicates that emissions-reduction measures should already be in force by 2013.

Even so, the specialist says he is optimistic. "Humanity learns through pain," he says, recalling that climate change is still a distant reality for much of the population. "It learns with more pain than it should and over much more time than necessary, but it is not condemned to suicide."

The main points agreed at COP-17
What happened in Durban? 194 countries met for the 17th round of negotiations under the UN Climate Convention, whose goal is to halt global warming by limiting greenhouse gas emissions. The conference ran two days over schedule, making it the longest environmental meeting ever held.

What was achieved? After extremely tough negotiations, the "Durban Platform" was reached. In the two-page document, for the first time, all countries promise to cut emissions. A roadmap will guide countries through negotiations until 2015 so that they reach a legal agreement on cuts. However, it will only take effect in 2020.

Was it an advance or a setback? It depends on the angle. It was a success in terms of keeping the negotiations alive, saving the UN process after it nearly collapsed in Copenhagen and Cancún. The European Union calls the action plan (the Durban Platform) a "historic breakthrough." For the EU, this is the first time the US, China, and India have committed to signing a legal treaty to cut emissions. However, it is a setback in the eyes of many developing countries, environmental groups, and scientists. They argue that the language used needs to be stronger to force countries to act, and that there should be concrete dates for cuts.

And the Kyoto Protocol? It will be extended until 2017, with reduction targets for the EU and a few other developed countries. Japan and Russia had already announced they would leave Kyoto. A new agreement must be negotiated to cover the period until 2020. However, India, China, and the US remain outside: the first two because they have no legal obligation, and the US because it is not a signatory. During this interval, countries such as Brazil, which have voluntary targets, will continue making emissions cuts.

And the money promised in 2010 to help poor countries? The Green Fund created in Cancún is to disburse US$60 billion a year starting in 2020. However, the details of how this will be done are very vague. It is not defined where the money will come from. One possibility is levies on aviation.

And deforestation? REDD, the plan to pay poor countries not to cut down their trees, made little progress. Once again, it was not defined where the money will come from. There are fears that the funds could be lost to corruption. REDD is expected to remain on the negotiating table.

What happens now? Climate rounds are scheduled for March, in London, then in Bonn (Germany), and finally at COP-18 in Qatar in December 2012. Although Rio+20 does not focus on climate, experts believe it will be fundamental in this respect. In 2012 negotiations begin toward an agreement in 2015. This will include targets by country, which are expected to be differentiated. Societies are expected to pressure countries into adopting bolder targets.

The role of trust in social decision-making (FAPESP)

December 8, 2011

By Mônica Pileggi

A study conducted at Mackenzie and published in The Journal of Neuroscience indicates that the brain does not perceive unfairness from friends in economic decision-making situations (Wikimedia)

Agência FAPESP – In economic decision-making situations, friendship is one of the variables that modulate our brain, rendering people unable to feel wronged. That is one of the conclusions of research carried out at the Social and Cognitive Neuroscience Laboratory of Universidade Presbiteriana Mackenzie (UPM) and published in The Journal of Neuroscience.

The work, led by Professor Paulo Sérgio Boggio, research coordinator at UPM's Center for Biological and Health Sciences, was carried out during the master's project "Preliminary study on cognitive potentials in a social decision-making task" by psychologist Camila Campanhã, now a doctoral student at UPM, both supported by FAPESP fellowships.

According to Campanhã, the study aimed to examine the role of trust in social decision-making and its neurobiological bases. For this, she drew on game theory, the branch of applied mathematics that studies strategic situations in which players choose among different actions in an attempt to improve their returns.

Initially developed as a tool for understanding economic behavior and later used even to define nuclear strategies, game theory is now applied across many academic fields. It became a prominent branch of mathematics especially after the 1944 publication of The Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern.

Campanhã, whose study was conducted in collaboration with researchers Ludovico Minati of the Istituto Neurologico "Carlo Besta" (Italy) and Felipe Fregni of Harvard University (United States), says the experiment used the Ultimatum Game, a game employed in neuroeconomics and by researchers of social behavior.

With participants aged 18 to 25, the game was divided into two blocks. In the first, the computer relayed fair and unfair economic proposals from friends (who were in different rooms). In the second, the proposals came from laboratory members unknown to the participants.

The proposal values were classified as fair (50:50), somewhat fair (70:30), and very unfair (80:20 and 90:10). "Participants received the same number of fair and unfair proposals, from both the friend and the stranger, relayed by the computer. We recorded all of the participants' electroencephalographic activity during the experiment," Campanhã told Agência FAPESP.

In this type of experiment, if the person accepts the proposal, both players receive the agreed amounts. If she rejects it, neither receives anything. "From a behavioral standpoint, we observed that people rejected unfair proposals from the stranger far more often than those offered by the friend, in which the friend would come out ahead. Moreover, these people rated their friends as fairer than the strangers," she noted.
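The Ultimatum Game rule described above (accept and the pot is split as proposed; reject and both players get nothing) can be sketched in a few lines. This is a minimal illustration only; the function name and the 10-unit pot are hypothetical and not taken from the study:

```python
def ultimatum_payoffs(pot, proposer_share, accepted):
    """Ultimatum Game payoff rule: if the responder accepts, the pot is
    split as proposed; if she rejects, both players receive nothing."""
    if accepted:
        return proposer_share, pot - proposer_share
    return 0, 0

# A fair 50:50 offer that is accepted:
print(ultimatum_payoffs(10, 5, True))   # (5, 5)
# A very unfair 90:10 offer that is rejected: neither side gets anything.
print(ultimatum_payoffs(10, 9, False))  # (0, 0)
```

The rule is what makes rejection of unfair offers interesting: rejecting is costly for the responder too, so it is usually read as punishment of perceived unfairness.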

The study found a positive inversion in neuroelectric activity in response to friends' proposals. "The expectation was that the signals would turn negative as unfair proposals arrived from the friend. However, the participants did not perceive that unfairness," said Campanhã.

According to her, the positive polarity inversion is related to the satisfaction of receiving something good and fair, a reward above expectations. In that case, dopamine is released. With a negative signal, expectations are violated and the substance is inhibited, generating anger.

"When we ran the analysis to identify the brain area activated at that moment, we observed that the electrical signal appeared in the anterior medial prefrontal cortex. This is an area related to the ability to imagine and try to understand what the other person is thinking and feeling," she said.

"It does not mean that people do not process unfairness, but that processing is different when you trust someone. It is as if there were no need to try to understand what is going on with the other person or what she is feeling," she said.

The article "Responding to Unfair Offers Made by a Friend: Neuroelectrical Activity Changes in the Anterior Medial Prefrontal Cortex" (doi:10.1523/JNEUROSCI.1253-11.2011), by Camila Campanhã and others, is available to subscribers of The Journal of Neuroscience at www.jneurosci.org/content/31/43/15569.full.pdf+html?sid=94d0a3e8-79b9-47a8-89d8-24dcf41750e7.

Human brains unlikely to evolve into a ‘supermind’ as price to pay would be too high (University of Warwick)

University of Warwick

Human minds have hit an evolutionary “sweet spot” and – unlike computers – cannot continually get smarter without trade-offs elsewhere, according to research by the University of Warwick.

Researchers asked the question why we are not more intelligent than we are given the adaptive evolutionary process. Their conclusions show that you can have too much of a good thing when it comes to mental performance.

The evidence suggests that for every gain in cognitive functions, for example better memory, increased attention or improved intelligence, there is a price to pay elsewhere – meaning a highly-evolved “supermind” is the stuff of science fiction.

University of Warwick psychology researcher Thomas Hills and Ralph Hertwig of the University of Basel looked at a range of studies, including research into the use of drugs like Ritalin, which help with attention, studies of people with autism, and a study of the Ashkenazi Jewish population.

For instance, individuals with enhanced cognitive abilities, such as savants, people with photographic memories, and even members of genetically segregated populations with above-average IQ, often suffer from related disorders, such as autism, debilitating synaesthesia, and neural disorders linked to enhanced brain growth.

Similarly, drugs like Ritalin only help people with shorter attention spans, whereas people who don’t have trouble focusing can actually perform worse when they take attention-enhancing drugs.

Dr Hills said: “These kinds of studies suggest there is an upper limit to how much people can or should improve their mental functions like attention, memory or intelligence.

“Take a complex task like driving, where the mind needs to be dynamically focused, attending to the right things such as the road ahead and other road users – which are changing all the time.

“If you enhance your ability to focus too much, and end up over-focusing on specific details, like the driver trying to hide in your blind spot, then you may fail to see another driver suddenly veering into your lane from the other direction.

“Or if you drink coffee to make yourself more alert, the trade-off is that it is likely to increase your anxiety levels and degrade your fine motor control. There are always trade-offs.

“In other words, there is a ‘sweet spot’ in terms of enhancing our mental abilities – if you go beyond that spot – just like in the fairy-tales – you have to pay the price.”

The research, entitled ‘Why Aren’t We Smarter Already: Evolutionary Trade-Offs and Cognitive Enhancements,’ is published in Current Directions in Psychological Science, a journal of the Association for Psychological Science.

CO2 may not warm the planet as much as thought (New Scientist)

19:00 24 November 2011 by Michael Marshall

The climate may be less sensitive to carbon dioxide than we thought – and temperature rises this century could be smaller than expected. That’s the surprise result of a new analysis of the last ice age. However, the finding comes from considering just one climate model, and unless it can be replicated using other models, researchers are dubious that it is genuine.

As more greenhouse gases enter the atmosphere, more heat is trapped and temperatures go up – but by how much? The best estimates say that if the amount of carbon dioxide in the atmosphere doubles, temperatures will rise by 3 °C. This is the “climate sensitivity”.

But the 3 °C figure is only an estimate. In 2007, the Intergovernmental Panel on Climate Change (IPCC) said the climate sensitivity could be anywhere between 2 and 4.5 °C. That means the temperature rise from a given release of carbon dioxide is still uncertain.
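The link between a sensitivity figure and an actual CO2 level is roughly logarithmic: equilibrium warming scales with the base-2 logarithm of the concentration ratio, ΔT = S · log2(C/C0), where S is the sensitivity (warming per doubling). A sketch of that relationship, using the commonly cited pre-industrial baseline of about 280 ppm as an assumption; this simple approximation ignores lags in the climate system:

```python
import math

def equilibrium_warming(co2_ppm, sensitivity=3.0, baseline_ppm=280.0):
    """Rough equilibrium warming (degrees C) for a given CO2 level, using
    the standard logarithmic approximation: `sensitivity` degrees of
    warming per doubling of CO2 above the pre-industrial baseline."""
    return sensitivity * math.log2(co2_ppm / baseline_ppm)

# Doubling CO2 from the baseline recovers the sensitivity itself:
print(round(equilibrium_warming(560), 1))  # 3.0

# The IPCC's 2 to 4.5 degree sensitivity range, for the same doubling:
for s in (2.0, 3.0, 4.5):
    print(s, "->", round(equilibrium_warming(560, sensitivity=s), 1))
```

This is why pinning down the sensitivity matters: the same emissions pathway maps to quite different warming depending on where S falls in the 2 to 4.5 °C range.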

There have been several attempts to pin down the sensitivity. The latest comes from Andreas Schmittner of Oregon State University, Corvallis, and colleagues, who took a closer look at the Last Glacial Maximum around 20,000 years ago, when the last ice age was at its height.

Icy cold

They used previously published data to put together a detailed global map of surface temperatures. This showed that the planet was, on average, 2.2 °C cooler than today. We already know from ice cores that greenhouse gas levels in the atmosphere at the time were much lower than they are now.

Schmittner plugged the atmospheric greenhouse gas concentrations that existed during the Last Glacial Maximum into a climate model and tried to recreate the global temperature patterns. He found that he had to assume a relatively small climate sensitivity of 2.4 °C if the model was to give the best fit.

If climate sensitivity really is so low, global warming this century will be at the lower end of the IPCC’s estimates. Assuming we keep burning fossil fuels heavily, the IPCC estimates that temperatures will rise about 4 °C by 2100, compared with 1980 to 1999. Schmittner’s study suggests the warming would be closer to their minimum estimate for the “heavy burning” scenario, which is 2.4 °C.

Sensitive models

Past climates can help us work out the true climate sensitivity, says Gavin Schmidt of the NASA Goddard Institute for Space Studies in New York City. But he says the results of Schmittner’s study aren’t strong enough to change his mind about the climate sensitivity. “I don’t expect this to impact consensus estimates,” he says.

In particular, the model that Schmittner used in his analysis underestimates the cooling in Antarctica and the mid-latitudes. “The model estimate of the cooling during the Last Glacial Maximum is a clear underestimate,” Schmidt says. “A different model would give a cooler Last Glacial Maximum, and thus a larger sensitivity.”

Schmittner agrees it is too early to draw firm conclusions. Individual climate models all have their own quirks, so he wants to try the experiment with several models to find out if others repeat the result.

Even if the climate sensitivity really is as low as 2.4 °C, Schmittner says that doesn’t mean we are safe from climate change. The Last Glacial Maximum was only 2.2 °C cooler than today, yet there were huge ice sheets, plant life was different, and sea levels were 120 metres lower.

“Very small changes in temperature cause huge changes in certain regions,” Schmittner says. So even if we get a smaller temperature rise than we expected, the knock-on effects would still be severe.

Journal reference: Science, DOI: 10.1126/science.1203513

World on track for nearly 11-degree temperature rise, energy expert says (Washington Post)

By , Published: November 28

The chief economist for the International Energy Agency said Monday that current global energy consumption levels put the Earth on a trajectory to warm by 6 degrees Celsius (10.8 degrees Fahrenheit) above pre-industrial levels by 2100, an outcome he called “a catastrophe for all of us.”

Fatih Birol spoke as delegates from nearly 200 countries convened for the opening day of annual U.N. climate talks in Durban, South Africa.

This year has been an unprecedented one for natural disasters. By the end of June, economic losses totaled $265 billion, according to German reinsurer Munich Re. That easily exceeds the total figure for 2005, which was previously the costliest year.

International climate negotiators have pledged to keep the global temperature rise to 2 degrees Celsius, or 3.6 degrees Fahrenheit, above pre-industrial levels. The Earth has already warmed 0.8 degrees Celsius, or 1.4 degrees Fahrenheit, so far, according to climate scientists.

According to the IEA’s most recent analysis, heat-trapping emissions from the world’s energy infrastructure will lead to a 2-degree Celsius increase in the Earth’s temperature that, as more capacity is added to the system, will climb to 6 degrees Celsius of warming by 2100.

Unless there is a shift away from some of the fossil fuel energy now used for electricity generation and transportation, Birol said, “the world is perfectly on track for a six-degree Celsius increase in temperature.”

“Everybody, even the schoolchildren, knows this is a catastrophe for all of us,” he said at the Carnegie Endowment for International Peace.

Birol spoke in unusually blunt terms about the climate implications of the global energy mix, implications that are disputed by many conservatives in the United States who don’t believe in the connection between human activity and climate change.

David Burwell, who directs the energy and climate program at the Carnegie Endowment, said Birol’s comments have “big implications for capital investment in energy,” though he noted that it will be oil executives and others in the private sector who will drive many of the key decisions.

“We can try to regulate, we can try to incentivize, but ultimately, they’ve got to make the decisions, they’ve got to make the investments,” he said, adding that government officials should engage with the energy industry on this topic. “Now’s the time to have the conversation about investments.”

Burwell added that while the IEA has analyzed energy use and production for years, this is the first year its officials have spoken this publicly about the need to shift gears.

“They’re definitely raising the red flag, because the numbers speak for themselves,” he said. “This is the first year they’ve started stamping their foot and saying, ‘Lookit, listen to us.’ ”

In an interview after his talk, Birol said he believes his agency’s analysis is having an impact in places such as China, which he said would outpace the European Union in per capita carbon emissions by 2015. He added that by 2035, China would outrank the industrialized world as the single biggest overall emitter of greenhouse gases in history.

“They are one of the few countries putting an emphasis on climate change,” Birol said, noting they will experiment next year with putting a price on carbon in some regions.

The U.N. talks, meanwhile, suffered a setback as Canada announced Monday that it would not agree to sign up to a second commitment period under the Kyoto Protocol, the 1997 climate pact that set emissions targets for all major industrialized nations. Canada had pledged to cut its overall greenhouse gas emissions 6 percent by 2012 compared with 1990 levels; as of 2009, its carbon output was 29.8 percent above 1990 levels.