Tag archive: Framing

Neuroscientists’ manifesto on animal consciousness

July 16, 2012

Animal behavior

Almost human (Veja)

Neuroscientists publish a manifesto stating that mammals, birds, and even octopuses are conscious, fueling the debate over animal rights

Marco Túlio Pires

A chimpanzee feeds a golden tiger cub at a small zoo in Samutprakan, Thailand: awareness of its own existence and of the world around it (Rungroj Yongrit/EFE)

Humans are not the only animals that possess consciousness. The claim does not come from radical animal-rights activists; quite the opposite. A group of neuroscientists, with doctorates from renowned institutions such as Caltech, MIT, and the Max Planck Institute, has published a manifesto asserting that neuroscience has advanced to the point where it is no longer possible to exclude mammals, birds, and even octopuses from the group of living beings that possess consciousness. The document, released in Cambridge last Saturday (the 7th), heats up a centuries-old debate among scientists, philosophers, and lawmakers over the nature of consciousness and its implications for the lives of humans and other animals.

Read more: The full text, in English, of the manifesto affirming the existence of consciousness in all mammals, birds, and other creatures, such as octopuses

Presented to NASA this Thursday, the manifesto contains no new neuroscience findings; it is a compilation of research in the field. It nonetheless represents an unprecedented stance on the capacity of other beings to perceive their own existence and the world around them. In an interview with VEJA’s website, Philip Low, creator of the iBrain (the device that recently made it possible to read the brain waves of physicist Stephen Hawking) and one of the organizers of the movement, explains that over the past 16 years neuroscience has found that the brain areas that distinguish humans from other animals are not the ones that produce consciousness. “The brain structures responsible for the processes that generate consciousness in humans and other animals are equivalent,” he says. “We therefore concluded that these animals also possess consciousness.”

What is consciousness?

FOR PHILOSOPHY
Philosophically, consciousness is the understanding a creature has of itself and of its place in nature. Certain attributes define consciousness, such as being sentient, that is, sensing the world around you and reacting to it; being alert or awake; or being aware of oneself (which, for philosophy, is already enough to include some “non-linguistic” animals among conscious beings). Source: Stanford Encyclopedia of Philosophy

FOR SCIENCE
Science treats consciousness as our perceptions of the world and our bodily sensations, together with thoughts, memories, actions, and emotions: everything that escapes the brain’s automatic processes and reaches our attention. The contents of consciousness are usually studied with brain imaging, comparing which stimuli reach our attention and which do not. As the neuroscientist Bernard Baars put it in 1987, the brain is like a theater in which most neural events are unconscious and happen “backstage,” while a few enter conscious processing, that is, reach the “stage.”

Recent studies, such as those by researcher Diana Reiss of Hunter College in the United States (one of the scientists who signed the manifesto), show that dolphins and elephants can also recognize themselves in a mirror. This ability is an important criterion for determining whether a being is conscious. The same goes for chimpanzees and birds. The neuroscientists analyzed other kinds of behavior as well. “When your dog is in pain or happy to see you, there is evidence that structures in its brain are activated that are similar to those activated when we show fear, pain, or pleasure,” Low says.

Animal personhood – Saying that animals have consciousness could have far-reaching implications for society and for how animals are treated. Steven Wise, an American lawyer specializing in animal law, says the manifesto comes at a good time. “The role of lawyers and legislators is to turn scientific conclusions like this one into legislation that will help organize society,” he said in an interview with VEJA’s website. Wise leads the Nonhuman Rights Project, coordinating a group of 70 professionals who are compiling information, cases, and case law in order to file the first lawsuit seeking to have the status of certain animals, such as great apes, African grey parrots, and dolphins, recognized as equivalent to that of humans.

The Cambridge manifesto gives Wise’s group more ammunition to win its case. “We want these animals to be granted fundamental rights, for the courts to see them as persons in the legal sense.” According to the lawyer, this means these animals would have rights to bodily integrity and liberty, for example. “We have to stop thinking that these animals exist to serve human beings,” Wise argues. “They have intrinsic value, independent of how we value them.”

A moral question – The manifesto does not spell the end of zoos or steakhouses, much less of medical research on animals. It has, however, already been enough to prompt reflection and behavioral change among scientists, including Low himself. “I am considering becoming a vegetarian,” he says. “We now have to call on our ingenuity to develop technologies that allow us to build a society ever less dependent on animals.” Low is referring mainly to medical research. To study life, science still has to take many lives. According to the neuroscientist, the world spends 20 billion dollars a year to kill 100 million vertebrates. Of the medicinal molecules produced by all that money and death, only 6% ever reach testing in humans. “It’s terrible accounting,” Low says.

Even so, animal research remains necessary. American endocrinologist Michael Conn, author of The Animal Research War (not published in Brazil), argues that prioritizing the human species is a choice. “Concepts such as consent and autonomy only make sense within a moral code that applies to humans, not to animals,” he said in an interview with VEJA’s website. “Our obligation to animals is to see that they are properly cared for and do not suffer or feel pain, not to treat them as if they were human, which would be a fiction,” he argues. “If we could do medical research using only a computer, that would be great. But the truth is that it is not yet possible.”

The intelligence of octopuses

The video shows a range of situations in which an octopus solves problems, from capturing prey inside different kinds of containers to escaping from extremely difficult places. These situations show that the animal can devise solutions to specific problems, which, in the neuroscientists’ view, indicates a state of intelligent consciousness.

*   *   *

“It is no longer possible to say we didn’t know,” says Philip Low (Veja)

Interview

A neuroscientist explains why researchers came together to sign a manifesto acknowledging the existence of consciousness in all mammals, birds, and other creatures, such as the octopus, and how this finding could affect society

Marco Túlio Pires


The brain structures responsible for producing consciousness are analogous in humans and other animals, neuroscientists say (Thinkstock)

Canadian neuroscientist Philip Low rose to prominence in science news after presenting a project in partnership with 70-year-old physicist Stephen Hawking. Low wants to help Hawking, who has been completely paralyzed for 40 years by a degenerative disease, communicate with his mind. The research results were revealed last Saturday (the 7th) at a conference in Cambridge. The meeting’s main purpose, however, was another: there, neuroscientists from around the world signed a manifesto stating that all mammals, birds, and other creatures, including octopuses, have consciousness. Stephen Hawking attended the signing dinner as guest of honor.

Philip Low: “All mammals and birds have consciousness” (Publicity photo)

Low is a researcher at Stanford University and at MIT (Massachusetts Institute of Technology), both in the United States. He and 25 other researchers hold that the brain structures that produce consciousness in humans also exist in animals. “The brain areas that distinguish us from other animals are not the ones that produce consciousness,” says Low, who gave the following interview to VEJA’s website:

Behavioral studies already hold that various animals possess some degree of consciousness. What does neuroscience say about this? We discovered that the structures that distinguish us from other animals, such as the cerebral cortex, are not responsible for the manifestation of consciousness. In short, if the rest of the brain is responsible for consciousness, and those structures are similar in humans and other animals, such as mammals and birds, we conclude that those animals also possess consciousness.

Which animals have consciousness? We know that all mammals, all birds, and many other creatures, such as the octopus, possess the nervous structures that produce consciousness. That means these animals suffer. It is an inconvenient truth: it was always easy to claim that animals have no consciousness. Now we have a group of respected neuroscientists who study the phenomenon of consciousness, animal behavior, neural networks, and the anatomy and genetics of the brain. It is no longer possible to say we didn’t know.

Is it possible to measure how similar the consciousness of mammals and birds is to that of humans? The manifesto leaves that open. We have no metric, given the nature of our approach. We know there are different kinds of consciousness. We can say, however, that the ability to feel pain and pleasure is very similar in mammals and humans.

What kind of animal behavior supports the idea that they are conscious? When a dog is afraid, in pain, or happy to see its owner, structures are activated in its brain that are similar to those activated in humans when we show fear, pain, or pleasure. One very important behavior is self-recognition in a mirror. Among the animals that can do this, besides humans, are dolphins, chimpanzees, bonobos, dogs, and a bird species, the magpie (Pica pica).

What benefits could come from understanding consciousness in animals? There is some irony here. We spend a great deal of money looking for intelligent life off the planet while we are surrounded by conscious intelligence here on Earth. If we consider that an octopus, with 500 million neurons (humans have 100 billion), manages to produce consciousness, we are much closer to producing a synthetic consciousness than we thought. It is far easier to build a model with 500 million neurons than with 100 billion. In other words, building such synthetic models may now be easier.

What is the manifesto’s ambition? Have neuroscientists become militants for the animal-rights movement? It is a delicate question. Our role as scientists is not to tell society what to do, but to make public what we see. Society will now have a discussion about what is happening and may decide to draft new laws, fund more research to understand animal consciousness, or protect animals in some way. Our role is to report the data.

Did the manifesto’s conclusions have any impact on your own behavior? I think I’m going to become a vegan. It is impossible not to be moved by this new understanding of animals, especially of their experience of suffering. It will be hard; I love cheese.

What could change as a result of this finding? The data are disturbing, but very important. In the long run, I think society will depend less on animals. It will be better for everyone. Let me give an example. The world spends 20 billion dollars a year killing 100 million vertebrates in medical research. The probability that a drug arising from those studies will even be tested in humans (just tested; it may not even work) is 6%. It is terrible accounting. A first step is to develop noninvasive approaches. I don’t believe we need to take lives in order to study life. I think we need to call on our own ingenuity and develop better technologies that respect animal life. We have to put technology in a position where it serves our ideals instead of competing with them.

Argentine Invasion (Radiolab)

Monday, July 30, 2012 – 10:00 PM

From a suburban sidewalk in southern California, Jad and Robert witness the carnage of a gruesome turf war. Though the tiny warriors doing battle clock in at just a fraction of an inch, they have evolved a surprising, successful, and rather unsettling strategy of ironclad loyalty, absolute intolerance, and brutal violence.

Drawing of an Argentine Ant

(Adam Cole/WNYC)

David Holway, an ecologist and evolutionary biologist from UC San Diego, takes us to a driveway in Escondido, California where a grisly battle rages. In this quiet suburban spot, two groups of ants are putting on a chilling display of dismemberment and death. According to David, this battle line marks the edge of an enormous super-colony of Argentine ants. Think of that anthill in your backyard, and stretch it out across five continents.

Argentine ants are not good neighbors. When they meet ants from another colony, any other colony, they fight to the death, and tear the other ants to pieces. While other kinds of ants sometimes take slaves or even have sex with ants from different colonies, the Argentine ants don’t fool around. If you’re not part of the colony, you’re dead.

According to evolutionary biologist Neil Tsutsui and ecologist Mark Moffett, the flood plains of northern Argentina offer a clue as to how these ants came to dominate the planet. Because of the frequent flooding, the homeland of Linepithema humile is basically a bootcamp for badass ants. One day, in the late 1800s, a couple of ants from one of these Argentine ant families made their way onto a boat and landed in New Orleans. Over the last century, these Argentine ants have wreaked havoc across the southern U.S. and a significant chunk of coastal California.

In fact, Melissa Thomas, an Australian entomologist, reveals that these Argentine ants are even more well-traveled than we expected – they’ve made it to every continent except Antarctica. No matter how many thousands of miles separate individual ants, when researchers place two of them together – whether they’re plucked from Australia, Japan, Hawaii … even Easter Island – they recognize each other as belonging to the same super-colony.

But the really mind-blowing thing about these little guys is the surprising success of their us-versus-them death-dealing. Jad and Robert wrestle with what to make of this ant regime, whether it will last, and what, if anything, it might mean for other warlike organisms with global ambitions.

Information Overload in the Era of ‘Big Data’ (Science Daily)

ScienceDaily (Aug. 20, 2012) — Botany is plagued by the same problem as the rest of science and society: our ability to generate data quickly and cheaply is surpassing our ability to access and analyze it. In this age of big data, scientists facing too much information rely on computers to search large data sets for patterns that are beyond the capability of humans to recognize — but computers can only interpret data based on the strict set of rules in their programming.

New tools called ontologies provide the rules computers need to transform information into knowledge, by attaching meaning to data, thereby making those data retrievable by computers and more understandable to human beings. Ontology, from the Greek word for the study of being or existence, traditionally falls within the purview of philosophy, but the term is now used by computer and information scientists to describe a strategy for representing knowledge in a consistent fashion. An ontology in this contemporary sense is a description of the types of entities within a given domain and the relationships among them.

A new article in this month’s American Journal of Botany by Ramona Walls (New York Botanical Garden) and colleagues describes how scientists build ontologies such as the Plant Ontology (PO) and how these tools can transform plant science by facilitating new ways of gathering and exploring data.

When data from many divergent sources, such as data about some specific plant organ, are associated or “tagged” with particular terms from a single ontology or set of interrelated ontologies, the data become easier to find, and computers can use the logical relationships in the ontologies to correctly combine the information from the different databases. Moreover, computers can also use ontologies to aggregate data associated with the different subclasses or parts of entities.

For example, suppose a researcher is searching online for all examples of gene expression in a leaf. Any botanist performing this search would include experiments that described gene expression in petioles and midribs or in a frond. However, a search engine would not know that it needs to include these terms in its search — unless it was told that a frond is a type of leaf, and that every petiole and every midrib are parts of some leaf. It is this information that ontologies provide.
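The leaf example can be sketched in a few lines of code. The mini-ontology below is illustrative only (it is not the real Plant Ontology or its API); it shows how `is_a` and `part_of` relations let a search for “leaf” automatically pull in fronds, petioles, and midribs.

```python
# Minimal sketch of ontology-driven query expansion, using the
# article's leaf example. Relations are illustrative, not the
# actual Plant Ontology.

RELATIONS = {
    # term: list of (relation, parent_term) pairs
    "frond":   [("is_a", "leaf")],      # a frond is a type of leaf
    "petiole": [("part_of", "leaf")],   # every petiole is part of some leaf
    "midrib":  [("part_of", "leaf")],   # every midrib is part of some leaf
}

def expand(term: str) -> set[str]:
    """Terms a search for `term` should also match: everything that is
    transitively a subclass (is_a) or a part (part_of) of it."""
    matches = {term}
    changed = True
    while changed:  # iterate to a fixed point (transitive closure)
        changed = False
        for child, links in RELATIONS.items():
            if child not in matches and any(p in matches for _, p in links):
                matches.add(child)
                changed = True
    return matches

# A query for gene expression "in a leaf" now covers all four terms:
print(sorted(expand("leaf")))  # ['frond', 'leaf', 'midrib', 'petiole']
```

A real ontology would distinguish the logical force of `is_a` from `part_of` and hold many thousands of terms, but the search engine’s job is the same: expand the query through the relation graph before matching.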

The article in the American Journal of Botany by Walls and colleagues describes what ontologies are, why they are relevant to plant science, and some of the basic principles of ontology development. It includes an overview of the ontologies that are relevant to botany, with a more detailed description of the PO and the challenges of building an ontology that covers all green plants. The article also describes four key areas of plant science that could benefit from the use of ontologies: (1) comparative genetics, genomics, phenomics, and development; (2) taxonomy and systematics; (3) semantic applications; and (4) education. Although most of the examples in this article are drawn from plant science, the principles could apply to any group of organisms, and the article should be of interest to zoologists as well.

As genomic and phenomic data become available for more species, many different research groups are embarking on the annotation of their data and images with ontology terms. At the same time, cross-species queries are becoming more common, causing more researchers in plant science to turn to ontologies. Ontology developers are working with the scientists who generate data to make sure ontologies accurately reflect current science, and with database developers and publishers to find ways to make it easier for scientists to associate their data with ontologies.

Journal Reference:

R. L. Walls, B. Athreya, L. Cooper, J. Elser, M. A. Gandolfo, P. Jaiswal, C. J. Mungall, J. Preece, S. Rensing, B. Smith, D. W. Stevenson. Ontologies as integrative tools for plant science. American Journal of Botany, 2012; 99 (8): 1263. DOI: 10.3732/ajb.1200222

Politics and Prejudice Explored (Science Daily)

ScienceDaily (Aug. 20, 2012) — Research has associated political conservatism with prejudice toward various stereotyped groups. But research has also shown that people select and interpret evidence consistent with their own pre-existing attitudes and ideologies. In this article, Chambers and colleagues hypothesized that, contrary to what some research might indicate, prejudice is not restricted to a particular political ideology.

Rather, the conflicting values of liberals and conservatives give rise to different kinds of prejudice, with each group favoring other social groups that share their values. In the first study, three diverse groups of participants rated the ideological position and their overall impression of 34 different target groups.

Participants’ impressions fell in line with their ideology. For example, conservatives expressed more prejudice than liberals against groups that were identified as liberal (e.g., African-Americans, homosexuals), but less prejudice against groups identified as conservative (e.g., Christian fundamentalists, business people).

In the second and third studies, participants were presented with 6 divisive political issues and descriptions of racially diverse target persons for each issue. Neither liberals’ nor conservatives’ impressions of the target persons were affected by the race of the target, but both were strongly influenced by the target’s political attitudes.

From these findings the researchers conclude that prejudices commonly linked with ideology are most likely derived from perceived ideological differences and not from other characteristics like racial tolerance or intolerance.

Journal Reference:

J. B. Luguri, J. L. Napier, J. F. Dovidio. Reconstruing Intolerance: Abstract Thinking Reduces Conservatives’ Prejudice Against Nonnormative Groups. Psychological Science, 2012; 23 (7): 756. DOI: 10.1177/0956797611433877

 

*   *   *

Prejudice Comes from a Basic Human Need and Way of Thinking, New Research Suggests

ScienceDaily (Dec. 21, 2011) — Where does prejudice come from? Not from ideology, say the authors of a new paper. Instead, prejudice stems from a deeper psychological need, associated with a particular way of thinking. People who aren’t comfortable with ambiguity and want to make quick and firm decisions are also prone to making generalizations about others.

In a new article published in Current Directions in Psychological Science, a journal of the Association for Psychological Science, Arne Roets and Alain Van Hiel of Ghent University in Belgium look at what psychological scientists have learned about prejudice since the 1954 publication of an influential book, The Nature of Prejudice by Gordon Allport.

People who are prejudiced feel a much stronger need to make quick and firm judgments and decisions in order to reduce ambiguity. “Of course, everyone has to make decisions, but some people really hate uncertainty and therefore quickly rely on the most obvious information, often the first information they come across, to reduce it,” Roets says. That’s also why they favor authorities and social norms, which make it easier to make decisions. Then, once they’ve made up their minds, they stick to them. “If you provide information that contradicts their decision, they just ignore it.”

Roets argues that this way of thinking is linked to people’s need to categorize the world, often unconsciously. “When we meet someone, we immediately see that person as being male or female, young or old, black or white, without really being aware of this categorization,” he says. “Social categories are useful to reduce complexity, but the problem is that we also assign some properties to these categories. This can lead to prejudice and stereotyping.”

People who need to make quick judgments will judge a new person based on what they already believe about their category. “The easiest and fastest way to judge is to say, for example, ok, this person is a black man. If you just use your ideas about what black men are generally like, that’s an easy way to have an opinion of that person,” Roets says. “You say, ‘he’s part of this group, so he’s probably like this.'”

It’s virtually impossible to change the basic way that people think. Now for the good news: this same way of thinking can actually be used to reduce people’s prejudice. If people who need quick answers meet people from other groups and like them personally, they are likely to use this positive experience to form their views of the whole group. “This is very much about salient positive information taking away the aversion, anxiety, and fear of the unknown,” Roets says.

Roets’s conclusions suggest that the fundamental source of prejudice is not ideology, but rather a basic human need and way of thinking. “It really makes us think differently about how people become prejudiced or why people are prejudiced,” Roets says. “To reduce prejudice, we first have to acknowledge that it often satisfies some basic need to have quick answers and stable knowledge people rely on to make sense of the world.”

Journal Reference:

Arne Roets and Alain Van Hiel. Allport’s Prejudiced Personality Today: Need for Closure as the Motivated Cognitive Basis of Prejudice. Current Directions in Psychological Science, (in press)

 

*   *   *

Ironic Effects of Anti-Prejudice Messages

ScienceDaily (July 7, 2011) — Organizations and programs have been set up all over the globe in the hopes of urging people to end prejudice. According to a research article, which will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science, such programs may actually increase prejudices.

Lisa Legault, Jennifer Gutsell and Michael Inzlicht, from the University of Toronto Scarborough, were interested in exploring how one’s everyday environment influences people’s motivation toward prejudice reduction.

The authors conducted two experiments which looked at the effect of two different types of motivational intervention — a controlled form (telling people what they should do) and a more personal form (explaining why being non-prejudiced is enjoyable and personally valuable).

In experiment one, participants were randomly assigned one of two brochures to read: an autonomy brochure or a controlling brochure. These brochures discussed a new campus initiative to reduce prejudice. A third group was offered no motivational instructions to reduce prejudice. The authors found that, ironically, those who read the controlling brochure later demonstrated more prejudice than those who had not been urged to reduce prejudice. Those who read the brochure designed to support personal motivation showed less prejudice than those in the other two groups.

In experiment two, participants were randomly assigned a questionnaire, designed to stimulate personal or controlling motivation to reduce prejudice. The authors found that those who were exposed to controlling messages regarding prejudice reduction showed significantly more prejudice than those who did not receive any controlling cues.

The authors suggest that when interventions eliminate people’s freedom to value diversity on their own terms, they may actually be creating hostility toward the targets of prejudice.

According to Dr. Legault, “Controlling prejudice reduction practices are tempting because they are quick and easy to implement. They tell people how they should think and behave and stress the negative consequences of failing to think and behave in desirable ways.” Legault continues, “But people need to feel that they are freely choosing to be nonprejudiced, rather than having it forced upon them.”

Legault stresses the need to focus less on requiring people to reduce prejudice and more on the reasons why diversity and equality are important and beneficial to both majority and minority group members.

Story Source:

The above story is reprinted from materials provided by the Association for Psychological Science, via EurekAlert!, a service of AAAS.

Extreme Weather Linked to Global Warming, Nobel Prize-Winning Scientist Says (Science Daily)

New scientific analysis strengthens the view that record-breaking summer heat, crop-withering drought and other extreme weather events in recent years do, indeed, result from human activity and global warming, Nobel Laureate Mario J. Molina, Ph.D., explains. (Credit: NASA Goddard Space Flight Center. Image by Reto Stöckli (land surface, shallow water, clouds); enhancements by Robert Simmon (ocean color, compositing, 3D globes, animation). Data and technical support: MODIS Land Group; MODIS Science Data Support Team; MODIS Atmosphere Group; MODIS Ocean Group. Additional data: USGS EROS Data Center (topography); USGS Terrestrial Remote Sensing Flagstaff Field Center (Antarctica); Defense Meteorological Satellite Program (city lights).)

ScienceDaily (Aug. 20, 2012) — New scientific analysis strengthens the view that record-breaking summer heat, crop-withering drought and other extreme weather events in recent years do, indeed, result from human activity and global warming, Nobel Laureate Mario J. Molina, Ph.D., said at a conference in Philadelphia on August 20.

Molina, who shared the 1995 Nobel Prize in Chemistry for helping save the world from the consequences of ozone depletion, presented the keynote address at the 244th National Meeting & Exposition of the American Chemical Society.

“People may not be aware that important changes have occurred in the scientific understanding of the extreme weather events that are in the headlines,” Molina said. “They are now more clearly connected to human activities, such as the release of carbon dioxide, the main greenhouse gas, from burning coal and other fossil fuels.”

Molina emphasized that there is no “absolute certainty” that global warming is causing extreme weather events. But he said that scientific insights during the last year or so strengthen the link. Even if the scientific evidence continues to fall short of absolute certainty, the heat, drought, severe storms and other weather extremes may prove beneficial in making the public more aware of global warming and the need for action, said Molina.

“It’s important that people are doing more than just hearing about global warming,” he said. “People may be feeling it, experiencing the impact on food prices, getting a glimpse of what everyday life may be like in the future, unless we as a society take action.”

Molina, who is with the University of California, San Diego, suggested a course of action based on an international agreement like the Montreal Protocol that phased out substances responsible for the depletion of the ozone layer.

“The new agreement should put a price on the emission of greenhouse gases, which would make it more economically favorable for countries to do the right thing. The cost to society of abiding by it would be less than the cost of the climate change damage if society does nothing,” he said.

In the 1970s and 1980s, Molina, F. Sherwood Rowland, Ph.D., and Paul J. Crutzen, Ph.D., established that substances called CFCs in aerosol spray cans and other products could destroy the ozone layer. The ozone layer is crucial to life on Earth, forming a protective shield high in the atmosphere that blocks potentially harmful ultraviolet rays in sunlight. Molina, Rowland and Crutzen shared the Nobel Prize for that research. After a “hole” in that layer over Antarctica was discovered in 1985, scientists established that it was indeed caused by CFCs, and worked together with policymakers and industry representatives around the world to solve the problem. The result was the Montreal Protocol, which phased out the use of CFCs in 1996.

Adopted and implemented by countries around the world, the Montreal Protocol eliminated the major cause of ozone depletion, said Molina, and stands as one of the most successful international agreements. Similar agreements, such as the Kyoto Protocol, have been proposed to address climate change. But Molina said these agreements have largely failed.

Unlike the ozone depletion problem, climate change has become highly politicized and polarizing, he pointed out. Only a small set of substances were involved in ozone depletion, and it was relatively easy to get the small number of stakeholders on the same page. But the climate change topic has exploded. “Climate change is a much more pervasive issue,” he explained. “Fossil fuels, which are at the center of the problem, are so important for the economy, and it affects so many other activities. That makes climate change much more difficult to deal with than the ozone issue.”

In addition to a new international agreement, other things must happen, he said. Scientists need to better communicate the scientific facts underlying climate change. Scientists and engineers also must develop cheap alternative energy sources to reduce dependence on fossil fuels.

Molina said that it’s not certain what will happen to Earth if nothing is done to slow down or halt climate change. “But there is no doubt that the risk is very large, and we could have some consequences that are very damaging, certainly for portions of society,” he said. “It’s not very likely, but there is some possibility that we would have catastrophes.”

Cloud Brightening to Control Global Warming? Geoengineers Propose an Experiment (Science Daily)

A conceptualized image of an unmanned, wind-powered, remotely controlled ship that could be used to implement cloud brightening. (Credit: John McNeill)

ScienceDaily (Aug. 20, 2012) — Even though it sounds like science fiction, researchers are taking a second look at a controversial idea that uses futuristic ships to shoot salt water high into the sky over the oceans, creating clouds that reflect sunlight and thus counter global warming.

University of Washington atmospheric physicist Rob Wood describes a possible way to run an experiment to test the concept on a small scale in a comprehensive paper published this month in the journal Philosophical Transactions of the Royal Society.

The point of the paper — which includes updates on the latest study into what kind of ship would be best to spray the salt water into the sky, how large the water droplets should be and the potential climatological impacts — is to encourage more scientists to consider the idea of marine cloud brightening and even poke holes in it. In the paper, he and a colleague detail an experiment to test the concept.

“What we’re trying to do is make the case that this is a beneficial experiment to do,” Wood said. With enough interest in cloud brightening from the scientific community, funding for an experiment may become possible, he said.

The theory behind so-called marine cloud brightening is that adding particles, in this case sea salt, to the sky over the ocean would form large, long-lived clouds. Clouds appear when water forms around particles. Since there is a limited amount of water in the air, adding more particles creates more, but smaller, droplets.

“It turns out that a greater number of smaller drops has a greater surface area, so it means the clouds reflect a greater amount of light back into space,” Wood said. That creates a cooling effect on Earth.
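The geometry behind Wood's point can be checked with a few lines of arithmetic: for a fixed amount of cloud water, the total surface area of N equal spherical droplets collapses to 3V/r, so halving the droplet radius doubles the reflective area. The sketch below is purely illustrative (the water volume and droplet sizes are made-up values, not figures from the study):

```python
import math

def total_droplet_area(total_volume, radius):
    """Surface area of a fixed water volume split into equal spheres.

    Algebraically this reduces to 3 * V / r: the smaller the droplets,
    the larger the combined surface area.
    """
    n_droplets = total_volume / ((4.0 / 3.0) * math.pi * radius ** 3)
    return n_droplets * 4.0 * math.pi * radius ** 2

water = 1.0e-3                              # cubic meters of cloud water (illustrative)
coarse = total_droplet_area(water, 20e-6)   # 20-micron droplets
fine = total_droplet_area(water, 10e-6)     # 10-micron droplets
print(fine / coarse)                        # halving the radius doubles the total area
```

The same water split into droplets half the size reflects roughly twice as much light, which is the whole lever that cloud brightening tries to pull.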

Marine cloud brightening is part of a broader concept known as geoengineering which encompasses efforts to use technology to manipulate the environment. Brightening, like other geoengineering proposals, is controversial for its ethical and political ramifications and the uncertainty around its impact. But those aren’t reasons not to study it, Wood said.

“I would rather that responsible scientists test the idea than groups that might have a vested interest in proving its success,” he said. The danger with private organizations experimenting with geoengineering is that “there is an assumption that it’s got to work,” he said.

Wood and his colleagues propose trying a small-scale experiment to test feasibility and begin to study effects. The test should start by deploying sprayers on a ship or barge to ensure that they can inject enough particles of the targeted size to the appropriate elevation, Wood and a colleague wrote in the report. An airplane equipped with sensors would study the physical and chemical characteristics of the particles and how they disperse.

The next step would be to use additional airplanes to study how the cloud develops and how long it remains. The final phase of the experiment would send out five to 10 ships spread out across a 100 kilometer, or 62 mile, stretch. The resulting clouds would be large enough so that scientists could use satellites to examine them and their ability to reflect light.

Wood said there is very little chance of long-term effects from such an experiment. Based on studies of pollutants, which emit particles that cause a similar reaction in clouds, scientists know that the impact of adding particles to clouds lasts only a few days.

Still, such an experiment would be unusual in the world of climate science, where scientists observe rather than actually try to change the atmosphere.

Wood notes that running the experiment would advance knowledge around how particles like pollutants impact the climate, although the main reason to do it would be to test the geoengineering idea.

A phenomenon that inspired marine cloud brightening is ship trails: clouds that form behind the paths of ships crossing the ocean, similar to the trails that airplanes leave across the sky. Ship trails form around particles released from burning fuel.

But in some cases ship trails make clouds darker. “We don’t really know why that is,” Wood said.

Despite increasing interest from scientists like Wood, there is still strong resistance to cloud brightening.

“It’s a quick-fix idea when really what we need to do is move toward a low-carbon emission economy, which is turning out to be a long process,” Wood said. “I think we ought to know about the possibilities, just in case.”

The authors of the paper are treading cautiously.

“We stress that there would be no justification for deployment of [marine cloud brightening] unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favor of such action,” they wrote in the paper’s summary.

There are 25 authors on the paper, including scientists from University of Leeds, University of Edinburgh and the Pacific Northwest National Laboratory. The lead author is John Latham of the National Center for Atmospheric Research and the University of Manchester, who pioneered the idea of marine cloud brightening.

Wood’s research was supported by the UW College of the Environment Institute.

Journal Reference:

J. Latham, K. Bower, T. Choularton, H. Coe, P. Connolly, G. Cooper, T. Craft, J. Foster, A. Gadian, L. Galbraith, H. Iacovides, D. Johnston, B. Launder, B. Leslie, J. Meyer, A. Neukermans, B. Ormond, B. Parkes, P. Rasch, J. Rush, S. Salter, T. Stevenson, H. Wang, Q. Wang, R. Wood. Marine cloud brightening. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2012; 370 (1974): 4217. DOI: 10.1098/rsta.2012.0086

Scientists point to problems in press coverage of climate change (Fapesp)

Experts gathered in São Paulo to debate risk management for climate extremes express concern about the difficulties journalists face in dealing with the complexity of the topic (Wikimedia)

21/08/2012

By Fábio de Castro

Agência FAPESP – In the assessment of experts gathered in São Paulo to discuss the management of risks from climate extremes and disasters, adequately managing the impacts of these events requires informing society – including public policymakers – about the findings of climate science.

Researchers are concerned, however, about the difficulties encountered in communicating with society. The complexity of climate studies tends to produce distortions in journalistic coverage of the topic, and the result can be a threat to public confidence in science.

This assessment was made by participants in the workshop “Managing the risks of climate extremes and disasters in Central and South America – what can we learn from the IPCC Special Report on extremes?”, held last week in the São Paulo state capital.

The event was held to debate the conclusions of the Special Report on Managing the Risks of Climate Extremes and Disasters (SREX) – prepared and recently published by the Intergovernmental Panel on Climate Change (IPCC) – and to discuss options for managing the impacts of climate extremes, especially in South and Central America.

The workshop was organized by FAPESP and the National Institute for Space Research (Inpe), in partnership with the IPCC, the Overseas Development Institute (ODI) and the Climate and Development Knowledge Network (CDKN), both of the United Kingdom, with support from the Climate and Pollution Agency of Norway's Ministry of Foreign Affairs.

During the event, communication was debated by IPCC-SREX authors, experts on climate extremes, and managers and leaders of disaster-prevention institutions.

According to Vicente Barros, of the Center for Sea and Atmosphere Research at the University of Buenos Aires and a member of the IPCC, the panel began a restructuring process three years ago that includes a change in its communication strategy.

“From 2009 on, the IPCC came under violent attack, and we were not prepared for that, because our role was to disseminate the knowledge acquired, not to translate it for the press. We now have a group of journalists who try to provide that mediation, but we cannot dilute the information too much, and the final word on how communications are worded always belongs to the executive committee, because the political weight of what the panel says is very great,” Barros said.

Language is a major problem, according to Barros. If it is too complex, it does not reach the public. If it is too simplified, it tends to distort the conclusions and spread views that do not correspond to reality.

“The IPCC deals with very complex problems, and we admit that we cannot produce outreach that reaches everyone. That is a problem. I believe communication should remain in the hands of journalists, but perhaps we need to invest in training initiatives for these professionals,” he said.

Fábio Feldman, of the São Paulo Forum on Climate Change, expressed concern about scientists' difficulty in communicating with the public, which, he said, allows “skeptic” researchers – that is, those who deny human influence on climate change – to gain ever more space in the media and in public debate.

“I am concerned by the growing space given to denialists in the public debate. The press believes it must always apply the principle of balance, giving equal space and weight to the different positions in the debate,” he said.

According to Feldman, scientists – especially those linked to the IPCC – should take a more proactive stance in countering the “skeptics” in public debate.

Differing positions

For Reynaldo Luiz Victoria, of the coordination of the FAPESP Research Program on Global Climate Change, it is important that the press treat the different positions more equitably.

“There are specific cases in which the press treats issues inequitably – and occasionally sensationally – but I think that we, as researchers, are not obliged to react. The press should seek us out to provide a counterpoint and inform the public,” Victoria told Agência FAPESP.

Victoria nonetheless stressed the importance of also hearing the “skeptics.” “Some are serious scientists and deserve equitable treatment. They certainly cannot be ignored, but when they make contestable claims, the press should look for someone who can offer a counterpoint. Journalists need to seek us out, not the other way around,” he said.

Overall, press coverage of climate change is satisfactory, according to Victoria. “The good newspapers publish accurate articles, and there are very serious journalists producing high-quality material,” he noted.

For Luci Hidalgo Nunes, a professor in the Department of Geography at the State University of Campinas (Unicamp), denialists gain ground because polemical discourse often has more media appeal than the complexity of scientific knowledge.

“A scientist may have a well-founded argument that the public nonetheless finds tedious. Meanwhile, a researcher with poorly structured arguments can deliver a simplified speech – and therefore attractive to the public – and a polemical one, which generates headlines,” she told Agência FAPESP.

Although good science carries an inherent disadvantage in public debate because of its complexity, Nunes believes it is important for the press to remain pluralistic. She published a study analyzing one year of climate change coverage in the newspaper O Estado de S. Paulo; according to Nunes, one of the main strengths observed was giving voice to the different positions.

“I am in favor of the press doing its job and laying out all the parameters so that there can be a democratic debate. I think this is being done well, and the press itself is open to giving us more space. But we need to speak up to create those opportunities,” she said.

Nunes also considers press coverage of climate change to have been satisfactory overall, albeit irregular. “The topic gains prominence at certain moments, but it does not stay permanently on the news agenda,” she said.

According to her, the subject stood out especially in 2007, with the publication of the first IPCC report, and in 2012 during Rio+20.

“In 2007 the coverage was intense, but the popularization of the topic also gave rise to distortions and exaggerations. Sensationalism is bad for science: it puts the topic in the headlines quickly for a while, but in the medium term the effect is the opposite – people notice the exaggerations and come to view scientific results in general with discredit,” she said.

In the Vale do Ribeira, the Public Defender's Office defends traditional communities against corruption and the carbon market (Racismo Ambiental)

By racismoambiental, 24/06/2012 11:45

Tania Pacheco*

“Placed before all these assembled men, all these women, all these children (be fruitful and multiply and fill the earth, so they had been commanded), whose sweat sprang not from the work they did not have, but from the unbearable agony of not having it, God repented of the evils he had done and permitted, to such a point that, in a burst of contrition, he wished to change his name to a more human one. Speaking to the multitude, he announced: ‘From this day on you shall call me Justice.’ And the multitude answered him: ‘Justice we already have, and it does not heed us.’ Said God: ‘In that case, I will take the name of Law.’ And the multitude answered again: ‘Law we already have, and it does not know us.’ And God: ‘In that case, I will keep the name of Charity, which is a pretty name.’ Said the multitude: ‘We do not need charity; what we want is a Justice that is enforced and a Law that respects us.’” José Saramago (preface to Terra, by Sebastião Salgado).

The passage above was taken from a legal filing: a writ of mandamus with a request for an injunction, filed on June 6 by defenders Thiago de Luna Cury and Andrew Toshio Hayama, of the 2nd and 3rd Public Defender's Offices of Registro, São Paulo, respectively, against the mayor of Iporanga, in the Lageado region of the Vale do Ribeira. Its aim: to prevent the municipal authority – following a practice that has become routine in the state – from expelling traditional communities and expropriating vast tracts of land, turning them into nature parks to be traded on the carbon market.

To make money at any cost, it does not matter whether those lands are home to traditional communities, quilombolas and peasants. It does not matter whether the right to free, prior and informed consultation established by ILO Convention 169 was respected. It does not even matter whether, had public hearings been held, the communities would have been able to fully understand what was being proposed and to decide whether it was in their interest to abandon their territories, their traditions and their people, since this type of strict conservation unit cannot have residents. In partnership with sham companies and NGOs, the scheme is assembled; decreed at the stroke of a pen; and the profit is guaranteed and divided among the members of the gangs.

But that is not quite how it went in Iporanga. The Public Defender's Office acted – and acted for Justice and for the Law – in a manner indignant, cultured, strong, poetic and, always, very well grounded in the law. And it fell to Judge Raphael Garcia Pinto, of Eldorado, São Paulo, to recognize this in a decision handed down on June 11, 2012.

This blog is an uncompromising advocate of the “democratization of the justice system.” Both the writ and the decision are examples of exactly that: the practice of democracy by legal professionals. That is why we make a point of sharing them, not only as a tribute to defenders Thiago de Luna Cury and Andrew Toshio Hayama (and also to Judge Raphael Garcia Pinto), but also as an example to be followed throughout Brazil, as a way of defending the communities and honoring us all.

To see the writ of mandamus, click HERE. To see the decision, click HERE. Enjoy the reading.

* With information sent by Luciana Zaffalon.

Should Doctors Treat Lack of Exercise as a Medical Condition? Expert Says ‘Yes’ (Science Daily)

ScienceDaily (Aug. 13, 2012) — A sedentary lifestyle is a common cause of obesity, and excessive body weight and fat in turn are considered catalysts for diabetes, high blood pressure, joint damage and other serious health problems. But what if lack of exercise itself were treated as a medical condition? Mayo Clinic physiologist Michael Joyner, M.D., argues that it should be. His commentary is published this month in The Journal of Physiology.

Physical inactivity affects the health not only of many obese patients, but also people of normal weight, such as workers with desk jobs, patients immobilized for long periods after injuries or surgery, and women on extended bed rest during pregnancies, among others, Dr. Joyner says. Prolonged lack of exercise can cause the body to become deconditioned, with wide-ranging structural and metabolic changes: the heart rate may rise excessively during physical activity, bones and muscles atrophy, physical endurance wane, and blood volume decline.

When deconditioned people try to exercise, they may tire quickly and experience dizziness or other discomfort, then give up trying to exercise and find the problem gets worse rather than better.

“I would argue that physical inactivity is the root cause of many of the common problems that we have,” Dr. Joyner says. “If we were to medicalize it, we could then develop a way, just like we’ve done for addiction, cigarettes and other things, to give people treatments, and lifelong treatments, that focus on behavioral modifications and physical activity. And then we can take public health measures, like we did for smoking, drunken driving and other things, to limit physical inactivity and promote physical activity.”

Several chronic medical conditions are associated with poor capacity to exercise, including fibromyalgia, chronic fatigue syndrome and postural orthostatic tachycardia syndrome, better known as POTS, a syndrome marked by an excessive heart rate and flu-like symptoms upon standing or at a given level of exercise. Too often, medication rather than progressive exercise is prescribed, Dr. Joyner says.

Texas Health Presbyterian Hospital Dallas and University of Texas Southwestern Medical Center researchers found that three months of exercise training can reverse or improve many POTS symptoms, Dr. Joyner notes. That study offers hope for such patients and shows that physicians should consider prescribing carefully monitored exercise before medication, he says.

If physical inactivity were treated as a medical condition itself rather than simply a cause or byproduct of other medical conditions, physicians may become more aware of the value of prescribing supported exercise, and more formal rehabilitation programs that include cognitive and behavioral therapy would develop, Dr. Joyner says.

For those who have been sedentary and are trying to get into exercise, Dr. Joyner advises doing it slowly and progressively.

“You just don’t jump right back into it and try to train for a marathon,” he says. “Start off with achievable goals and do it in small bites.”

There’s no need to join a gym or get a personal trainer: build as much activity as possible into daily life. Even walking just 10 minutes three times a day can go a long way toward working up to the 150 minutes a week of moderate physical activity the typical adult needs, Dr. Joyner says.

Global Warming’s Terrifying New Math (Rolling Stone)

Three simple numbers that add up to global catastrophe – and that make clear who the real enemy is

by: Bill McKibben

Illustration by Edel Rodriguez

If the pictures of those towering wildfires in Colorado haven’t convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average, the odds of which occurring by simple chance were 3.7 x 10^-99, a number considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the “largest temperature departure from average of any season on record.” The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet’s history.

Not that our leaders seemed to notice. Last month the world’s nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn’t even attend. It was “a ghost of the glad, confident meeting 20 years ago,” the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls “once thronged by multitudes.” Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I’ve spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we’re losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world’s nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the “most important gathering since the Second World War, given what is at stake.” As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: “This is our chance. If we miss it, it could take years before we get a new and better one. If ever.”

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving “Copenhagen Accord” that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. “Copenhagen is a crime scene tonight,” an angry Greenpeace official declared, “with the guilty men and women fleeing to the airport.” Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized “the scientific view that the increase in global temperature should be below two degrees Celsius.” And in the very next paragraph, it declared that “we agree that deep cuts in global emissions are required… so as to hold the increase in global temperature below two degrees Celsius.” By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8 and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we’ve raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. “Any number much above one degree involves a gamble,” writes Kerry Emanuel of MIT, a leading authority on hurricanes, “and the odds become less and less favorable as the temperature goes up.” Thomas Lovejoy, once the World Bank’s chief biodiversity adviser, puts it like this: “If we’re seeing what we’re seeing today at 0.8 degrees Celsius, two degrees is simply too much.” NASA scientist James Hansen, the planet’s most prominent climatologist, is even blunter: “The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster.” At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: “Some countries will flat-out disappear.” When delegates from developing nations were warned that two degrees would represent a “suicide pact” for drought-stricken Africa, many of them started chanting, “One degree, one Africa.”

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it’s fair to say that it’s the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world’s carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can’t raise the temperature more than two degrees Celsius – it’s become the bottomest of bottom lines. Two degrees.

The Second Number: 565 Gigatons

Scientists estimate that humans can pour roughly 565 more gigatons of carbon dioxide into the atmosphere by midcentury and still have some reasonable hope of staying below two degrees. (“Reasonable,” in this case, means four chances in five, or somewhat worse odds than playing Russian roulette with a six-shooter.)

This idea of a global “carbon budget” emerged about a decade ago, as scientists began to calculate how much oil, coal and gas could still safely be burned. Since we’ve increased the Earth’s temperature by 0.8 degrees so far, we’re currently less than halfway to the target. But, in fact, computer models calculate that even if we stopped increasing CO2 now, the temperature would likely still rise another 0.8 degrees, as previously released carbon continues to overheat the atmosphere. That means we’re already four-fifths of the way to the two-degree target.

How good are these numbers? No one is insisting that they’re exact, but few dispute that they’re generally right. The 565-gigaton figure was derived from one of the most sophisticated computer-simulation models that have been built by climate scientists around the world over the past few decades. And the number is being further confirmed by the latest climate-simulation models currently being finalized in advance of the next report by the Intergovernmental Panel on Climate Change. “Looking at them as they come in, they hardly differ at all,” says Tom Wigley, an Australian climatologist at the National Center for Atmospheric Research. “There’s maybe 40 models in the data set now, compared with 20 before. But so far the numbers are pretty much the same. We’re just fine-tuning things. I don’t think much has changed over the last decade.” William Collins, a senior climate scientist at the Lawrence Berkeley National Laboratory, agrees. “I think the results of this round of simulations will be quite similar,” he says. “We’re not getting any free lunch from additional understanding of the climate system.”

We’re not getting any free lunch from the world’s economies, either. With only a single year’s lull in 2009 at the height of the financial crisis, we’ve continued to pour record amounts of carbon into the atmosphere, year after year. In late May, the International Energy Agency published its latest figures – CO2 emissions last year rose to 31.6 gigatons, up 3.2 percent from the year before. America had a warm winter and converted more coal-fired power plants to natural gas, so its emissions fell slightly; China kept booming, so its carbon output (which recently surpassed the U.S.) rose 9.3 percent; the Japanese shut down their fleet of nukes post-Fukushima, so their emissions edged up 2.4 percent. “There have been efforts to use more renewable energy and improve energy efficiency,” said Corinne Le Quéré, who runs England’s Tyndall Centre for Climate Change Research. “But what this shows is that so far the effects have been marginal.” In fact, study after study predicts that carbon emissions will keep growing by roughly three percent a year – and at that rate, we’ll blow through our 565-gigaton allowance in 16 years, around the time today’s preschoolers will be graduating from high school. “The new data provide further evidence that the door to a two-degree trajectory is about to close,” said Fatih Birol, the IEA’s chief economist. In fact, he continued, “When I look at this data, the trend is perfectly in line with a temperature increase of about six degrees.” That’s almost 11 degrees Fahrenheit, which would create a planet straight out of science fiction.
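The 16-year figure follows from simple compounding. A back-of-the-envelope sketch (using, as the article does, 2011 emissions of 31.6 gigatons growing three percent a year; this is an illustration, not the IEA's actual model):

```python
def years_to_exhaust(budget_gt, annual_emissions_gt, growth_rate):
    """Full years until cumulative CO2 emissions exceed the carbon budget,
    with emissions growing by a fixed fraction each year."""
    cumulative, emissions, years = 0.0, annual_emissions_gt, 0
    while cumulative < budget_gt:
        cumulative += emissions          # burn this year's emissions
        emissions *= 1.0 + growth_rate   # next year grows by the fixed rate
        years += 1
    return years

# 565 Gt budget, starting from 31.6 Gt/year and growing 3 percent annually
print(years_to_exhaust(565, 31.6, 0.03))  # -> 15, in line with the article's "16 years"
```

Even at zero growth the same budget would last only about 18 years (565 / 31.6), so the compounding shaves off just a couple of years; the striking thing is how short both horizons are.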

So, new data in hand, everyone at the Rio conference renewed their ritual calls for serious international action to move us back to a two-degree trajectory. The charade will continue in November, when the next Conference of the Parties (COP) of the U.N. Framework Convention on Climate Change convenes in Qatar. This will be COP 18 – COP 1 was held in Berlin in 1995, and since then the process has accomplished essentially nothing. Even scientists, who are notoriously reluctant to speak out, are slowly overcoming their natural preference to simply provide data. “The message has been consistent for close to 30 years now,” Collins says with a wry laugh, “and we have the instrumentation and the computer power required to present the evidence in detail. If we choose to continue on our present course of action, it should be done with a full evaluation of the evidence the scientific community has presented.” He pauses, suddenly conscious of being on the record. “I should say, a fuller evaluation of the evidence.”

So far, though, such calls have had little effect. We’re in the same position we’ve been in for a quarter-century: scientific warning followed by political inaction. Among scientists speaking off the record, disgusted candor is the rule. One senior scientist told me, “You know those new cigarette packs, where governments make them put a picture of someone with a hole in their throats? Gas pumps should have something like that.”

The Third Number: 2,795 Gigatons

This number is the scariest of all – one that, for the first time, meshes the political and scientific dimensions of our dilemma. It was highlighted last summer by the Carbon Tracker Initiative, a team of London financial analysts and environmentalists who published a report in an effort to educate investors about the possible risks that climate change poses to their stock portfolios. The number describes the amount of carbon already contained in the proven coal and oil and gas reserves of the fossil-fuel companies, and the countries (think Venezuela or Kuwait) that act like fossil-fuel companies. In short, it’s the fossil fuel we’re currently planning to burn. And the key point is that this new number – 2,795 – is higher than 565. Five times higher.

The Carbon Tracker Initiative – led by James Leaton, an environmentalist who served as an adviser at the accounting giant PricewaterhouseCoopers – combed through proprietary databases to figure out how much oil, gas and coal the world’s major energy companies hold in reserve. The numbers aren’t perfect – they don’t fully reflect the recent surge in unconventional energy sources like shale gas, and they don’t accurately reflect coal reserves, which are subject to less stringent reporting requirements than oil and gas. But for the biggest companies, the figures are quite exact: If you burned everything in the inventories of Russia’s Lukoil and America’s ExxonMobil, for instance, which lead the list of oil and gas companies, each would release more than 40 gigatons of carbon dioxide into the atmosphere.

Which is exactly why this new number, 2,795 gigatons, is such a big deal. Think of two degrees Celsius as the legal drinking limit – equivalent to the 0.08 blood-alcohol level below which you might get away with driving home. The 565 gigatons is how many drinks you could have and still stay below that limit – the six beers, say, you might consume in an evening. And the 2,795 gigatons? That’s the three 12-packs the fossil-fuel industry has on the table, already opened and ready to pour.

We have five times as much oil and coal and gas on the books as climate scientists think is safe to burn. To avoid catastrophic warming, we’d have to keep 80 percent of those reserves locked away underground. Before we knew these numbers, that fate was merely likely. Now, barring some massive intervention, it seems certain.

Yes, this coal and gas and oil is still technically in the soil. But it’s already economically aboveground – it’s figured into share prices, companies are borrowing money against it, nations are basing their budgets on the presumed returns from their patrimony. It explains why the big fossil-fuel companies have fought so hard to prevent the regulation of carbon dioxide – those reserves are their primary asset, the holding that gives their companies their value. It’s why they’ve worked so hard these past years to figure out how to unlock the oil in Canada’s tar sands, or how to drill miles beneath the sea, or how to frack the Appalachians.

If you told Exxon or Lukoil that, in order to avoid wrecking the climate, they couldn’t pump out their reserves, the value of their companies would plummet. John Fullerton, a former managing director at JP Morgan who now runs the Capital Institute, calculates that at today’s market value, those 2,795 gigatons of carbon emissions are worth about $27 trillion. Which is to say, if you paid attention to the scientists and kept 80 percent of it underground, you’d be writing off $20 trillion in assets. The numbers aren’t exact, of course, but that carbon bubble makes the housing bubble look small by comparison. It won’t necessarily burst – we might well burn all that carbon, in which case investors will do fine. But if we do, the planet will crater. You can have a healthy fossil-fuel balance sheet, or a relatively healthy planet – but now that we know the numbers, it looks like you can’t have both. Do the math: 2,795 is five times 565. That’s how the story ends.
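Fullerton's figures can be reproduced with a line of arithmetic apiece, treating the reserves' market value as roughly proportional to their carbon content, which the article's round numbers implicitly do:

```python
# The "carbon bubble" arithmetic, step by step.
reserves = 2795.0   # gigatons of CO2 in proven fossil-fuel reserves
budget = 565.0      # gigatons burnable while staying under two degrees
value = 27e12       # market value of those reserves: about $27 trillion

stranded = 1 - budget / reserves    # share that must stay underground
write_off = stranded * value        # assets written off if it does

print(round(stranded * 100))        # → 80 (percent kept in the ground)
print(round(write_off / 1e12, 1))   # → 21.5 (trillions of dollars)
```

The computed write-off lands a shade above the article's rounded $20 trillion, because 565 is slightly more than 20 percent of 2,795; the scale of the bubble is the same either way.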

So far, as I said at the start, environmental efforts to tackle global warming have failed. The planet’s emissions of carbon dioxide continue to soar, especially as developing countries emulate (and supplant) the industries of the West. Even in rich countries, small reductions in emissions offer no sign of the real break with the status quo we’d need to upend the iron logic of these three numbers. Germany is one of the only big countries that has actually tried hard to change its energy mix; on one sunny Saturday in late May, that northern-latitude nation generated nearly half its power from solar panels within its borders. That’s a small miracle – and it demonstrates that we have the technology to solve our problems. But we lack the will. So far, Germany’s the exception; the rule is ever more carbon.

This record of failure means we know a lot about what strategies don’t work. Green groups, for instance, have spent a lot of time trying to change individual lifestyles: the iconic twisty light bulb has been installed by the millions, but so have a new generation of energy-sucking flatscreen TVs. Most of us are fundamentally ambivalent about going green: We like cheap flights to warm places, and we’re certainly not going to give them up if everyone else is still taking them. Since all of us are in some way the beneficiaries of cheap fossil fuel, tackling climate change has been like trying to build a movement against yourself – it’s as if the gay-rights movement had to be constructed entirely from evangelical preachers, or the abolition movement from slaveholders.

People perceive – correctly – that their individual actions will not make a decisive difference in the atmospheric concentration of CO2; by 2010, a poll found that “while recycling is widespread in America and 73 percent of those polled are paying bills online in order to save paper,” only four percent had reduced their utility use and only three percent had purchased hybrid cars. Given a hundred years, you could conceivably change lifestyles enough to matter – but time is precisely what we lack.

A more efficient method, of course, would be to work through the political system, and environmentalists have tried that, too, with the same limited success. They’ve patiently lobbied leaders, trying to convince them of our peril and assuming that politicians would heed the warnings. Sometimes it has seemed to work. Barack Obama, for instance, campaigned more aggressively about climate change than any president before him – the night he won the nomination, he told supporters that his election would mark the moment “the rise of the oceans began to slow and the planet began to heal.” And he has achieved one significant change: a steady increase in the fuel efficiency mandated for automobiles. It’s the kind of measure, adopted a quarter-century ago, that would have helped enormously. But in light of the numbers I’ve just described, it’s obviously a very small start indeed.

At this point, effective action would require actually keeping most of the carbon the fossil-fuel industry wants to burn safely in the soil, not just changing slightly the speed at which it’s burned. And there the president, apparently haunted by the still-echoing cry of “Drill, baby, drill,” has gone out of his way to frack and mine. His secretary of interior, for instance, opened up a huge swath of the Powder River Basin in Wyoming for coal extraction: The total basin contains some 67.5 gigatons worth of carbon (or more than 10 percent of the available atmospheric space). He’s doing the same thing with Arctic and offshore drilling; in fact, as he explained on the stump in March, “You have my word that we will keep drilling everywhere we can… That’s a commitment that I make.” The next day, in a yard full of oil pipe in Cushing, Oklahoma, the president promised to work on wind and solar energy but, at the same time, to speed up fossil-fuel development: “Producing more oil and gas here at home has been, and will continue to be, a critical part of an all-of-the-above energy strategy.” That is, he’s committed to finding even more stock to add to the 2,795-gigaton inventory of unburned carbon.

Sometimes the irony is almost Borat-scale obvious: In early June, Secretary of State Hillary Clinton traveled on a Norwegian research trawler to see firsthand the growing damage from climate change. “Many of the predictions about warming in the Arctic are being surpassed by the actual data,” she said, describing the sight as “sobering.” But the discussions she traveled to Scandinavia to have with other foreign ministers were mostly about how to make sure Western nations get their share of the estimated $9 trillion in oil (that’s more than 90 billion barrels, or 37 gigatons of carbon) that will become accessible as the Arctic ice melts. Last month, the Obama administration indicated that it would give Shell permission to start drilling in sections of the Arctic.

Almost every government with deposits of hydrocarbons straddles the same divide. Canada, for instance, is a liberal democracy renowned for its internationalism – no wonder, then, that it signed on to the Kyoto treaty, promising to cut its carbon emissions substantially by 2012. But the rising price of oil suddenly made the tar sands of Alberta economically attractive – and since, as NASA climatologist James Hansen pointed out in May, they contain as much as 240 gigatons of carbon (or almost half of the available space if we take the 565 limit seriously), that meant Canada’s commitment to Kyoto was nonsense. In December, the Canadian government withdrew from the treaty before it faced fines for failing to meet its commitments.

The same kind of hypocrisy applies across the ideological board: In his speech to the Copenhagen conference, Venezuela’s Hugo Chavez quoted Rosa Luxemburg, Jean-Jacques Rousseau and “Christ the Redeemer,” insisting that “climate change is undoubtedly the most devastating environmental problem of this century.” But the next spring, in the Simon Bolivar Hall of the state-run oil company, he signed an agreement with a consortium of international players to develop the vast Orinoco tar sands as “the most significant engine for a comprehensive development of the entire territory and Venezuelan population.” The Orinoco deposits are larger than Alberta’s – taken together, they’d fill up the whole available atmospheric space.

So: the paths we have tried to tackle global warming have so far produced only gradual, halting shifts. A rapid, transformative change would require building a movement, and movements require enemies. As John F. Kennedy put it, “The civil rights movement should thank God for Bull Connor. He’s helped it as much as Abraham Lincoln.” And enemies are what climate change has lacked.

But what all these climate numbers make painfully, usefully clear is that the planet does indeed have an enemy – one far more committed to action than governments or individuals. Given this hard math, we need to view the fossil-fuel industry in a new light. It has become a rogue industry, reckless like no other force on Earth. It is Public Enemy Number One to the survival of our planetary civilization. “Lots of companies do rotten things in the course of their business – pay terrible wages, make people work in sweatshops – and we pressure them to change those practices,” says veteran anti-corporate leader Naomi Klein, who is at work on a book about the climate crisis. “But these numbers make clear that with the fossil-fuel industry, wrecking the planet is their business model. It’s what they do.”

According to the Carbon Tracker report, if Exxon burns its current reserves, it would use up more than seven percent of the available atmospheric space between us and the risk of two degrees. BP is just behind, followed by the Russian firm Gazprom, then Chevron, ConocoPhillips and Shell, each of which would fill between three and four percent. Taken together, just these six firms, of the 200 listed in the Carbon Tracker report, would use up more than a quarter of the remaining two-degree budget. Severstal, the Russian mining giant, leads the list of coal companies, followed by firms like BHP Billiton and Peabody. The numbers are simply staggering – this industry, and this industry alone, holds the power to change the physics and chemistry of our planet, and they’re planning to use it.

They’re clearly cognizant of global warming – they employ some of the world’s best scientists, after all, and they’re bidding on all those oil leases made possible by the staggering melt of Arctic ice. And yet they relentlessly search for more hydrocarbons – in early March, Exxon CEO Rex Tillerson told Wall Street analysts that the company plans to spend $37 billion a year through 2016 (about $100 million a day) searching for yet more oil and gas.

There’s not a more reckless man on the planet than Tillerson. Late last month, on the same day the Colorado fires reached their height, he told a New York audience that global warming is real, but dismissed it as an “engineering problem” that has “engineering solutions.” Such as? “Changes to weather patterns that move crop-production areas around – we’ll adapt to that.” This in a week when Kentucky farmers were reporting that corn kernels were “aborting” in record heat, threatening a spike in global food prices. “The fear factor that people want to throw out there to say, ‘We just have to stop this,’ I do not accept,” Tillerson said. Of course not – if he did accept it, he’d have to keep his reserves in the ground. Which would cost him money. It’s not an engineering problem, in other words – it’s a greed problem.

You could argue that this is simply in the nature of these companies – that having found a profitable vein, they’re compelled to keep mining it, more like efficient automatons than people with free will. But as the Supreme Court has made clear, they are people of a sort. In fact, thanks to the size of its bankroll, the fossil-fuel industry has far more free will than the rest of us. These companies don’t simply exist in a world whose hungers they fulfill – they help create the boundaries of that world.

Left to our own devices, citizens might decide to regulate carbon and stop short of the brink; according to a recent poll, nearly two-thirds of Americans would back an international agreement that cut carbon emissions 90 percent by 2050. But we aren’t left to our own devices. The Koch brothers, for instance, have a combined wealth of $50 billion, meaning they trail only Bill Gates on the list of richest Americans. They’ve made most of their money in hydrocarbons, they know any system to regulate carbon would cut those profits, and they reportedly plan to lavish as much as $200 million on this year’s elections. In 2009, for the first time, the U.S. Chamber of Commerce surpassed both the Republican and Democratic National Committees on political spending; the following year, more than 90 percent of the Chamber’s cash went to GOP candidates, many of whom deny the existence of global warming. Not long ago, the Chamber even filed a brief with the EPA urging the agency not to regulate carbon – should the world’s scientists turn out to be right and the planet heats up, the Chamber advised, “populations can acclimatize to warmer climates via a range of behavioral, physiological and technological adaptations.” As radical goes, demanding that we change our physiology seems right up there.

Environmentalists, understandably, have been loath to make the fossil-fuel industry their enemy, respecting its political power and hoping instead to convince these giants that they should turn away from coal, oil and gas and transform themselves more broadly into “energy companies.” Sometimes that strategy appeared to be working – emphasis on appeared. Around the turn of the century, for instance, BP made a brief attempt to restyle itself as “Beyond Petroleum,” adopting a logo that looked like the sun and sticking solar panels on some of its gas stations. But its investments in alternative energy were never more than a tiny fraction of its budget for hydrocarbon exploration, and after a few years, many of those were wound down as new CEOs insisted on returning to the company’s “core business.” In December, BP finally closed its solar division. Shell shut down its solar and wind efforts in 2009. The five biggest oil companies have made more than $1 trillion in profits since the millennium – there’s simply too much money to be made on oil and gas and coal to go chasing after zephyrs and sunbeams.

Much of that profit stems from a single historical accident: Alone among businesses, the fossil-fuel industry is allowed to dump its main waste, carbon dioxide, for free. Nobody else gets that break – if you own a restaurant, you have to pay someone to cart away your trash, since piling it in the street would breed rats. But the fossil-fuel industry is different, and for sound historical reasons: Until a quarter-century ago, almost no one knew that CO2 was dangerous. But now that we understand that carbon is heating the planet and acidifying the oceans, its price becomes the central issue.

If you put a price on carbon, through a direct tax or other methods, it would enlist markets in the fight against global warming. Once Exxon has to pay for the damage its carbon is doing to the atmosphere, the price of its products would rise. Consumers would get a strong signal to use less fossil fuel – every time they stopped at the pump, they’d be reminded that you don’t need a semimilitary vehicle to go to the grocery store. The economic playing field would now be a level one for nonpolluting energy sources. And you could do it all without bankrupting citizens – a so-called “fee-and-dividend” scheme would put a hefty tax on coal and gas and oil, then simply divide up the proceeds, sending everyone in the country a check each month for their share of the added costs of carbon. By switching to cleaner energy sources, most people would actually come out ahead.
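The redistributive logic of fee-and-dividend is easy to see with toy numbers (everything below is made up for illustration, not a real proposal): because fossil-fuel use is skewed, an equal per-household dividend leaves everyone with below-average consumption ahead.

```python
# Toy fee-and-dividend scheme: a fee per ton of CO2, proceeds split equally.
fee = 50.0                          # dollars per ton of CO2 (illustrative)
emissions = [30, 20, 15, 10, 5]     # tons per year for five households

proceeds = fee * sum(emissions)     # total fee collected
dividend = proceeds / len(emissions)  # equal share returned per household

for tons in emissions:
    net = dividend - fee * tons     # positive means "comes out ahead"
    print(tons, net)                # heavy users pay; light users profit
```

Here the average is 16 tons, so the three households below it come out ahead while the two heavy users pay in, which is exactly the price signal the scheme is meant to send.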

There’s only one problem: Putting a price on carbon would reduce the profitability of the fossil-fuel industry. After all, the answer to the question “How high should the price of carbon be?” is “High enough to keep those carbon reserves that would take us past two degrees safely in the ground.” The higher the price on carbon, the more of those reserves would be worthless. The fight, in the end, is about whether the industry will manage to keep its special pollution break alive past the point of climate catastrophe, or whether, in the economists’ parlance, we’ll make them internalize those externalities.

It’s not clear, of course, that the power of the fossil-fuel industry can be broken. The U.K. analysts who wrote the Carbon Tracker report and drew attention to these numbers had a relatively modest goal – they simply wanted to remind investors that climate change poses a very real risk to the stock prices of energy companies. Say something so big finally happens (a giant hurricane swamps Manhattan, a megadrought wipes out Midwest agriculture) that even the political power of the industry is inadequate to restrain legislators, who manage to regulate carbon. Suddenly those Chevron reserves would be a lot less valuable, and the stock would tank. Given that risk, the Carbon Tracker report warned investors to lessen their exposure and hedge it with some big plays in alternative energy.

“The regular process of economic evolution is that businesses are left with stranded assets all the time,” says Nick Robins, who runs HSBC’s Climate Change Centre. “Think of film cameras, or typewriters. The question is not whether this will happen. It will. Pension systems have been hit by the dot-com bust and the credit crunch. They’ll be hit by this.” Still, it hasn’t been easy to convince investors, who have shared in the oil industry’s record profits. “The reason you get bubbles,” sighs Leaton, “is that everyone thinks they’re the best analyst – that they’ll go to the edge of the cliff and then jump back when everyone else goes over.”

So pure self-interest probably won’t spark a transformative challenge to fossil fuel. But moral outrage just might – and that’s the real meaning of this new math. It could, plausibly, give rise to a real movement.

Once, in recent corporate history, anger forced an industry to make basic changes. That was the campaign in the 1980s demanding divestment from companies doing business in South Africa. It rose first on college campuses and then spread to municipal and state governments; 155 campuses eventually divested, and by the end of the decade, more than 80 cities, 25 states and 19 counties had taken some form of binding economic action against companies connected to the apartheid regime. “The end of apartheid stands as one of the crowning accomplishments of the past century,” as Archbishop Desmond Tutu put it, “but we would not have succeeded without the help of international pressure,” especially from “the divestment movement of the 1980s.”

The fossil-fuel industry is obviously a tougher opponent, and even if you could force the hand of particular companies, you’d still have to figure out a strategy for dealing with all the sovereign nations that, in effect, act as fossil-fuel companies. But the link for college students is even more obvious in this case. If their college’s endowment portfolio has fossil-fuel stock, then their educations are being subsidized by investments that guarantee they won’t have much of a planet on which to make use of their degree. (The same logic applies to the world’s largest investors, pension funds, which are also theoretically interested in the future – that’s when their members will “enjoy their retirement.”) “Given the severity of the climate crisis, a comparable demand that our institutions dump stock from companies that are destroying the planet would not only be appropriate but effective,” says Bob Massie, a former anti-apartheid activist who helped found the Investor Network on Climate Risk. “The message is simple: We have had enough. We must sever the ties with those who profit from climate change – now.”

Movements rarely have predictable outcomes. But any campaign that weakens the fossil-fuel industry’s political standing clearly increases the chances of retiring its special breaks. Consider President Obama’s signal achievement in the climate fight, the large increase he won in mileage requirements for cars. Scientists, environmentalists and engineers had advocated such policies for decades, but until Detroit came under severe financial pressure, it was politically powerful enough to fend them off. If people come to understand the cold, mathematical truth – that the fossil-fuel industry is systematically undermining the planet’s physical systems – it might weaken it enough to matter politically. Exxon and their ilk might drop their opposition to a fee-and-dividend solution; they might even decide to become true energy companies, this time for real.

Even if such a campaign is possible, however, we may have waited too long to start it. To make a real difference – to keep us under a temperature increase of two degrees – you’d need to change carbon pricing in Washington, and then use that victory to leverage similar shifts around the world. At this point, what happens in the U.S. is most important for how it will influence China and India, where emissions are growing fastest. (In early June, researchers concluded that China has probably under-reported its emissions by up to 20 percent.) The three numbers I’ve described are daunting – they may define an essentially impossible future. But at least they provide intellectual clarity about the greatest challenge humans have ever faced. We know how much we can burn, and we know who’s planning to burn more. Climate change operates on a geological scale and time frame, but it’s not an impersonal force of nature; the more carefully you do the math, the more thoroughly you realize that this is, at bottom, a moral issue; we have met the enemy and they is Shell.

Meanwhile the tide of numbers continues. The week after the Rio conference limped to its conclusion, Arctic sea ice hit the lowest level ever recorded for that date. Last month, on a single weekend, Tropical Storm Debby dumped more than 20 inches of rain on Florida – the earliest the season’s fourth-named cyclone has ever arrived. At the same time, the largest fire in New Mexico history burned on, and the most destructive fire in Colorado’s annals claimed 346 homes in Colorado Springs – breaking a record set the week before in Fort Collins. This month, scientists issued a new study concluding that global warming has dramatically increased the likelihood of severe heat and drought – days after a heat wave across the Plains and Midwest broke records that had stood since the Dust Bowl, threatening this year’s harvest. You want a big number? In the course of this month, a quadrillion kernels of corn need to pollinate across the grain belt, something they can’t do if temperatures remain off the charts. Just like us, our crops are adapted to the Holocene, the 11,000-year period of climatic stability we’re now leaving… in the dust.

This story is from the August 2nd, 2012 issue of Rolling Stone.

Lost Letter Experiment Suggests Wealthy London Neighborhoods Are ‘More Altruistic’ (Science Daily)

ScienceDaily (Aug. 15, 2012) — Neighbourhood income deprivation has a strong negative effect on altruistic behaviour when measured by a ‘lost letter’ experiment, according to new UCL research published August 15 in PLoS One.

Researchers from UCL Anthropology used the lost letter technique to measure altruism across 20 London neighbourhoods by dropping 300 letters on the pavement and recording whether they arrived at their destination. The stamped letters were addressed by hand to a study author’s home address with a gender neutral name, and were dropped face-up and during rain free weekdays.

The results show a strong negative effect of neighbourhood income deprivation on altruistic behaviour: on average, 87% of the letters dropped in the wealthier neighbourhoods were returned, compared with only 37% in the poorer neighbourhoods.

Co-author Jo Holland said: “This is the first large scale study investigating cooperation in an urban environment using the lost letter technique. This technique, first used in the 1960s by the American social psychologist Stanley Milgram, remains one of the best ways of measuring truly altruistic behaviour, as returning the letter doesn’t benefit that person and actually incurs the small hassle of taking the letter to a post box.”

Co-author Professor Ruth Mace added: “Our study attempts to understand how the socio-economic characteristics of a neighbourhood affect the likelihood of people in a neighbourhood acting altruistically towards a stranger. The results show a clear trend, with letters dropped in the poorest neighbourhoods having 91% lower odds of being returned than letters dropped in the wealthiest neighbourhoods. This suggests that those living in poor neighbourhoods are less inclined to behave altruistically toward their neighbours.”
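The two figures quoted, an 87% versus 37% return rate and "91% lower odds," are consistent, but only once you remember that odds are p/(1 − p), not the return rates themselves. A quick check:

```python
# Checking that "91% lower odds" squares with the raw return rates.
p_wealthy = 0.87            # average return rate, wealthier neighbourhoods
p_poor = 0.37               # average return rate, poorer neighbourhoods

odds_wealthy = p_wealthy / (1 - p_wealthy)   # about 6.7
odds_poor = p_poor / (1 - p_poor)            # about 0.59
odds_ratio = odds_poor / odds_wealthy

print(round(1 - odds_ratio, 2))  # → 0.91, i.e. 91% lower odds
```

(The paper's 91% comes from a model fitted across all 20 neighbourhoods, so the exact match with these aggregate rates is partly a matter of rounding, but it shows the two numbers describe the same gap, not two different ones.)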

As well as measuring the number of letters returned, the researchers also looked at how other neighbourhood characteristics may help to explain the variation in altruistic behaviour — including ethnic composition and population density — but did not find them to be good predictors of lost letter return.

Corresponding author Antonio Silva said: “The fact that ethnic composition does not play a role on the likelihood of a letter being returned is particularly interesting, as other studies have suggested that ethnic mixing negatively affects social cohesion, but in our sampled London neighbourhoods this does not appear to be true.

“The level of altruism observed in a population is likely to vary according to its context. Our hypothesis that area level socio-economic characteristics could determine the levels of altruism found in individuals living in an area is confirmed by our results. Our overall findings replicate and expand on previous studies which use similar methodology.

“We show in this study that individuals living in poor neighbourhoods are less altruistic than individuals in wealthier neighbourhoods. However, the effect of income deprivation may be confounded by crime, as the poorer neighbourhoods tend to have higher rates of crime, which may lead to people in those neighbourhoods being generally more suspicious and therefore less likely to pick up a lost letter.

“Further research should focus on attempting to disentangle these two factors, possibly by comparing equally deprived neighbourhoods with different levels of crime. Although this study uses only one measure of altruism and therefore we should be careful in interpreting these findings, it does give us an interesting perspective on altruism in an urban context and provides a sound experimental model on which to base future studies.”

Computer program mimics human evolution (Fapesp)

Software developed at USP in São Carlos creates and selects programs that generate Decision Trees, tools capable of making predictions. The research won awards in the United States at the world’s largest evolutionary computation event (Wikimedia)

16/08/2012

By Karina Toledo

Agência FAPESP – Decision Trees are computational tools that give machines the ability to make predictions based on the analysis of historical data. The technique can, for example, support medical diagnosis or the risk analysis of financial investments.

But the best prediction requires the best Decision Tree-generating program. To get there, researchers at the Instituto de Ciências Matemáticas e de Computação (ICMC) of the Universidade de São Paulo (USP), in São Carlos, took their inspiration from Charles Darwin’s theory of evolution.

“We developed an evolutionary algorithm, that is, an algorithm that mimics the process of human evolution to generate solutions,” said Rodrigo Coelho Barros, a doctoral candidate at ICMC’s Bioinspired Computing Laboratory (BioCom) and a FAPESP fellow.

Evolutionary computation, Barros explained, is one of several bio-inspired techniques, meaning those that look to nature for solutions to computational problems. “It is remarkable how nature finds solutions to extremely complicated problems. There is no doubt that we need to learn from it,” Barros said.

Segundo Barros, o software desenvolvido em seu doutorado é capaz de criar automaticamente programas geradores de Árvores de Decisão. Para isso, faz cruzamentos aleatórios entre os códigos de programas já existentes gerando “filhos”.

“Esses ‘filhos’ podem eventualmente sofrer mutações e evoluir. Após um tempo, é esperado que os programas de geração de Árvores de Decisão evoluídos sejam cada vez melhores e nosso algoritmo seleciona o melhor de todos”, afirmou Barros.

Mas enquanto o processo de seleção natural na espécie humana leva centenas ou até milhares de anos, na computação dura apenas algumas horas, dependendo do problema a ser resolvido. “Estabelecemos cem gerações como limite do processo evolutivo”, contou Barros.
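
The loop Barros describes – random crossover of existing programs, occasional mutation, selection of the fittest, capped at one hundred generations – can be sketched in miniature. The sketch below is illustrative only: the "genes" are made-up hyperparameters of a tree-induction program, and the fitness function is a stand-in for real validation accuracy; it is not the actual BioCom/ICMC software.

```python
import random

random.seed(7)  # for reproducibility

# Hypothetical "genome": hyperparameters of a decision-tree induction program.
GENE_SPACE = {
    "max_depth": list(range(2, 11)),
    "min_samples_split": list(range(2, 21)),
    "split_criterion": ["gini", "entropy"],
}

def random_individual():
    return {gene: random.choice(opts) for gene, opts in GENE_SPACE.items()}

def crossover(a, b):
    # "Children" randomly mix the code of two existing programs.
    return {gene: random.choice([a[gene], b[gene]]) for gene in GENE_SPACE}

def mutate(ind, rate=0.1):
    # Children may eventually undergo mutations.
    for gene, opts in GENE_SPACE.items():
        if random.random() < rate:
            ind[gene] = random.choice(opts)
    return ind

def fitness(ind):
    # Stand-in for "validation accuracy of the trees this program induces";
    # a real system would train and evaluate decision trees here.
    return -abs(ind["max_depth"] - 6) - abs(ind["min_samples_split"] - 10) / 5.0

def evolve(pop_size=20, generations=100):
    # The article caps the evolutionary process at one hundred generations.
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # selection: the fitter half survives
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
print(best)
```

In a real hyper-heuristic system, evaluating fitness means training each candidate algorithm and scoring its trees on held-out data, which is why the process takes hours rather than seconds.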

Artificial intelligence

In computer science, heuristics is the term for a system's capacity to innovate and develop techniques to achieve a given goal.

The software developed by Barros belongs to the field of hyper-heuristics, a recent topic within evolutionary computation whose goal is the automatic generation of heuristics tailored to a specific application or set of applications.

“It is a preliminary step toward the great goal of artificial intelligence: creating machines capable of developing solutions to problems without being explicitly programmed to do so,” Barros explained.

The work gave rise to the paper “A Hyper-Heuristic Evolutionary Algorithm for Automatically Designing Decision-Tree Algorithms,” which won awards in three categories at the Genetic and Evolutionary Computation Conference (GECCO), the world's largest event in evolutionary computation, held in July in Philadelphia, United States.

Besides Barros, the paper's authors are professors André Carlos Ponce de Leon Ferreira de Carvalho, the research adviser at the ICMC; Márcio Porto Basgalupp, of the Federal University of São Paulo (Unifesp); and Alex Freitas, of the University of Kent, in the United Kingdom, who took on co-advising the work.

The authors were invited to submit the paper to the Evolutionary Computation Journal, published by the Massachusetts Institute of Technology (MIT). “The work will still go through review but, since it was submitted by invitation, it has a good chance of being accepted,” said Barros.

The research, due to be completed only in 2013, also produced a paper published by invitation in the Journal of the Brazilian Computer Society after being chosen as the best work at the 2011 Encontro Nacional de Inteligência Artificial.

Another paper, presented at the 11th International Conference on Intelligent Systems Design and Applications, held in Spain in 2011, earned an invitation for publication in the journal Neurocomputing.

Interest in Arts Predicts Social Responsibility (Science Daily)

ScienceDaily (Aug. 16, 2012) — If you sing, dance, draw, or act — and especially if you watch others do so — you probably have an altruistic streak, according to a study by researchers at the University of Illinois at Chicago.

People with an active interest in the arts contribute more to society than those with little or no such interest, the researchers found. They analyzed arts exposure, defined as attendance at museums and dance, music, opera and theater events; and arts expression, defined as making or performing art.

“Even after controlling for age, race and education, we found that participation in the arts, especially as audience, predicted civic engagement, tolerance and altruism,” said Kelly LeRoux, assistant professor of public administration at UIC and principal investigator on the study.

In contrast to earlier studies, Generation X respondents were found to be more civically engaged than older people.

LeRoux’s data came from the General Social Survey, conducted since 1972 by the National Data Program for the Sciences, known by its original initials, NORC. A national sample of 2,765 randomly selected adults participated.

“We correlated survey responses to arts-related questions to responses on altruistic actions — like donating blood, donating money, giving directions, or doing favors for a neighbor — that place the interests of others over the interests of self,” LeRoux said. “We looked at ‘norms of civility.’ Previous studies have established norms for volunteering and being active in organizations.”

The researchers measured participation in neighborhood associations, church and religious organizations, civic and fraternal organizations, sports groups, charitable organizations, political parties, professional associations and trade unions.

They measured social tolerance by two variables:

  • Gender-orientation tolerance, measured by whether respondents would agree to having gay persons speak in their community or teach in public schools, and whether they would oppose having homosexually themed books in the library.
  • Racial tolerance, measured by responses regarding various racial and ethnic groups, including African-Americans, Hispanics, and Asian Americans. Eighty percent of the study respondents were Caucasian, LeRoux said.

The researchers measured altruistic behavior by whether respondents said they had allowed a stranger to go ahead of them in line, carried a stranger’s belongings, donated blood, given directions to a stranger, lent someone an item of value, returned money to a cashier who had given too much change, or looked after a neighbor’s pets, plants or mail.

“If policymakers are concerned about a decline in community life, the arts shouldn’t be disregarded as a means to promote an active citizenry,” LeRoux said. “Our positive findings could strengthen the case for government support for the arts.”

The study was based on data from 2002, the most recent year in which the General Social Survey covered arts participation. LeRoux plans to repeat the study with results from the 2012 survey, which will include arts data.

Calgary hail storm: Cloud seeding credited for sparing city from worse disaster (The Calgary Herald)

‘The storm was a monster,’ says weather modification company

BY THANDI FLETCHER, CALGARY HERALD AUGUST 14, 2012

Paul Newell captured dramatic images in the Bearspaw area of northwest Calgary just before the start of the hailstorm on Sunday, Aug. 12, 2012. Photograph by: Paul Newell (reader photo)

A ferocious storm that hammered parts of Calgary with hail stones larger than golf balls late Sunday, causing millions of dollars worth of damage, could have been much worse if cloud-seeding planes hadn’t attempted to calm it down.

“The storm was a monster,” said Terry Krauss, project director of the Alberta Severe Weather Management Society, which contracts American-based company Weather Modification Inc. to seed severe weather clouds in Alberta’s skies. The society is funded by a group of insurance companies with a goal of reducing hail damage claims.

Before the storm hit, Krauss said, the company sent all four of its cloud-seeding aircraft into the thick and swirling black clouds. The planes flew for more than 12 hours, shooting silver iodide, a chemical agent that helps limit the size of hail stones, at the top and base of the clouds, until midnight.

But despite the heavy seeding, golf-ball-sized hail stones pelted parts of Calgary late Sunday night, causing widespread damage to cars and homes.

“This one was a beast. It took everything we threw at it and still was able to wreak some havoc,” said Krauss. “I believe if we hadn’t seeded, it would have even been worse.”

Northeast Calgary, where the hail measured between five and six centimetres, was hit hardest by the storm, said Environment Canada meteorologist John Paul Craig. Other parts of the city saw toonie-sized hail from a second storm system, said Craig.

Craig said Sunday’s storm was worse than Calgary’s last major hailstorm, which saw four-centimetre hail stones, in July 2010.

“These hail stones were just a little bit bigger,” he said.

At Royal Oak Audi in the city’s northwest, broken glass from smashed windows littered the lot Monday morning. Of the 85 new and used cars on the lot, general manager Murray Dorren said not a single car was spared from the storm.

“It’s devastating — that’s probably the best word I can come up with,” he said. “It’s unbelievable that Mother Nature can do this much damage in a very short time. I think it probably took a matter of 10 minutes and there’s millions of dollars worth of damage.”

Dorren estimated the damage at about $2 million. Across the lot, the dinged-up vehicles looked like dimpled golf balls from the repetitive pounding of the sizable stones. Some windows and sunroofs were shattered, while others were pierced by the heavy hail.

“They look like bullet holes right through the windscreen,” salesman Nick Berkland said of the damage.

Insurance companies and brokers were inundated with calls all day as customers tried to file claims on their wrecked cars and homes.

Ron Biggs, claims director for Intact Insurance, said it’s too early to tell how many claims the hail event will spur, although he said they received about two to three times their normal call volume on Monday.

Biggs said the level of damage so far appears to be similar to the July 2010 hailstorm, when Intact received about 12,000 hail damage claims.

Chief operating officer Bruce Rabik of Rogers Insurance, which insures several car dealerships in Calgary, said the damage is extensive.

“It’s certainly a bad one,” he said. “We’ve had one dealership, which they estimate 600 damaged cars. A couple other dealerships with 200 damaged cars each.”

Rabik said claims adjusters are overwhelmed with the volume of claims. He urged customers to be patient, as it may take a day or two for insurance workers to make their way to each home.

Shredded leaves, twigs and broken branches blanketed pathways along the Bow and Elbow rivers as city crews worked to clear them, said Calgary parks pathway lead Duane Sutherland.

“This was the worst that I’ve seen,” said Sutherland.

Once daylight broke Monday, Royal Oak resident Satya Mudlair inspected the exterior of his home, which was riddled with damage. “Lots of holes in the siding, window damage to the two bedroom windows, and the roof a little bit,” he said.

The apple tree in his backyard has also lost about half its apples, he said. Fortunately, his car was parked inside the garage and was spared any dents.

Mudlair said his insurance company told him it would take two or three weeks before the damage would be repaired. “There’s a big pile of names ahead of me,” he said.

Mudlair’s wife, Nirmalla, had just fallen asleep when she was awoken by the sound of hail stones hitting the roof.

“It was very bad. It was like, thump, thump,” she said, describing the pelting sound. “We got scared and I kept running from room to room.”

Cloud-seeding expert Krauss said Calgary has experienced more severe weather than usual this year, although Sunday’s storm was by far the worst.

“It has been a very stormy year,” he said.

© Copyright (c) The Calgary Herald

Advertised on Facebook, Adidas sneaker deemed “racist” (Revista Cult)

Featuring rubber chains, the shoe had its sale suspended

June 2012

In June, sporting goods maker Adidas announced on its Facebook page the launch of a new sneaker in its fall-winter 2012 line, the newspaper Le Monde reported. Designed by the stylist Jeremy Scott, the “Roundhouse” shoe features rubber cuffs simulating chains, which many internet users read as a reference to slavery.

According to CNN, the company quickly removed the Facebook post, but the story had already travelled the globe, sparking outrage among internet users.

“Apparently there were no people of color in the marketing department that approved it,” quipped a commenter named Rodwell on Nice Kicks, a website devoted to sneaker releases.

The company initially defended the designer, describing his style as “original” and playful, but the German manufacturer then issued a statement apologizing to those offended by the episode and stating that the model will not be sold.

Affirmative action and the quota system in Brazilian universities

One more step in the fight for the effective democratization of higher education

dhescbrasil.org.br

August 10, 2012

On August 7, 2012, the Federal Senate approved a bill that had been making its way through Congress for about a decade, reserving 50% of the places at federal universities and federal technological institutes for students who completed secondary school in public schools.

In addition, the law stipulates that half of these places will go to students with a per capita family income of up to one and a half minimum wages. It also provides that, in each state, places will be set aside for black, mixed-race and indigenous students, in proportion to these groups' share of the state's population according to IBGE data.
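
As a rough sketch of this arithmetic, with wholly hypothetical numbers (a 200-seat course and a 40% state share of black, mixed-race and indigenous people), the reservation scheme works out as:

```python
# Illustrative arithmetic for the reservation scheme described in the law;
# the course size and state share below are hypothetical, not real figures.
total_seats = 200
state_ppi_share = 0.40  # share of black, mixed-race and indigenous people (IBGE)

reserved = total_seats * 0.50   # 50% reserved for public school graduates
low_income = reserved * 0.50    # half of the reserve: income <= 1.5 minimum wages
ppi_seats = round(reserved * state_ppi_share)  # racial subquota inside the reserve

print(int(reserved), int(low_income), ppi_seats)  # 100 50 40
```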

These measures respond to long-standing demands of activists fighting for the right to education and for the effective democratization of higher education in the country. As we know, the Brazilian university system historically developed in a restricted way, in terms both of the number of places and of the groups it served. For a long time, higher education was conceived as a system for the few and, frequently, for those who managed to prepare themselves to compete for a place in a highly competitive setting.

Over the 1990s and especially the 2000s, a consensus grew among different sectors of Brazilian society about the enormous inequality of access to higher education in Brazil, expressed in the well-known paradox that students who attended private schools at the basic level predominate in public universities, the reverse being equally true.

It was also observed that the young Brazilians who reached higher education were predominantly middle and upper class and mostly white, thereby leaving out a large contingent of poor, black, mixed-race and indigenous youth.

Faced with this educational exclusion, non-governmental organizations and social movements mobilized to offer supplementary training so that poor, black, mixed-race and indigenous youth could improve their chances of admission. Universities, municipal governments, companies and churches also joined these initiatives, achieving significant results in terms of these students' approval in entrance exams.

Government bodies, too, began to develop policies to broaden access to higher education for historically excluded groups, such as reserving places at public universities, creating the University for All Program (PROUNI) to provide scholarships at private higher education institutions, and expanding investment in federal universities to increase the supply of courses and places.

In 2012 it is fair to say that these measures have had positive effects in broadening access to higher education for youth from excluded groups. Even so, a gap remains between the number of young people who finish secondary school in public schools and those who manage to enter a public higher education institution. The number of black and indigenous students who reach higher education is also still disproportionately small compared with their share of the population.

The law approved by the Senate substantially expands these opportunities, committing federal institutions of higher and technical education to this expansion. The law also carries an important commitment to racial equality, by formalizing the pledge to increase the admission of black and indigenous students in proportions defined by their representation in the population of each state of the federation.

In a country that until recently had difficulty acknowledging the racial inequality present in its society, the approval of this law is of great importance, for it allows progress toward the effective democratization of opportunities to enter higher education.

It now falls to us to ask: does this measure solve every problem? Obviously not. In fact, the approval of this law brings important challenges, such as expanding and consolidating the retention of lower-income students in higher education through an effective and efficient student assistance program. It also brings the challenge of continuing to expand opportunities so that millions of poor, black and indigenous youth can enter and successfully complete secondary school, in order to compete for admission to higher education.

Democratization measures such as those contained in this new law are important milestones on the long road to realizing the right to education in Brazil. We hope that, once the law is signed by the presidency, we can inaugurate a new moment in the country's educational policies, with broader access, more democratic opportunities to remain in higher education, and the pursuit of greater equality at all levels. The road is long, but with this law a great step will be taken.

Rosana Heringer
Rapporteur for the Human Right to Education

*   *   *

INCLUSION IN HIGHER EDUCATION: RACE OR INCOME?

João Feres Júnior*

Strategic Group for the Analysis of Higher Education in Brazil – FLACSO Brasil

The unanimous decision of the Supreme Federal Court on April 26, 2012, declaring constitutional the ethno-racial quota system for admitting students to higher education had, among several positive consequences, the virtue of opening the way for a deeper debate on inclusion through access to higher education. We have thus moved from a context in which the debate was predominantly normative, concerned mainly with the legality and constitutionality of ethno-racial affirmative action, to a new context in which the concrete discussion of the mechanisms and criteria adopted by inclusion policies comes to matter.

Beyond its moral merit, the Supreme Court's decision is consonant with several analyses based on solid statistical data, carried out from the late 1970s to the present, which show the relevance of both the class variable and the race variable in the reproduction of inequality in Brazil. This fact leads us to intuit that using both variables in inclusion policies is advisable. The intuition is generally correct, but we must not forget that the distance between the sociological analysis of population data and the design of public policy is great and cannot be covered without mediations: identifying target publics, adopting categories, creating rules, setting objectives, evaluating results, and so on.

In addressing the question of selection criteria, a historical caveat comes first. The media debate on affirmative action focuses almost exclusively on its ethno-racial form. Yet the most frequent form of affirmative action adopted by Brazilian public universities today benefits students coming from public schools: 61 out of a total of 98 institutions, while only 40 have policies for black students (or "pretos" and "pardos").

Nor is that all: the creation of these inclusion policies in Brazilian higher education – today 72% of Brazilian public universities have some form of affirmative action – cannot be narrated without mentioning the leading role of the Black Movement and its sympathizers in pressing universities across Brazil with the demand for inclusion. Pressured by these sectors of organized civil society, universities reacted, each in its own way: very rarely creating quotas only for black students (4 cases), often creating quotas for black students and public school students (31), and mostly creating quotas for public school students. There was, on the other hand, no independent movement for the inclusion of poor students in higher education. In short, were it not for the demand for the inclusion of black students, the debate on the role of the university in democratic Brazil would certainly be far behind where it is.

The most important point, however, is to understand that the mediations between sociological knowledge and public policy must be governed by a pragmatist spirit that follows this method: starting from a basic agreement about the situation and the objectives, we establish mediating actions to implement a policy and then observe its results. Systematic (rather than impressionistic) observation of the results is essential if we are to adjust the mediating actions to reach our objectives, or even to change the objectives or our reading of the situation. Without this spirit it is difficult to proceed in a progressive way on any matter involving a concrete intervention in reality.

Thus, even though we know that both variables, class and race, should be the object of inclusion policies, there is no ideal plan for applying them. Should they be kept separate (quotas for black students and quotas for public school students) or combined (quotas that accept only candidates with both qualifications)? The fact is that very few universities adopt the first option, while 36 of the 40 public universities with affirmative action for black students combine it with some class criterion, whether public schooling or income.

There is also another important question: should the class variable be operationalized through an income criterion or a public school criterion? In the aggregate, universities have preferred "public school", 30 of the 40, because it is more effective than "declared income" for gauging an applicant's social class – people with informal income could easily game the procedure. Even so, 6 universities, among them the state universities of Rio de Janeiro, pioneers in adopting affirmative action in the country, use the income criterion. In the case of the Rio de Janeiro universities, the programs that began in 2003 had quotas for public school students separate from quotas for "negros e pardos" (sic), but in 2005 the law was changed to superimpose an income ceiling on the racial quota.

Information from people who took part in the debate that led to this change points to the fact that media exposure, strongly biased against such policies, led decision-makers to try to shield themselves from the argument that affirmative action would benefit only the black middle class. Whatever the cause of the change, the method suggested above directs us to its consequences. Data from UENF (Universidade Estadual do Norte Fluminense Darcy Ribeiro) show that in the years the old system was in force, 2003 and 2004, 40 and 60 non-white students were admitted, respectively – roughly 11% of all entrants. The superimposition of criteria that took effect the following year dropped that number to 19. The average number of non-white students admitted under the new regime from 2005 to 2009 is even lower – 13 – a meagre 3% of all entrants.

Conclusion: a policy that was producing results was rendered practically irrelevant by the adoption of criteria that on paper seem fair, or adequate, or politically strategic. Yet the result should be the fundamental part. The example bears out our point that there are no magic recipes. And if that is true, experimentation is necessary. But a crucial element is still missing from this equation. To evaluate the results of experimentation, universities with inclusion programs must make their data public, and with very rare exceptions this has not been happening. Without solid evaluations of the policies, we risk remaining forever in the realm of conjecture and anecdote, and thus failing to achieve the larger objective of these initiatives: democratizing access to higher education in Brazil.

Rio de Janeiro, June 2012

This text is the author's contribution to the Strategic Group for the Analysis of Higher Education (GEA-ES) project, carried out by FLACSO-Brasil with support from the Ford Foundation.

For this anthropologist, the idea of the “self” must give way to that of the network (Valor)

By Carla Rodrigues | For Valor, Rio de Janeiro

August 7, 2012

Awarded for his actor-network theory, the Frenchman Bruno Latour discusses the relationship between human and non-human beings (Divulgação)

He defines himself as a philosophical anthropologist working on sociology. In practice, the Frenchman Bruno Latour, 65, does what he calls an “anthropology of modernity,” turning his gaze to the discourses and practices of that period, above all scientific ones.

From that research came one of his most famous books, “We Have Never Been Modern” (“Jamais Fomos Modernos – Ensaios de Antropologia Simétrica”), released in Brazil in 1994 (Editora 34).

Latour, in Brazil for the third time, gives a free talk in São Paulo on Thursday at Fronteiras do Pensamento, and has just taken part in the international symposium “A Vida Secreta dos Objetos: Novos Cenários da Comunicação” (“The Secret Life of Objects: New Scenarios of Communication”), held in São Paulo, Rio and Salvador, which ended yesterday.

For him, it is here that the environmental debate of the 21st century will be contested. Now committed to the ecological cause, Latour is known and honored for his actor-network theory, a way of thinking about the relationship between humans and non-humans.

Scientific director of the research division of the Paris Institute of Political Studies and a member of a generation of French thinkers educated in the postwar period, Latour is frequently accused of being a relativist, a criticism he rebuts with ease. “I do not know a single participant in science who is not a relativist,” he says.

Valor: Do you believe Brazil occupies a special place on the world stage at this moment, as Europe goes through a crisis?

Bruno Latour: Brazil has been part of my life since childhood, because three of my sisters lived in the country, for different reasons. I believe the ecological question of the 21st century will be decided here. There are things that can be improved in Europe from an environmental point of view, but the real stage of this game will be Brazil, because it is already too late for Asia and Africa. The question is whether Brazilian intellectuals and politicians will be able to go beyond the foundations of modernity. But the great ecological question will play out here.

Valor: Your actor-network theory refers to human and non-human beings. Is it a critique of humanism? What has the humanist legacy given us that is so open to criticism?

Latour: Humanism is a limited way of thinking about the group of humans, whom I see as dependent on many other beings that are not human. A definition that isolates the human from the beings that fabricate it – both religious divinities and the things humans live with, such as trees, but also the aluminum used to make this cutlery – is a narrow view. The humanist perspective was legitimate in a certain era, if we speak of the humanism that ran from the mid-19th century to the mid-20th, before ecologists drew our attention to the environmental problem. But today it makes no sense to speak of humanism. This kind of humanism does not have the elements needed to absorb today's great political questions. One cannot, for example, build a theory conscious of the climate problem out of Kant's moral thought. We need to think of the composition in which human and non-human beings relate. Humanism is an outdated version of the political problems that concern us. Today it is a matter of being fully humanist, that is, of including all the beings necessary for human existence.

Valor: One of the postulates of actor-network theory is that when a person acts, someone else is acting along with them. Could you explain how that works?

Latour: Humans are surrounded by many other beings, and the idea that a person acts autonomously, with his or her own goals, does not work in economics, in religion, in psychology, or in any other situation. So the question actor-network theory asks is: which other beings are active at the moment someone acts? The anthropology and sociology I try to develop is devoted to investigating these beings. I can put the question the other way around: how, despite the evidence of all the many beings that take part in an action, do we keep thinking as if the only actor were the human endowed with a psychology, self-aware, calculating, autonomous, responsible? Anthropology in Brazil is particularly able to understand that there is no such “self,” no individual, autonomous subject acting on the world, which is a very narrow view. I am in close contact with Brazilian anthropologists such as Eduardo Viveiros de Castro (UFRJ).

Valor: You came to Brazil to take part in a symposium on new communication technologies. What is the great affinity between your actor-network theory and theories of communication?

Latour: They are close because actor-network theory is essentially a theory of the multiplicity of mediations, and these researchers are interested in discussing the domain of media and mediations. Those who take an interest in mediation – positively, not negatively – find in actor-network theory concepts and methods to work with.

Valor: Why are journalists always mentioned among the important participants in actor-network theory?

Latour: The formatting of information plays a very important role in the public space, where the political space is situated. I do not know many studies of journalism done from the standpoint of actor-network theory, because such research is usually done from a critical standpoint, and actor-network theory is not a critique. Very often, journalists are simply accused of corrupting an ideal of truth which, were it not for mediation, would reach the public through a transparent and direct transmission. Scientists, politicians and economists like to say that, were it not for journalists, information would be more transparent, more direct, less compromised.

Valor: Actor-network theory has turned into many other things – each of the researchers in the original group went his own way, and there was a diaspora. Do you still recognize yourself as an actor-network theorist?

Latour: The original group was never very united, but it came together at a moment when sociology realized it had neglected technique, science and non-human beings. It was a realization by the social sciences that the 20th century bequeathed us a series of questions – such as those of domination and exploitation – but always from a sociocentric viewpoint. Actor-network theory turns out to be the evidence that we must take an interest in the secret life of objects.

Valor: Let me put back to you a question from the book “Pandora's Hope” (“A Esperança de Pandora”, Edusc): where does the opposition between the camp of reason and the camp of force come from?

Latour: I traced a genealogy of that opposition, which goes back to the false dispute between the Sophists and the philosophers and has organized debate in Western countries. I set out to suspend that separation and to ask what the force of rational devices is. That is how I began my anthropology of science. And there is a second question: what are the reasons behind political, religious and economic relations of force? The distinction between force and reason belongs to a set of old dichotomies that are no longer able to orient us when we speak of the scientific question. In that dichotomy, reason is supposed to unify the discussion. But if reason ever had that power, it has it no longer, and we need to find other intellectual tools to orient ourselves in this dispute. That is what I call the cartography of controversy. This is today a major question for democracy.

Valor: Is saying that science is social a way of relativizing scientific results?

Latour: That is a misunderstanding of the meaning of the word social. Obviously, saying that facts are social is not the same as saying that this fork is a social fabrication – that would make no sense. I say that this fork is the result of an industrial process involving legislation, companies, industries – which is entirely different. Science is part of a collective – I am deliberately avoiding the word social – of the world. Some believe that science, particularly the natural sciences, is absolute. But those are the religious devotees of science, not its practitioners. I know of no actor engaged in science who is not a relativist or, better put, a relationist, because he knows that to know is to establish relations within a frame of reference. Criticism of relativists by absolutists is frequent, but that is not a productive discussion. The discussion that interests me is this: how do we establish relations between frames of reference, cultures, modes of existence, forms of life? I know of no one who, from that standpoint, criticizes relativism.

Valor: Your book "We Have Never Been Modern" can be summed up as a critique of modernity. Do you maintain the same criticisms with regard to the postmoderns?

Latour: Yes. The postmoderns had the sensitivity to perceive that there was something complicated about modernity, but it is the same movement. There is simply a return to some of the problems modernity had not addressed, without a return to the roots of modernity.

Carla Rodrigues, a professor at the Universidade Federal Fluminense (UFF) and the Pontifícia Universidade Católica do Rio (PUC-Rio), holds a doctorate in philosophy and is a CNPq researcher

© 2000–2012. All rights reserved to Valor Econômico S.A. See our Terms of Use at http://www.valor.com.br/termos-de-uso. This material may not be published, rewritten, redistributed, or broadcast without authorization from Valor Econômico.

Occupy, Anthropology, and the 2011 Global Uprisings (Cultural Anthropology)

Submitted by Cultural Anthropology on Fri, 2012-07-27 10:36

Introduction: Occupy, Anthropology, and the 2011 Global Uprisings

Guest Edited by Jeffrey S. Juris (Northeastern University) and Maple Razsa (Colby College)

Occupy Wall Street burst spectacularly onto the scene last fall with the take-over of New York City’s Zuccotti Park on September 17, 2011, followed by the rapid spread of occupations to cities throughout the US and the world. The movement combined mass occupations of urban public spaces with horizontal forms of organization and large-scale, directly democratic assemblies. Making effective use of the viral flows of images and information generated by the intersections of social and mass media, the occupations mobilized tens of thousands around the globe, including many new activists who had never taken part in a mass movement before, and inspired many more beyond the physical encampments themselves. Before the wave of violent police evictions in November and December of 2011 drove activists into submerged forms of organizing through the winter, the Occupy movements had already captured the public imagination. Bequeathing to us potent new memes such as the 1% (those at the top of the wealth and income scale) and the 99% (the rest of us), Occupy provided a framework for talking about issues that have been long obscured in public life such as class and socio-economic inequality and helped to shift the dominant political-economic discourse from an obsession with budget deficits and austerity to a countervailing concern for jobs, equality, and economic fairness.

In other words, prior to Occupy, much of the populist anger stemming from the 2008 financial crisis in North America and Europe had been effectively channeled by the Right into both an attack on marginalized groups—e.g. immigrants, people of color, Gays and Lesbians—and a particularly pernicious version of the already familiar critique of unbridled spending. This was especially so in the US where the Tea Party tapped into the widespread public ire over the Wall Street bailouts to bolster a far-reaching attack on “big government” through a radical program of fiscal austerity. Of course, the debt problem was a consequence rather than a cause of the crisis, the result of deregulation, predatory lending, and the spread of highly complex financial instruments facilitated by the neoliberal agenda of the very people who were now seeking to impose budgetary discipline (see Financial Crisis Hot Spot).

However, the contributions of Occupy are not exclusively, or even primarily, to be assessed in terms of their intervention in public discourse. The Occupy movements are also a response to a fundamental crisis of representative politics embodied in an embrace of more radical, directly democratic practices and forms. In their commitment to direct democracy and action the politics put into practice in the various encampments are also innovative prefigurative attempts to model alternative forms of political organization, decision making, and sociability. This turn is crucial: while neoliberalism has been endlessly critiqued it seems to live on as the only policy response—in the form of austerity—to the crisis neoliberalism itself has produced. The need for ethnographic accounts of this prefigurative politics, and its attendant challenges and contradictions, is especially urgent given that Occupy has refused official representatives and because occupiers have extended democracy beyond formal institutions into new spheres of life through a range of practices, including the collective seizure of public space, the people’s mic, horizontal organization, hand signals, and general assemblies.

It is also important to remember that Occupy was a relative latecomer—if a symbolically important one—to the social unrest the global crisis and policies of austerity have provoked. Cracks in the veneer of conformity emerged during the 2008 rebellion in Greece, where students, union members, and other social actors, galvanized by the murder of a fifteen-year-old student, took to the streets to challenge the worsening economic conditions (see Greece Hot Spot). Students were also among the first wave of resistance elsewhere, with protests against budget cuts and increased fees in California, Croatia, the UK, and Chile. In the US, signs of wider social discontent finally surfaced during the Wisconsin uprising in February 2011, which included the occupation of the Wisconsin State House in opposition to Governor Scott Walker’s attack on collective bargaining for public sector unions under the guise of budgetary discipline (cf. Collins 2012). As in Wisconsin, the widespread circulation of images from the Arab Spring continued to spark the intense feelings of solidarity, political possibility, and agency that ultimately led to the occupation of Wall Street. From the pro-democracy marches in Tunisia in response to the self-immolation of Mohammed Bouazizi to the mass occupations of Cairo’s Tahrir Square in opposition to the Egyptian dictator Hosni Mubarak, the Middle East uprisings imbued protesters with the sense that dramatic political transformation was possible, even as subsequent events have indicated that actual political outcomes are always ambivalent and uncertain (see Arab Spring Hot Spot).

Inspired by the uprisings in Tunisia and Egypt and responding to the working- and middle-class casualties of Spain and Europe’s debt crisis, hundreds of thousands of protesters took to the streets of Madrid on May 15, 2011, and occupied the Puerta del Sol square, sparking a wave of similar mobilizations and encampments around Spain that would become known as 15M, or the movement of the Indignados. Indeed, the combination of mass public occupations with large-scale participatory assemblies provided a template that would be enacted in Zuccotti Park, in part via the influence of Spanish activists residing in New York. That summer a similar movement of Israeli youths sprang up in Tel Aviv, using tent cities and popular assemblies to shine a light on the rising cost of housing and other living expenses.

Finally, in response to an August 2011 call by the Canadian magazine AdBusters to occupy Wall Street in the spirit of these 2011 Global uprisings, activists occupied Zuccotti Park after being rebuffed by the police in an attempt to take Wall Street itself. The occupation initially garnered little media attention, until its second week when images of police repression started going viral, leading to a surge in public sympathy and support, and ever growing numbers streaming to the encampments themselves each time another protester was maced or a group of seemingly innocent protesters rounded up, beaten, and/or arrested. Occupations quickly spread around the US and other parts of the world, generating, for a moment, a proliferating series of encampments physically rooted in local territories, yet linked up with other occupations through interpersonal and online trans-local networks. Following the evictions in the US last fall, local assemblies and working groups have continued to meet—hosting discussions, planning actions and campaigns, producing media, and building and modifying organizational forms—even as the Occupy movements prepared for their public reemergence in the spring through mobilizations such as the May Day protests and mass direct actions against NATO in Chicago and the European Central Bank in Frankfurt.

Additionally, each of these uprisings has diffused through the widespread use of social media, reflecting the mutually constitutive nature of embodied and online protest. The use of social media, in particular, has allowed the Occupy movements, as in other recent mobilizations, to penetrate deeply into the social fabric and mobilize many newcomers who have never been active before in social movements. At the same time, these emerging “logics of aggregation” within the Occupy movements have resulted in a more individualized mode of participation and a form of movement that is more singularizing (e.g. the way the 99% frame can obscure internal differences) and more dependent on the long-term occupation of public space than other recent movements (Juris 2012). A particular set of tensions and strategic dilemmas have thus plagued the Occupy movements, including a divide between newer and more seasoned activists, the difficulty of recognizing and negotiating internal differences, a lack of common political and organizational principles beyond the General Assembly model, and the difficulty of transitioning to new tactics, strategies, visions, and structures in a post-eviction era. In short, activists are now faced with fundamental questions about how to build a movement capable of actually transforming the deep inequalities they have attempted to address.

In assembling this Hot Spot on Occupy we have invited contributions from anthropologists, ethnographers, and activists writing on the above themes: the mass occupation of public spaces, directly democratic practices and forms, the use of social media, the emotions and emerging subjectivities of protest, as well as the underlying political critiques and contradictions that have arisen in the movement. Similarly, in light of the global history we outline above, the range of other social movement responses to the current global economic crisis, as well as the ongoing links between struggles in the US, Europe, Latin America, and North Africa, we have been careful to include contributors conducting research beyond the US in countries such as Greece, Slovenia, Spain, Israel, Argentina, Egypt, and Canada. In so doing, we insist that Occupy must be understood in a global rather than a populist US-centric framework.

Our collaboration on this Hot Spot—which emerged from conversations around our articles on Occupy in the May 2012 edition of American Ethnologist (Juris 2012; Razsa and Kurnik 2012)—also reflects our scholarly and political commitments, as well as those of our contributors. First, it was our priority to invite scholars and activists who are directly involved with these movements rather than adding to the abundant armchair punditry on Occupy. These contributions also reflect recent trends in anthropology with respect to the growing practice of activist research, militant ethnography, public anthropology, and other forms of politically committed ethnographic research, which are taking increasingly institutionalized forms with Cultural Anthropology “Hot Spots” like this one, “Public Anthropology Reviews” in American Anthropologist, recent interventions in American Ethnologist on Egypt, Wisconsin, and Occupy, as well as Current Anthropology “Current Applications.”

In addition to providing an ethnographically and analytically informed view of and from various occupations and kindred mobilizations, this Hot Spot thus provides another example of how anthropologists are making themselves politically relevant and are engaging issues of broad public concern. Given these shifts, together with the progressive inclinations of many anthropologists and the ubiquity and inherent interest of Occupy, it should come as no surprise that so many anthropologists and ethnographers from related fields, including those within and outside the academy, have played key roles in the Occupy movements and their precursors in countries such as Greece and Spain. Indeed, in their post Carles Feixa and his collaborators refer to anthropologists as the “organic intellectuals” of the 15M movement. As many of the contributions to this Hot Spot attest, a similar case might be made for the role of activist anthropologists within Occupy more generally.

As the contributions below make clear, our emphasis on participatory and politically committed research does not imply a romanticization of resistance or a refusal to confront the contradictions, limits, and exclusions of social movements, especially along axes of class, race, gender, sexuality, and citizenship. Given the disproportionate, though by no means exclusively White, middle class participation in the US Occupy movements, such critical perspectives are essential. Each of the following entries thus combines thick ethnographic description on the part of anthropologists, ethnographers, and activists who have been directly involved in the Occupy movements or other instances of mobilization during the 2011 global uprisings—either through engagement with one or more encampments and/or the themes addressed by Occupy—with critical analysis of one or more of the issues outlined above.

NOTES

[1] Occupy has thus addressed many of the same themes and drawn on many of the organizational practices associated with the global justice movements of a previous era, even as it has resonated more strongly with domestic national contexts of the Global north.

[2] The people’s mic is a form of voice amplification whereby everyone in listening distance repeats a speaker’s words so that others situated further away can also hear (See Garces, this Hot Spot).

[3] For example, in the U.S. local encampments created “Inter-Occupy” groups to maintain ties with other occupations, while Twitter feeds, listservs, websites, and other digital tools were used to communicate and coordinate more broadly. See our digital resources page for additional links.

REFERENCES

Collins, Jane. 2012. “Theorizing Wisconsin’s 2011 Protests: Community-Based Unionism Confronts Accumulation by Dispossession.” American Ethnologist 39 (1):6–20.

Juris, Jeffrey. 2012. “Reflections on #Occupy Everywhere: Social Media, Public Space, and Emerging Logics of Aggregation.” American Ethnologist 39 (2):259–279.

Razsa, Maple and Andrej Kurnik. 2012. “The Occupy Movement in Žižek’s Hometown: Direct Democracy and a Politics of Becoming.” American Ethnologist 39 (2):238–258.

***ESSAYS***

Prefigurative Politics

Marianne Maeckelbergh, Horizontal Decision-Making across Time and Place

Chris Garces, People’s Mic and ‘Leaderful’ Charisma

Philip Cartelli, Trying to Occupy Harvard

Public Space

Zoltán Glück, Between Wall Street and Zuccotti: Occupy and the Scale of Politics

Carles Feixa, et al., The #spanishrevolution and Beyond

Dimitris Dalakoglou, The Movement and the “Movement” of Syntagma Square

Experience and Subjectivity

Jeffrey S. Juris, The 99% and the Production of Insurgent Subjectivity

Diane Nelson, et al., Her earliest leaf’s a flower…

Maple Razsa, The Subjective Turn: The Radicalization of Personal Experience within Occupy Slovenia

Marina Sitrin, Occupy Trust: The Role of Emotion in the New Movements

Strategy and Tactics

David Graeber, Occupy Wall Street rediscovers the radical imagination

Kate Griffiths-Dingani, May Day, Precarity, Affective Labor, and the General Strike

Angelique Haugerud, Humor and Occupy Wall Street

Karen Ho, Occupy Finance and the Paradox/Possibilities of Productivity

Social Media

Alice Mattoni, Beyond Celebration: Toward a More Nuanced Assessment of Facebook’s Role in Occupy Wall Street

John Postill, Participatory Media Research and Spain’s 15M Movement

Critical Perspectives

Yvonne Yen Liu, Decolonizing the Occupy Movement

Manissa McCleave Maharawal, Fieldnotes on Union Square, Anti-Oppression, and Occupy

Uri Gordon, Israel’s “Tent Protests:” A Domesticated Mobilization

Alex Khasnabish, Occupy Nova Scotia: The Symbolism and Politics of Space

Post Normal Science: Deadlines (Climate Etc.)

Posted on August 3, 2012

by Steven Mosher

Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.

Ravetz describes a post normal situation by the following criteria:

  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required

The difference between Kuhnian normal science, or the behavior of those doing science under normal conditions, and post normal science is best illustrated by example. We can use the recent discovery of the Higgs Boson as an example. Facts were uncertain–they always are to a degree; no values were in conflict; the stakes were not high; and immediate action was not required. What we see in that situation is those doing science acting as we expect them to, according to our vague ideal of science. Because facts are uncertain, they listen to various conflicting theories. They try to put those theories to a test. They face a shared uncertainty and in good faith accept the questions and doubts of others interested in the same field. Their participation in politics is limited to asking for money. Because values are not in conflict, no theorist takes the time to investigate his opponent’s views on evolution or smoking or taxation. Because the field of personal values is never in play, personal attacks are minimized. Personal pride may be at stake, but values rarely are. The stakes for humanity in the discovery of the Higgs are low: at least no one argues that our future depends upon the outcome. No scientist straps himself to the collider and demands that it be shut down. And finally, immediate action is not required; under no theory is the settling of the uncertainty so important as to rush the result. In normal science, according to Kuhn, we can view the behavior of those doing science as puzzle solving. The details of a paradigm are filled out slowly and deliberately.

The situation in climate science is close to the polar opposite of this. That does not mean, and should not be construed as, a criticism of climate science or its claims. The simple point is this: in a PNS situation, the behavior of those doing science changes. To be sure, much of their behavior remains the same. They formulate theories; they collect data, and they test their theories against the data. They don’t stop doing what we notionally describe as science. But, as foreshadowed above in the description of how high energy particle physicists behave, one can see how that behavior changes in a PNS situation. There is uncertainty, but the good faith that exists in normal science, the faith that other people are asking questions because they actually want the answer, is gone. Asking questions, raising doubts, asking to see proof becomes suspect in and of itself. And those doing science are faced with a question that science cannot answer: Does this person really want the answer or are they a merchant of doubt? Such a question never gets asked in normal science. Normal science doesn’t ask this question because science cannot answer it.

Because values are in conflict, the behavior of those doing science changes. In normal science no one would care if Higgs was a Christian or an atheist. No one would care if he voted liberal or conservative; but because two different value systems are in conflict in climate science, the behavior of those doing science changes. They investigate each other. They question motives. They form tribes. And because the stakes are high, the behavior of those doing science changes as well. They protest; they take money from lobby groups on both sides; and, worst of all, they perform horrendous raps on YouTube. In short, they become human, while those around them canonize them or demonize them, and their findings become iconized or branded as hoaxes.

This brings us to the last aspect of a PNS situation: immediate action is required. This is perhaps the most contentious aspect of PNS; in fact, I would argue it is the defining characteristic. In all PNS situations it is almost always the case that one side sees the need for action, given the truth of their theory, while the doubters must of necessity see no need for immediate action. They must see no need for immediate action because their values are at risk and because the stakes are high. Another way to put this is as follows. When you are in a PNS situation, all sides must deny it. Those demanding immediate action deny it by claiming more certainty* than is present; those refusing immediate action do so by increasing demands for certainty. This leads to a centralization and valorization of the topic of uncertainty, and epistemology becomes a topic of discussion for those doing science. That is decidedly not normal science.

The demand for immediate action, however, is broader than simply a demand that society changes. In a PNS situation the behavior of those doing science changes. One of the clearest signs that you are in PNS is the change in behavior around deadlines. Normal science has no deadline. In normal science, the puzzle is solved when it is solved. In normal science there may be a deadline to shut down the collider for maintenance. Nobody rushes the report to keep the collider running longer than it should. And if a good result is found, the schedules can be changed to accommodate the science. Broadly speaking, science drives the schedule; the schedule doesn’t drive the science.

The climategate mails are instructive here. As one reads through the mails, it’s clear that the behavior of those doing science is not what one would call disinterested, patient puzzle solving. Human beings acting in a situation where values are in conflict and stakes are high will engage in behavior that they might not otherwise. Those changes are most evident in situations surrounding deadlines. The point here is not to rehash The Crutape Letters, but rather to relook at one incident (there are others, notably around congressional hearings) where deadlines came into play. The deadline in question was the deadline for submitting papers for consideration. As covered in The Crutape Letters and in The Hockeystick Illusion, the actions taken by those doing science around the “Jesus Paper” are instructive. In fact, were I to rewrite The Crutape Letters, I would do it from the perspective of PNS, focusing on how the behavior of those doing science deviated from the ideals of openness, transparency, and letting truth come in its own good time.

Climategate is about FOIA. There were two critical paths for FOIA: one sought data, the other sought the emails of scientists. Not quite normal. Not normal in that data is usually shared; not normal in that we normally respect the privacy of those doing science. But this is PNS, and all bets are off. Values and practices from other fields, such as business and government, are imported into the culture of science: data hoarding is defended using IP and confidentiality agreements. Demanding private mail is defended using values imported from performing business for the public. In short, one sign that a science is post normal is the attempt to import values and procedures from related disciplines. Put another way, PNS poses the question of governance: who runs science, and how should they run it?

The “Jesus paper” in a nutshell can be explained as follows. McIntyre and McKitrick had a paper published in the beginning of 2005. That paper needed to be rebutted in order to make Briffa’s job of writing chapter 6 easier. However, there was a deadline in play. Papers had to be accepted by a date certain. At one point Stephen Schneider suggested the creation of a new category, a novelty, “provisionally accepted,” so that the “Jesus paper” could make the deadline. McIntyre covers the issue here. One need not re-adjudicate whether or not the IPCC rules were broken. And further, these rules have nothing whatsoever to do with the truth of the claims in that paper. This is not about the truth of the science. What is important is the importation of the concept of a deadline into the search for truth. What is important is that the behavior of those doing science changes. Truth suddenly cares about a date. Immediate action is required. In this case immediate action is taken to see to it that the paper makes it into the chapter. Normal science takes no notice of deadlines. In PNS, deadlines matter.

Last week we saw another example of deadlines and high stakes changing the behavior of those doing science. The backstory here explains it. It appears to me that the behavior of those involved changed from what I have known it to be. It changed because they perceived that immediate action was required. A deadline had to be met. Again, as with the Jesus paper, the facts surrounding the release do not go to the truth of the claims. In normal science, a rushed claim might very well get the same treatment as an unrushed claim: it will be evaluated on its merits. In PNS, either the rush to meet an IPCC deadline (as in the case of the Jesus paper) or the rush to be ready for Congress (as in the Watts case) is enough for some to doubt the science. What has been testified to in Congress by Christy, a co-author, may very well be true. But in this high stakes arena, where facts are uncertain and values are in conflict, the behavior of those doing science can and does change. Not all their behavior changes. They still observe and test and report. But the manner in which they do that changes. Results are rushed and data is held in secret. Deadlines change everything. Normal science doesn’t operate this way; if it does, quality can suffer. And yet, the demand for more certainty than is needed, the bad faith game of delaying action by asking questions, precludes a naïve return to science without deadlines.

The solution that Ravetz suggests is extended peer review and a recognition of the importance of quality. In truth, the way out of a PNS situation is not that simple. The first step out of a PNS situation is the recognition that one is in the situation to begin with. Today, few people embroiled in this debate would admit that the situation has changed how they would normally behave. An admission that this isn’t working is a cultural crisis for science. No one has the standing to describe how one should conduct science in a PNS situation. No one has the standing to chart the path out of a PNS situation. The best we can do is describe what we see. Today, I observe that deadlines change the behavior of those doing science. We see that in climategate; we see that in the events of the past week. That doesn’t entail anything about the truth of science performed under pressure. But it should make us pause and consider whether truth will be found any faster by rushing the results and hiding the data.

*I circulated a copy of this to Michael Tobis to get his reaction. MT took issue with this characterization. MT, I believe, originated the argument that our uncertainty is a reason for action. It is true that while certainty about the science has been the dominant piece of the rhetoric, there has been a second thread of rhetoric that bases action on the uncertainty about sensitivity. I would call this certainty shifting. While the uncertainty about the facts of sensitivity is accepted in this line of argument, the certainty is shifted to certainty about values and certainty about impacts. In short, the argument becomes that while we are uncertain about sensitivity, the certainty we have about large impacts and trans-generational obligations necessitates action.

Scientists struggle with limits – and risks – of advocacy (eenews.net)

Monday, July 9, 2012

Paul Voosen, E&E reporter

Jon Krosnick has seen the frustration etched into the faces of climate scientists.

For 15 years, Krosnick has charted the rising public belief in global warming. Yet, as the field’s implications became clearer, action has remained elusive. Science seemed to hit the limits of its influence. It is a result that has prompted some researchers to cross their world’s no man’s land — from advice to activism.

As Krosnick has watched climate scientists call for government action, he began pondering a recent small dip in the public’s belief. And he wondered: Could researchers’ move into the political world be undermining their scientific message?

Stanford’s Jon Krosnick has been studying the public’s belief in climate change for 15 years, but only recently did he decide to probe their reaction to scientists’ advocacy. Photo courtesy of Jon Krosnick.

“What if a message involves two different topics, one trustworthy and one not trustworthy?” said Krosnick, a communication and psychology professor at Stanford University. “Can the general public detect crossing that line?”

His results, not yet published, would seem to say they can.

Using a national survey, Krosnick has found that, among low-income and low-education respondents, climate scientists suffered damage to their trustworthiness and credibility when they veered from describing science into calling viewers to ask the government to halt global warming. And not only did trust in the messenger fall — even the viewers’ belief in the reality of human-caused warming dropped steeply.

It is a warning that, even as the frustration of inaction mounts and the politicization of climate science deepens, researchers must be careful in getting off the political sidelines.

“The advice that comes out of this work is that all of us, when we claim to have expertise and offer opinions on matters [in the world], need to be guarded about how far we’re willing to go,” Krosnick said. Speculation, he added, “could compromise everything.”

Krosnick’s survey is just the latest social science revelation that has reordered how natural scientists understand their role in the world. Many of these lessons have stemmed from the public’s and politicians’ reactions to climate change, which has provided a case study of how science communication works and doesn’t work. Complexity, these researchers have found, does not stop at their discipline’s verge.

For decades, most members of the natural sciences held a simple belief that the public stood lost, holding out empty mental buckets for researchers to fill with knowledge, if they could only get through to them. But, it turns out, not only are those buckets already full with a mix of ideology and cultural belief, but it is incredibly fraught, and perhaps ineffective, for scientists to suggest where those contents should be tossed.

It’s been a difficult lesson for researchers.

“Many of us have been saddened that the world has done so little about it,” said Richard Somerville, a meteorologist at the Scripps Institution of Oceanography and former author of the United Nations’ authoritative report on climate change.

“A lot of physical climate scientists, myself included, have in the past not been knowledgeable about what the social sciences have been saying,” he added. “People who know a lot about the science of communication … [are] on board now. But we just don’t see that reflected in the policy process.”

While not as outspoken as NASA’s James Hansen, who has taken a high-profile moral stand alongside groups like 350.org and Greenpeace, Somerville has been a leader in bringing scientists together to call for greenhouse gas reductions. He helped organize the 2007 Bali declaration, a pointed letter from more than 200 scientists urging negotiators to limit global CO2 levels well below 450 parts per million.

Such declarations, in the end, have done little, Somerville said.

“If you look at the effect this has had on the policy process, it is very, very small,” he said.

This failed influence has spurred scientists like Somerville to partner closely with social scientists, seeking to understand why their message has failed. It is an effort that received a seal of approval this spring, when the National Academy of Sciences, the nation’s premier research body, hosted a two-day meeting on the science of science communication. Many of those sessions pivoted on public views of climate change.

It’s a discussion that’s been long overdue. When it comes to how the public learns about expert opinions, assumptions mostly rule in the sciences, said Dan Kahan, a professor of law and psychology at Yale Law School.

“Scientists are filled with conjectures that are plausible about how people make sense about information,” Kahan said, “only some fraction of which [are] correct.”

Shifting dynamic

Krosnick’s work began with a simple, hypothetical scene: NASA’s Hansen, whose scientific work on climate change is widely respected, walks into the Oval Office.

As he has since the 1980s, Hansen rattles off the incontrovertible, ever-increasing evidence of human-caused climate change. It’s a stunning litany, authoritative in scope, and one the fictional president — be it a Bush or an Obama — must judge against Hansen’s scientific credentials, backed by publications and institutions of the highest order. If Hansen stops there, one might think, the case is made.

But he doesn’t stop. Hansen continues, arguing, as a citizen, for an immediate carbon tax.

“Whoa, there!” Krosnick’s president might think. “He’s crossed into my domain, and he’s out of touch with how policy works.” And if Hansen is willing to offer opinions where he lacks expertise, the president starts to wonder: “Can I trust any of his work?”

Richard Somerville
Part of Scripps’ legendary climate team — Charles David Keeling was an early mentor — Richard Somerville helped organize the 2007 Bali declaration by climate scientists, calling for government action on CO2 emissions. Photo by Sylvia Bal Somerville.

Researchers have studied the process of persuasion for 50 years, Krosnick said. Over that time, a few vital truths have emerged, including that trust in a source matters. But looking back over past work, Krosnick found no answer to this question. The treatment was simplistic. Messengers were either trustworthy or not. No one had considered the case of two messages, one trusted and one shaky, from the same person.

The advocacy of climate scientists provided an excellent path into this shifting dynamic.

Krosnick’s team hunted down video of climate scientists first discussing the science of climate change and then, in the same interview, calling for viewers to pressure the government to act on global warming. (Out of fears of bruised feelings, Krosnick won’t disclose the specific scientists cited.) They cut the video in two edits: one showing only the science, and one showing the science and then the call to arms.

Krosnick then showed a nationally representative sample of 793 Americans one of three videos: the science-only cut, the science-plus-advocacy cut, or a control video about baking meatloaf (the latter being closer to politics than Krosnick might admit). The viewers were then asked a series of questions both about their opinion of the scientist’s credibility and about their overall beliefs on global warming.

For a cohort of 548 respondents who either had a household income under $50,000 or no more than a high school diploma, the results were stunning and statistically significant. Across the board, the move into politics undermined the science.

The viewers’ trust in the scientist dropped 16 percentage points, from 48 to 32 percent. Their belief in the scientist’s accuracy fell from 47 to 36 percent. Their overall trust in all scientists went from 60 to 52 percent. Their belief that government should “do a lot” to stop warming fell from 62 to 49 percent. And their belief that humans have caused climate change fell 14 percentage points, from 81 to 67 percent.
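
As a rough illustration of why drops of that size clear the bar for statistical significance, a two-proportion z-test can be run on the trust numbers. The paper’s actual test and per-condition sample sizes aren’t given here; the sketch below assumes, hypothetically, that the 548 low-income/low-education respondents split roughly evenly across the three videos (~183 per condition):

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test: returns the z statistic and two-sided p-value."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Trust in the scientist: 48% (science-only cut) vs. 32% (science-plus-advocacy
# cut), with a hypothetical ~183 respondents per condition.
z, p = two_prop_z(0.48, 183, 0.32, 183)
print(round(z, 2), round(p, 4))
```

Even under these assumed cell sizes, a 16-point gap yields z above 3, comfortably significant at conventional thresholds.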

Krosnick is quick to note the study’s caveats. First, educated or wealthy viewers had no significant reaction to the political call and seemed able to parse the difference between science and a personal political view. The underlying reasons for the drop are far from clear, as well — it could simply be a function of climate change’s politicization. And far more testing needs to be done to see whether this applies in other contexts.

With further evidence, though, the implications could be widespread, Krosnick said.

“Is it the case that the principle might apply broadly?” he asked. “Absolutely.”

‘Fraught with misadventure’

Krosnick’s study is likely rigorous and useful — he is known for his careful methods — but it still carries with it a simple, possibly misleading frame, several scientists said.

Most of all, it remains hooked to a premise that words float straight from the scientist’s lips to the public’s ears. The idea that people learn from scientists at all or that they are simply misunderstanding scientific conclusions is not how reality works, Yale’s Kahan said.

“The thing that goes into the ear is fraught with misadventure,” he said.

Kahan has been at the forefront of charting how the empty-bucket theory of science communication — called the deficit model — fails. People interpret new information within the context of their own cultural beliefs, peers and politics. They use their reasoning to pick the evidence that supports their views, rather than the other way around. Indeed, recent work by Kahan found that higher-educated respondents were more likely to be polarized than their less-educated peers.

Krosnick’s study will surely spur new investigations, Kahan said, though he resisted definite remarks until he could see the final work. Even if the study’s conditions aren’t fully realistic, a simple model can have “plenty of implications for all kinds of ways [in] which people become exposed to science,” he said.

The survey sits well with other research in the field and carries an implication about what role scientists should play in scientific debates, added Matthew Nisbet, a communication professor at American University.

“As soon as you start talking about a policy option, you’re presenting information that is potentially threatening to people’s values or identity,” he said. The public, he added, doesn’t “view scientists and scientific information in a vacuum.”

The deficit model has remained an enduring frame for scientists, many of whom are just becoming aware of social science work on the problem. Kahan compares it to the stages of grief. The first stage was that the truth just needs to be broadcast to change minds. The second, and one still influential in the scientific world, is that if the message is just simplified, the right images used, then the deficit will be filled.

“That too, I think, is a stage of misperception about how this works,” Kahan said.

Take the hand-wringing about science education that accompanied a recent poll finding that 46 percent of Americans believed in a creationist origin for humans. It’s a result that speaks to belief, not an understanding of evolution. Many surveyed who believed in evolution would still fail to explain natural selection, mutation or genetic variance, Kahan said, just as they don’t have to understand relativity to use their GPS.

Much of science doesn’t run up against the public’s belief systems and is accepted with little fuss. It’s not as if Louis Pasteur had to sell pasteurization by using slick images of children getting sick; for nearly all of society, it was simply a useful tool. People want to defer to the experts, as long as they don’t have to concede their beliefs on the way.

“People know what’s known without having a comprehension of why that’s the truth,” Kahan said.

There remains a danger in the emerging consensus that all scientific knowledge is filtered by the motivated reasoning of political and cultural ideology, Nisbet added. Not all people can be sorted by two, or even four, variables.

“In the new ideological deficit model, we tend to assume that failures in communication are caused by conservative media and conservative psychology,” he said. “The danger in this model is that we define the public in exclusively binary terms, as liberals versus conservatives, deniers versus believers.”

‘Crossing that line’

So why do climate scientists, more than most fields, cross the line into advocacy?

Most of all, it’s because their scientific work tells them the problem is so pressing and time-dependent, given the centuries-long life span of CO2 emissions, Somerville said.

“You get to the point where the emissions are large enough that you’ve run out of options,” he said. “You can no longer limit [it]. … We may be at that point already.”

There may also be less friction for scientists in suggesting communal solutions to warming because, as Nisbet’s work has found, scientists tend to skew more liberal than the general population, with more than 50 percent of one U.S. science society self-identifying as “liberal.” Given this outlook, they are more likely to accept policies like cap and trade, which, in implying a “cap” on activity, rubbed conservatives the wrong way.

Dan Kahan
A prolific law professor and psychologist at Yale, Dan Kahan has been charting how the public comes to, and understands, science. Photo courtesy of Dan Kahan.

“Not a lot of scientists would question if this is an effective policy,” Nisbet said.

It is not that scientists are unaware that they are moving into policy prescription, either. Most would intuitively know the line between their work and its political implications.

“I think many are aware when they’re crossing that line,” said Roger Pielke Jr., an environmental studies professor at the University of Colorado, Boulder, “but they’re not aware of the consequences [of] doing so.”

This willingness to cross into advocacy could also stem from the fact that it is the next logical skirmish. The battle for public opinion on the reality of human-driven climate change is already over, Pielke said, “and it’s been won … by the people calling for action.”

While there are slight fluctuations in public belief, in general a large majority of Americans side with what scientists say about the existence and causes of climate change. It’s not unanimous, he said, but it’s larger than the numbers who supported actions like the Montreal Protocol, the bank bailout or the Iraq War.

What has shifted has been its politicization: as more Republicans have come to disbelieve global warming, Democrats have rallied to reinforce the science. And none of it is about the actual science, of course, a fact Scripps’ Somerville now understands. The stated positions are a code, standing in for fear of the policies that could follow if the science is accepted.

Doubters of warming don’t just hear the science. A policy is attached to it in their minds.

“Here’s a fact,” Pielke said. “And you have to change your entire lifestyle.”

For all the focus on how scientists talk to the public — whether Hansen has helped or hurt his cause — Yale’s Kahan ultimately thinks the discussion will mean very little. Ask most of the public who Hansen is, and they’ll mention something about the Muppets. It can be hard to accept, for scientists and journalists, but their efforts at communication are often of little consequence, he said.

“They’re not the primary source of information,” Kahan said.

‘A credible voice’

Like many of his peers, Somerville has suffered for his acts of advocacy.

“We all get hate email,” he said. “I’ve given congressional testimony and been denounced as an arrogant elitist hiding behind a discredited organization. Every time I’m on national news, I get a spike in ugly email. … I’ve received death threats.”

There are also pressures within the scientific community. As an elder statesman, Somerville does not have to worry about his career. But he tells young scientists to keep their heads down, working on technical papers. There is peer pressure to stay out of politics, a tension felt even by Somerville’s friend, the late Stephen Schneider of Stanford, who was long one of the country’s premier speakers on climate science.

He was publicly lauded, but many in the climate science community grumbled, Somerville said, that Schneider should “stop being a motormouth and start publishing technical papers.”

But there is a reason tradition has sustained the distinction between advising policymakers and picking solutions, one Krosnick’s work seems to ratify, said Michael Mann, a climatologist at Pennsylvania State University and a longtime target of climate contrarians.

“It is thoroughly appropriate, as a scientist, to discuss how our scientific understanding informs matters of policy, but … we should stop short of trying to prescribe policy,” Mann said. “This distinction is, in my view, absolutely critical.”

Somerville still supports the right of scientists to speak out as concerned citizens, as he has done, and as his friend, NASA’s Hansen, has done more stridently, protesting projects like the Keystone XL pipeline. As long as great care is taken to separate the facts from the political opinion, scientists should speak their minds.

“I don’t think being a scientist deprives you of the right to have a viewpoint,” he said.

Somerville often returns to a quote from the late Sherwood Rowland, a Nobel laureate from the University of California, Irvine, who discovered the threat chlorofluorocarbons posed to ozone: “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”

Somerville asked Rowland several times whether the same held for global warming.

“Yes, absolutely,” he replied.

It’s an argument that Krosnick has heard from his own friends in climate science. But often this fine distinction gets lost in translation, as advocacy groups present the scientist’s personal message as the message of “science.” It’s luring to offer advice — Krosnick feels it himself when reporters call — but restraint may need to rule.

“In order to preserve a credible voice in public dialogue,” Krosnick said, “it might be that scientists such as myself need to restrain ourselves [from] speaking as public citizens.”

Broader efforts of communication, beyond scientists, could still mobilize the public, Nisbet said. Leave aside the third of the population who are in denial or alarmed about climate change, he said, and figure out how to make it relevant to the ambivalent middle.

“We have yet to really do that on climate change,” he said.

Somerville is continuing his efforts to improve communication from scientists. Another Bali declaration is unlikely, though. What he’d really like to do is get trusted messengers from different moral realms beyond science — leaders like the Dalai Lama — to speak repeatedly on climate change.

It’s all Somerville can do. It would be too painful to accept the other option, that climate change is like racism, war or poverty — problems the world has never abolished.

“[It] may well be that it is a problem that is too difficult for humanity to solve,” he said.

Mapping the Future of Climate Change in Africa (Science Daily)

ScienceDaily (Aug. 2, 2012) — Our planet’s changing climate is devastating communities in Africa through droughts, floods and myriad other disasters.

Children in the foothills of Drakensberg mountains in South Africa who still live in traditional rondavels on family homesteads. (Credit: Todd G. Smith, CCAPS Program)

Using detailed regional climate models and geographic information systems, researchers with the Climate Change and African Political Stability (CCAPS) program developed an online mapping tool that analyzes how climate and other forces interact to threaten the security of African communities.

The program was piloted by the Robert S. Strauss Center for International Security and Law at The University of Texas at Austin in 2009 after receiving a $7.6 million five-year grant from the Minerva Initiative of the Department of Defense, according to Francis J. Gavin, professor of international affairs and director of the Strauss Center.

“The first goal was to look at whether we could more effectively identify what were the causes and locations of vulnerability in Africa, not just climate, but other kinds of vulnerability,” Gavin said.

CCAPS comprises nine research teams focusing on various aspects of climate change, their relationship to different types of conflict, the government structures that exist to mitigate them, and the effectiveness of international aid in intervening. Although most CCAPS researchers are based at The University of Texas at Austin, the Strauss Center also works closely with Trinity College Dublin, the College of William and Mary, and the University of North Texas.

“In the beginning these all began as related, but not intimately connected, topics,” Gavin said, “and one of the really impressive things about the project is how all these different streams have come together.”

Africa is particularly vulnerable to the effects of climate change due to its reliance on rain-fed agriculture and the inability of many of its governments to help communities in times of need.

The region is of increasing importance for U.S. national security, according to Gavin, because of the growth of its population, economic strength and resource importance, and also due to concerns about non-state actors, weakening governments and humanitarian disasters.

Although these issues are too complex to yield a direct causal link between climate change and security concerns, he said, understanding the levels of vulnerability that exist is crucial in comprehending the full effect of this changing paradigm.

The vulnerability mapping program within CCAPS is led by Joshua Busby, assistant professor at the Lyndon B. Johnson School of Public Affairs.

To determine the vulnerability of a given location based on changing climate conditions, Busby and his team looked at four different sources: 1) the degree of physical exposure to climate hazards, 2) population size, 3) household or community resilience, and 4) the quality of governance or presence of political violence.

The first source records the different types of climate hazards which could occur in the area, including droughts, floods, wildfires, storms and coastal inundation. However, their presence alone is not enough to qualify a region as vulnerable.

The second source — population size — determines the number of people who will be impacted by these climate hazards. More people create more demand for resources, potentially making the entire population more vulnerable.

The third source looks at how resilient a community is to adverse effects, analyzing the quality of their education and health, as well as whether they have easy access to food, water and health care.

“If exposure is really bad, it may exceed the capacity of local communities to protect themselves,” Busby said, “and then it comes down to whether or not the governments are going to be willing or able to help them.”

The final source accounts for the effectiveness of a given government, the amount of accountability present, how integrated it is with the international community, how politically stable it is, and whether there is any political violence present.

Busby and his team combined the four sources of vulnerability and gave them each equal weight, adding them together to form a composite map. Their scores were then divided into a ranking of five equal parts, or quintiles, going from the 20 percent of regions with the lowest vulnerability to the 20 percent with the highest.
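
A minimal sketch of that aggregation step — four equally weighted, normalized indicators per region, averaged into a composite and binned into quintiles. All region names and scores below are invented for illustration; the real CCAPS tool operates on gridded GIS layers, not a handful of rows:

```python
# Hypothetical mini-version of the CCAPS composite. Each region has four
# normalized (0-1) indicators: climate exposure, population, household
# resilience gap, and governance/political-violence gap.
regions = {
    "R01": (0.9, 0.7, 0.8, 0.6),
    "R02": (0.2, 0.4, 0.3, 0.1),
    "R03": (0.6, 0.5, 0.5, 0.7),
    "R04": (0.1, 0.2, 0.2, 0.3),
    "R05": (0.8, 0.9, 0.7, 0.8),
    "R06": (0.3, 0.3, 0.4, 0.2),
    "R07": (0.5, 0.6, 0.6, 0.5),
    "R08": (0.7, 0.4, 0.9, 0.9),
    "R09": (0.4, 0.1, 0.2, 0.4),
    "R10": (0.5, 0.8, 0.5, 0.6),
}

def composite(scores):
    """Equal-weight average of the four vulnerability indicators."""
    return sum(scores) / len(scores)

def quintile_rank(values):
    """Map each value to its quintile: 1 (least) to 5 (most vulnerable)."""
    order = sorted(values)
    n = len(values)
    return {v: min(5, order.index(v) * 5 // n + 1) for v in values}

comps = {name: composite(s) for name, s in regions.items()}
ranks = quintile_rank(list(comps.values()))
for name in sorted(comps, key=comps.get):
    print(name, round(comps[name], 2), "quintile", ranks[comps[name]])
```

Because the weights are equal, a region high on any one dimension can still land in a low quintile if the other three are benign — which is the point of the composite.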

The researchers gathered information for the tool from a variety of sources, including historic models of physical exposure from the United Nations Environment Programme (UNEP), population estimates from LandScan, as well as household surveys and governance assessments from the World Bank’s World Development and Worldwide Governance Indicators.

This data reflects past and present vulnerability, but to understand which places in Africa would be most vulnerable to future climate change, Busby and his team relied on the regional climate model simulations designed by Edward Vizy and Kerry Cook, both members of the CCAPS team from the Jackson School of Geosciences.

Vizy and Cook ran three 20-year nested simulations of the African continent’s climate at the regional scales of 90 and 30 kilometers, using a derivation of the Weather Research and Forecasting Model of the National Center for Atmospheric Research. One was a control simulation representative of the years 1989-2008, and the others represented the climate as it may exist in 2041-2060 and 2081-2100.

“We’re adjusting the control simulation’s CO2 concentration, model boundary conditions, and sea surface temperatures to increased greenhouse gas forcing scenario conditions derived from atmosphere-ocean global climate models. We re-run the simulation to understand how the climate will operate under a different, warmer state at spatial resolutions needed for regional impact analyses,” Vizy said.

Each simulation took two months to complete on the Ranger supercomputer at the Texas Advanced Computing Center (TACC).

“We couldn’t run these simulations without the high-performance computing resources at TACC, it would just take too long. If it takes two months running with 200 processors, I can’t fathom doing it with one processor,” Vizy said.

Researchers input data from these vulnerability maps into an online mapping tool developed by the CCAPS program to integrate its various lines of climate, conflict and aid research. CCAPS’s current mapping tool is based on a prototype developed by the team to assess conflict patterns in Africa with the help of researchers at the TACC/ACES Visualization Laboratory (Vislab), according to Ashley Moran, program manager of CCAPS.

“The mapping tool is a key part of our effort to produce new research that could support policy making and the work of practitioners and governments in Africa,” Moran said. “We want to communicate this research in ways that are of maximum use to policymakers and researchers.”

The initial prototype of the mapping tool used the ArcGIS platform to project data onto maps. Working with its partner Development Gateway, CCAPS expanded the system to incorporate conflict, vulnerability, governance and aid research data.

After completing the first version of their model, Busby and his team carried out the process of ground truthing their maps by visiting local officials and experts in several African countries, such as Kenya and South Africa.

“The experience of talking with local experts was tremendously gratifying,” Busby said. “They gave us confidence that the things we’re doing in a computer lab setting in Austin do pick up on some of the ground-level expert opinions.”

Busby and his team complemented their maps with local perspectives on the kind of impact climate was already having, leading to new insights that could help perfect the model. For example, local experts felt the model did not address areas with chronic water scarcity, an issue the researchers then corrected upon returning home.

According to Busby, the vulnerability maps serve as focal points which can give way to further analysis about the issues they illustrate.

Some of the countries most vulnerable to climate change include Somalia, Sierra Leone, Guinea, Sudan and parts of the Democratic Republic of Congo. Knowing this allows local policymakers to develop security strategies for the future, including early warning systems against floods, investments in drought-resistant agriculture, and alternative livelihoods that might facilitate resource sharing and help prevent future conflicts. The next iteration of the online mapping tool to be released later this year will also incorporate the future projections of climate exposure from the models developed by Vizy and Cook.

The CCAPS team publishes their research in journals like Climate Dynamics and The International Studies Review, carries out regular consultations with the U.S. government and governments in Africa, and participates in conferences sponsored by concerned organizations, such as the United Nations and the United States Africa Command.

“What this project has showed us is that many of the real challenges of the 21st century aren’t always in traditional state-to-state interactions, but are transnational in nature and require new ways of dealing with [them],” Gavin said.

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.
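
A toy version of that model-free fitting logic can make the “best match” claim concrete. The data below are synthetic stand-ins (a smooth CO2-like rise, an 11-year solar cycle, and a temperature series built by construction from the CO2 curve plus noise); the actual Berkeley Earth analysis uses the full 1753-to-present land record and far richer statistics:

```python
import math
import random

random.seed(0)

# Synthetic stand-ins for the three records.
years = range(100)
co2 = [280.0 * math.exp(0.003 * t) for t in years]                     # smooth rise
sunspots = [50.0 + 40.0 * math.sin(2.0 * math.pi * t / 11.0) for t in years]  # 11-yr cycle
temp = [0.01 * (c - 280.0) + random.gauss(0.0, 0.05) for c in co2]     # CO2-driven + noise

def fit_ssr(x, y):
    """Least-squares fit y = a + b*x; return the sum of squared residuals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# The candidate driver that best explains the temperature series leaves the
# smallest residual -- here CO2, because temp was built from it.
print(fit_ssr(co2, temp) < fit_ssr(sunspots, temp))
```

The Muller team’s “raised bar” amounts to this comparison done carefully: any alternative explanation must leave residuals at least as small as the CO2 curve does.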

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.

Can scientists in California end the war on climate change?
Study finds no grounds for climate sceptics’ concerns
Video: Berkeley Earth tracks climate change
Are climate sceptics more likely to be conspiracy theorists?

The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis are now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

*   *   *

Climate Change and the Next U.S. Revolution (ZNet)

Thursday, July 26, 2012

The U.S. heat wave is slowly shaking the foundations of American politics. It may take years for the deep rumble to evolve into an above ground, institution-shattering earthquake, but U.S. society has changed for good.

The heat wave has helped convince tens of millions of Americans that climate change is real, overpowering the corporate-funded fake science and right-wing media campaigns meant to convince them otherwise.

Republicans and Democrats alike also erect roadblocks to understanding climate change. The politicians’ complete lack of action on the issue strengthened the “climate change is fake” movement, since Americans presumed that any sane government would be actively trying to address a problem with the potential to destroy civilization.

But working people have finally made up their mind. A recent poll showed that 70 percent of Americans now believe that climate change is real, up from 52 percent in 2010. And a growing number of people are recognizing that the warming of the planet is caused by human activity.

Business Week explains: “A record heat wave, drought and catastrophic wildfires are accomplishing what climate scientists could not: convincing a wide swath of Americans that global temperatures are rising.”

This means that working class families throughout the Midwest and southern states simply don’t believe what their media and politicians are telling them.

It also implies that these millions of Americans are being further politicized in a deeper sense.

Believing that climate change exists implies that you are somewhat aware about the massive consequences to humanity if the global economy doesn’t drastically change, and fast.

This awareness has revolutionary implications. As millions of Americans watch the environment destroyed – for their grandchildren or themselves – while politicians do absolutely nothing in response, or make tiny token gestures – a growing number of Americans will demand political alternatives, and fight to see them created. The American political system as it exists today cannot cope with this inevitable happening.

The New York Times explains why: “…the American political system is not ready to agree to a [climate] treaty that would force the United States, over time, to accept profound changes in its energy [coal, oil], transport [trucking and airline industry] and manufacturing [corporate] sectors.”

In short, the U.S. government will not force corporations to make less profit by behaving more eco-friendly. This is the essence of the problem.

In order for humanity to survive climate change, the economy must be radically transformed; massive investments must be made in renewable energy, public transportation, and recycling, while dirty energy sources must be quickly swept into the dustbin of history.

But the economy is currently owned by giant, privately run corporations that will continue destroying the earth if doing so earns them huge profits, and they make massive “contributions” to political parties to ensure this remains so. It’s becoming increasingly obvious that government inaction on climate change is directly linked to the “special interests” of the corporations that dominate these governments.

This fact of U.S. politics is present in every other capitalist country as well, which means that international agreements on reducing greenhouse gasses will remain impossible: as each country’s corporations vie for market domination, reducing pollution simply puts them at a competitive disadvantage.

This dynamic has already caused massive delays in the UN’s already inadequate efforts at addressing climate change. The Kyoto climate agreement was the by-product of years of cooperation and planning between many nations that included legally binding agreements to reduce greenhouse gasses. The Bush and Obama administrations helped destroy these efforts.

For example, instead of building upon the foundation of the Kyoto Protocol, the Obama administration demanded a whole new structure, something that would take years to achieve. The Kyoto framework (itself insufficient) was abandoned because it included legally binding agreements, and was based on multilateral, agreed-upon reductions of greenhouse gasses.

In an article by the Guardian entitled “US Planning to Weaken Copenhagen Climate Deal,” the Obama administration’s UN position is exposed: it dismisses the Kyoto Protocol by proposing that “…each country set its own rules and to decide unilaterally how to meet its target.”

Obama’s proposal came straight from the mouth of U.S. corporations, who wanted to ensure that there was zero accountability, zero oversight, zero climate progress, and therefore no dent to their profits. Instead of using its massive international leverage for climate justice, the U.S. has used it to promote divisiveness and inaction, to the potential detriment of billions of people globally.

The stakes are too high to hold out any hope that governments will act boldly. The Business Week article below explains the profound changes happening to the climate:

“The average temperature for the U.S. during June was 71.2 degrees Fahrenheit (21.7 Celsius), which is 2 degrees higher than the average for the 20th century, according to the National Oceanic and Atmospheric Administration. The June temperatures made the preceding 12 months the warmest since record-keeping began in 1895, the government agency said.”

Activists who are radicalized by this global problem face a crisis of what to do about it. It is difficult to put forth a positive climate change demand, since the problem is global.  Demanding that governments “act boldly” to address climate change hasn’t worked, and lesser demands seem inadequate.

The environmental rights movement continues to go through a variety of phases: individual and small group eco-“terrorism,” causing property damage to environmentally damaging companies; corporate campaigns that target especially bad polluters with high-profile direct action; and massive education programs that have been highly successful, but fall short when it comes to winning change.

Ultimately, climate activists must come face to face with political and corporate power. Corporate-owned governments are the ones with the power to adequately address the climate change issue, and they will not be swayed by good science, common sense, basic decency, or even a torched planet.

Those in power only respond to power, and the only power capable of displacing corporate power is when people unite and act collectively, as was done in Egypt, Tunisia, and is still developing throughout Europe.

Climate groups cannot view their cause as separate from that of other groups organizing against corporate power. The social movements that have emerged to battle austerity measures are natural allies, as are anti-war and labor activists. The climate solution will inevitably require revolutionary measures, which first requires that alliances and demands be put forward that unite Labor, working people in general, community groups, and student groups in collective action.

One possible immediate demand is for environmental activists to unite with Labor groups over a federal jobs program, paid for by taxing the rich, that makes massive investments in jobs that are climate related, such as solar panel production, transportation, building recycling centers, home retro-fitting, etc.

Another demand could be to insist that the government convene the most knowledgeable scientists in the area of clean energy. These scientists should be given all the resources they need in order to collectively create alternative sources of clean energy that would allow for a realistic alternative to the current polluting and toxic sources of energy.

However, any type of immediate demand will meet giant corporate resistance from both political parties. Fighting for a uniting demand will thus strengthen the movement, and for this reason it is important to link climate solutions to the creation of jobs, which are the number one concern of most Americans. This unity will in turn lead allies toward a deeper understanding of the problem, and therefore deeper solutions will emerge that challenge the whole economic structure that is deaf to the needs of humans and the climate and sacrifices everything to the private profit of a few.

Shamus Cooke is a social service worker, trade unionist, and writer for Workers Action (www.workerscompass.org). He can be reached at shamuscooke@gmail.com

http://www.businessweek.com/news/2012-07-18/record-heat-wave-pushes-u-dot-s-dot-belief-in-climate-change-to-70-percent

http://www.nytimes.com/2009/12/13/weekinreview/13broder.html

http://www.guardian.co.uk/environment/2009/sep/15/europe-us-copenhagen