Tag archive: Uncertainty

Calls for doomsday remain unheeded (Washington Post)

By George Will

11:15 PM, Aug 20, 2012

WASHINGTON — Sometimes the news is that something was not newsworthy. The United Nations’ Rio+20 conference — 50,000 participants from 188 nations — occurred in June, without consequences. A generation has passed since the 1992 Earth Summit in Rio, which begat other conferences and protocols (e.g., Kyoto). And, by now, apocalypse fatigue has set in — boredom from being repeatedly told the end is nigh.

This began two generations ago, in 1972, when we were warned (by computer models developed at MIT) that we were doomed. We were supposed to be pretty much extinct by now, or at least miserable. We are neither. So, what went wrong?

That year begat “The Limits to Growth,” a book from the Club of Rome, which called itself “a project on the predicament of mankind.” It sold 12 million copies, staggered The New York Times (“one of the most important documents of our age”) and argued that economic growth was doomed by intractable scarcities. Bjorn Lomborg, the Danish academic and “skeptical environmentalist,” writing in Foreign Affairs, says it “helped send the world down a path of worrying obsessively about misguided remedies for minor problems while ignoring much greater concerns,” such as poverty, which only economic growth can ameliorate.

MIT’s models foresaw the collapse of civilization because of “nonrenewable resource depletion” and population growth. “In an age more innocent of and reverential toward computers,” Lomborg writes, “the reams of cool printouts gave the book’s argument an air of scientific authority and inevitability” that “seemed to banish any possibility of disagreement.” Then — as now, regarding climate change — respect for science was said to require reverential suspension of skepticism about scientific hypotheses. Time magazine’s story about “The Limits to Growth” exemplified the media’s frisson of hysteria:

“The furnaces of Pittsburgh are cold; the assembly lines of Detroit are still. In Los Angeles, a few gaunt survivors of a plague desperately till freeway center strips … Fantastic? No, only grim inevitability if society continues its present dedication to growth and ‘progress.’”

The modelers examined 19 commodities and said 12 would be gone long before now — aluminum, copper, gold, lead, mercury, molybdenum, natural gas, oil, silver, tin, tungsten and zinc. Lomborg says:

Technological innovations have replaced mercury in batteries, dental fillings and thermometers; mercury consumption is down 98 percent, and its price was down 90 percent by 2000. Since 1970, when gold reserves were estimated at 10,980 tons, 81,410 tons have been mined and estimated reserves are 51,000 tons. Since 1970, when known reserves of copper were 280 million tons, about 400 million tons have been produced globally and reserves are estimated at almost 700 million tons. Aluminum consumption has increased 16-fold since 1950, the world has consumed four times the 1950 known reserves, and known reserves could sustain current consumption for 177 years. Potential U.S. gas resources have doubled in the last six years. And so on.

The modelers missed something — human ingenuity in discovering, extracting and innovating. Which did not just appear after 1972.

Aluminum, Lomborg writes, is one of earth’s most common metals. But until the 1886 invention of the Hall-Heroult process, it was so difficult and expensive to extract that “Napoleon III had bars of aluminum exhibited alongside the French crown jewels, and he gave his honored guests aluminum forks and spoons while lesser visitors had to make do with gold utensils.”

Forty years after “The Limits to Growth” imparted momentum to environmentalism, that impulse now is often reduced to children indoctrinated to “reduce, reuse, and recycle.” Lomborg calls recycling “a feel-good gesture that provides little environmental benefit at a significant cost.” He says “we pay tribute to the pagan god of token environmentalism by spending countless hours sorting, storing and collecting used paper, which, when combined with government subsidies, yields slightly lower-quality paper in order to secure a resource” — forests — “that was never threatened in the first place.”

In 1980, economist Julian Simon made a wager in the form of a complex futures contract. He bet Paul Ehrlich (whose 1968 book “The Population Bomb” predicted “hundreds of millions of people” would starve to death in the 1970s as population growth swamped agricultural production) that by 1990 the price of any five commodities Ehrlich and his advisers picked would be lower than in 1980. Ehrlich’s group picked five metals. All were cheaper in 1990.

The bet cost Ehrlich $576.07. But that year he was awarded a $345,000 MacArthur Foundation “genius” grant and half of the $240,000 Crafoord Prize for ecological virtue. One of Ehrlich’s advisers, John Holdren, is President Barack Obama’s science adviser.

George F. Will writes about foreign and domestic politics and policy for the Washington Post Writers Group. Email: georgewill@washpost.com.

The Role of Genes in Political Behavior (Science Daily)

ScienceDaily (Aug. 27, 2012) — Politics and genetics have traditionally been considered non-overlapping fields, but over the past decade it has become clear that genes can influence political behavior, according to a review published online August 27th in Trends in Genetics. This paradigm shift has led to novel insights into why people vary in their political preferences and could have important implications for public policy.

“We’re seeing an awakening in the social sciences, and the wall that divided politics and genetics is really starting to fall apart,” says review author Peter Hatemi of the University of Sydney. “This is a big advance, because the two fields could inform each other to answer some very complex questions about individual differences in political views.”

In the past, social scientists had assumed that political preferences were shaped by social learning and environmental factors, but recent studies suggest that genes also strongly influence political traits. Twin studies show that genes have some influence on why people differ on political issues such as the death penalty, unemployment and abortion. Because this field of research is relatively new, only a handful of genes have been implicated in political ideology and partisanship, voter turnout, and political violence.

Future research, including gene-expression and sequencing studies, may lead to deeper insights into genetic influences on political views and have a greater impact on public policy. “Making the public aware of how their mind works and affects their political behavior is critically important,” Hatemi says. “This has real implications for the reduction of discrimination, foreign policy, public health, attitude change and many other political issues.”

Journal Reference:

  1. Peter K. Hatemi and Rose McDermott. The Genetics of Politics: Discovery, Challenges and Progress. Trends in Genetics, August 27, 2012. DOI: 10.1016/j.tig.2012.07.004

The Effects of Discrimination Could Last a Lifetime (Science Daily)

ScienceDaily (Aug. 27, 2012) — Increased levels of depression as a result of discrimination could contribute to low birth weight babies.

Given the well-documented relationship between low birth weight and the increased risk of health problems throughout one’s lifespan, it is vital to reduce any potential contributors to low birth weight.  A new study by Valerie Earnshaw and her colleagues from Yale University sheds light on one possible causal factor.  Their findings, published online in Springer’s journal, the Annals of Behavioral Medicine, suggest that chronic, everyday instances of discrimination against pregnant, urban women of color may play a significant role in contributing to low birth weight babies.

Twice as many black women give birth to low birth weight babies as white or Latina women in the U.S.  Reasons for this disparity are, as yet, unclear. But initial evidence suggests a link may exist between discrimination experienced while pregnant and the incidence of low birth weight.  In addition, experiences of discrimination have also been linked to depression, which causes physiological changes that can have a negative effect on a pregnancy.

Earnshaw and her colleagues interviewed 420 black and Latina women, aged 14 to 21, at 14 community health centers and hospitals in New York during the second and third trimesters of their pregnancies, and at six and 12 months after their babies had been born.  They measured the women’s reported experiences of discrimination.  They also measured their depressive symptoms, pregnancy distress and pregnancy symptoms.

Levels of everyday discrimination reported were generally low.  However, the impact of discrimination was the same in all the participants regardless of age, ethnicity or type of discrimination reported.  Women reporting greater levels of discrimination were more prone to depressive symptoms, and ultimately went on to have babies with lower birth weights than those reporting lower levels of discrimination.  This has implications for healthcare providers who work with pregnant teens and young women during the prenatal period, while they still have the opportunity to try to reduce the potential impacts of discrimination on the pregnancy.

The authors conclude that “Given the associations between birth weight and health across the life span, it is critical to reduce discrimination directed at urban youth of color so that all children are able to begin life with greater promise for health.  In doing so, we have the possibility to eliminate disparities not only in birth weight, but in health outcomes across the lifespan.”

Data for this study came from the Centering Pregnancy Plus project, funded by the National Institute of Mental Health, and conducted in collaboration with Clinical Directors’ Network and the Centering Healthcare Institute.

Journal Reference:

  1. Valerie A. Earnshaw, Lisa Rosenthal, Jessica B. Lewis, Emily C. Stasko, Jonathan N. Tobin, Tené T. Lewis, Allecia E. Reid, Jeannette R. Ickovics. Maternal Experiences with Everyday Discrimination and Infant Birth Weight: A Test of Mediators and Moderators Among Young, Urban Women of Color. Annals of Behavioral Medicine, 2012. DOI: 10.1007/s12160-012-9404-3

Information Overload in the Era of ‘Big Data’ (Science Daily)

ScienceDaily (Aug. 20, 2012) — Botany is plagued by the same problem as the rest of science and society: our ability to generate data quickly and cheaply is surpassing our ability to access and analyze it. In this age of big data, scientists facing too much information rely on computers to search large data sets for patterns that are beyond the capability of humans to recognize — but computers can only interpret data based on the strict set of rules in their programming.

New tools called ontologies provide the rules computers need to transform information into knowledge, by attaching meaning to data, thereby making those data retrievable by computers and more understandable to human beings. Ontology, from the Greek word for the study of being or existence, traditionally falls within the purview of philosophy, but the term is now used by computer and information scientists to describe a strategy for representing knowledge in a consistent fashion. An ontology in this contemporary sense is a description of the types of entities within a given domain and the relationships among them.

A new article in this month’s American Journal of Botany by Ramona Walls (New York Botanical Garden) and colleagues describes how scientists build ontologies such as the Plant Ontology (PO) and how these tools can transform plant science by facilitating new ways of gathering and exploring data.

When data from many divergent sources, such as data about some specific plant organ, are associated or “tagged” with particular terms from a single ontology or set of interrelated ontologies, the data become easier to find, and computers can use the logical relationships in the ontologies to correctly combine the information from the different databases. Moreover, computers can also use ontologies to aggregate data associated with the different subclasses or parts of entities.

For example, suppose a researcher is searching online for all examples of gene expression in a leaf. Any botanist performing this search would include experiments that described gene expression in petioles and midribs or in a frond. However, a search engine would not know that it needs to include these terms in its search — unless it was told that a frond is a type of leaf, and that every petiole and every midrib is part of some leaf. It is this information that ontologies provide.
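To make the idea concrete, here is a minimal sketch, with toy terms and made-up annotations rather than the actual Plant Ontology, of how “is_a” and “part_of” relations let a computer expand a query for “leaf” so that it also retrieves data tagged with “frond,” “petiole” or “midrib”:

```python
# Toy sketch of ontology-based query expansion; the terms and relations
# are illustrative, not taken from the real Plant Ontology.
IS_A = {"frond": "leaf"}                          # a frond is a type of leaf
PART_OF = {"petiole": "leaf", "midrib": "leaf"}   # parts of some leaf

def expand_query(term):
    """Return every term whose annotations should satisfy a query for `term`."""
    matches = {term}
    matches |= {child for child, parent in IS_A.items() if parent == term}
    matches |= {part for part, whole in PART_OF.items() if whole == term}
    return matches

# Hypothetical annotations: experiment id -> the plant structure it was tagged with.
annotations = {"exp1": "petiole", "exp2": "frond", "exp3": "root", "exp4": "leaf"}

wanted = expand_query("leaf")
hits = [exp for exp, structure in annotations.items() if structure in wanted]
print(hits)  # ['exp1', 'exp2', 'exp4'] -- the root experiment is excluded
```

A real ontology would also follow these relations transitively and distinguish many more relation types; the point here is only that the logical links, once stated, let software find data a human expert would have known to include.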

The article in the American Journal of Botany by Walls and colleagues describes what ontologies are, why they are relevant to plant science, and some of the basic principles of ontology development. It includes an overview of the ontologies that are relevant to botany, with a more detailed description of the PO and the challenges of building an ontology that covers all green plants. The article also describes four key areas of plant science that could benefit from the use of ontologies: (1) comparative genetics, genomics, phenomics, and development; (2) taxonomy and systematics; (3) semantic applications; and (4) education. Although most of the examples in this article are drawn from plant science, the principles could apply to any group of organisms, and the article should be of interest to zoologists as well.

As genomic and phenomic data become available for more species, many different research groups are embarking on the annotation of their data and images with ontology terms. At the same time, cross-species queries are becoming more common, causing more researchers in plant science to turn to ontologies. Ontology developers are working with the scientists who generate data to make sure ontologies accurately reflect current science, and with database developers and publishers to find ways to make it easier for scientists to associate their data with ontologies.

Journal Reference:

R. L. Walls, B. Athreya, L. Cooper, J. Elser, M. A. Gandolfo, P. Jaiswal, C. J. Mungall, J. Preece, S. Rensing, B. Smith, D. W. Stevenson. Ontologies as integrative tools for plant science. American Journal of Botany, 2012; 99 (8): 1263. DOI: 10.3732/ajb.1200222

Politics and Prejudice Explored (Science Daily)

ScienceDaily (Aug. 20, 2012) — Research has associated political conservatism with prejudice toward various stereotyped groups. But research has also shown that people select and interpret evidence consistent with their own pre-existing attitudes and ideologies. In this article, Chambers and colleagues hypothesized that, contrary to what some research might indicate, prejudice is not restricted to a particular political ideology.

Rather, the conflicting values of liberals and conservatives give rise to different kinds of prejudice, with each group favoring other social groups that share their values. In the first study, three diverse groups of participants rated the ideological position and their overall impression of 34 different target groups.

Participants’ impressions fell in line with their ideology. For example, conservatives expressed more prejudice than liberals against groups that were identified as liberal (e.g., African-Americans, homosexuals), but less prejudice against groups identified as conservative (e.g., Christian fundamentalists, business people).

In the second and third studies, participants were presented with 6 divisive political issues and descriptions of racially diverse target persons for each issue. Neither liberals’ nor conservatives’ impressions of the target persons were affected by the race of the target, but both were strongly influenced by the target’s political attitudes.

From these findings the researchers conclude that prejudices commonly linked with ideology are most likely derived from perceived ideological differences and not from other characteristics like racial tolerance or intolerance.

Journal Reference:

J. B. Luguri, J. L. Napier, J. F. Dovidio. Reconstruing Intolerance: Abstract Thinking Reduces Conservatives’ Prejudice Against Nonnormative Groups. Psychological Science, 2012; 23 (7): 756. DOI: 10.1177/0956797611433877

 

*   *   *

Prejudice Comes from a Basic Human Need and Way of Thinking, New Research Suggests

ScienceDaily (Dec. 21, 2011) — Where does prejudice come from? Not from ideology, say the authors of a new paper. Instead, prejudice stems from a deeper psychological need, associated with a particular way of thinking. People who aren’t comfortable with ambiguity and want to make quick and firm decisions are also prone to making generalizations about others.

In a new article published in Current Directions in Psychological Science, a journal of the Association for Psychological Science, Arne Roets and Alain Van Hiel of Ghent University in Belgium look at what psychological scientists have learned about prejudice since the 1954 publication of an influential book, The Nature of Prejudice by Gordon Allport.

People who are prejudiced feel a much stronger need to make quick and firm judgments and decisions in order to reduce ambiguity. “Of course, everyone has to make decisions, but some people really hate uncertainty and therefore quickly rely on the most obvious information, often the first information they come across, to reduce it,” Roets says. That’s also why they favor authorities and social norms, which make it easier to make decisions. Then, once they’ve made up their mind, they stick to it. “If you provide information that contradicts their decision, they just ignore it.”

Roets argues that this way of thinking is linked to people’s need to categorize the world, often unconsciously. “When we meet someone, we immediately see that person as being male or female, young or old, black or white, without really being aware of this categorization,” he says. “Social categories are useful to reduce complexity, but the problem is that we also assign some properties to these categories. This can lead to prejudice and stereotyping.”

People who need to make quick judgments will judge a new person based on what they already believe about their category. “The easiest and fastest way to judge is to say, for example, ok, this person is a black man. If you just use your ideas about what black men are generally like, that’s an easy way to have an opinion of that person,” Roets says. “You say, ‘he’s part of this group, so he’s probably like this.'”

It’s virtually impossible to change the basic way that people think. Now for the good news: it’s possible to use this same way of thinking to reduce people’s prejudice. If people who need quick answers meet people from other groups and like them personally, they are likely to use this positive experience to form their views of the whole group. “This is very much about salient positive information taking away the aversion, anxiety, and fear of the unknown,” Roets says.

Roets’s conclusions suggest that the fundamental source of prejudice is not ideology, but rather a basic human need and way of thinking. “It really makes us think differently about how people become prejudiced or why people are prejudiced,” Roets says. “To reduce prejudice, we first have to acknowledge that it often satisfies some basic need to have quick answers and stable knowledge people rely on to make sense of the world.”

Journal Reference:

Arne Roets and Alain Van Hiel. Allport’s Prejudiced Personality Today: Need for Closure as the Motivated Cognitive Basis of Prejudice. Current Directions in Psychological Science (in press).

 

*   *   *

Ironic Effects of Anti-Prejudice Messages

ScienceDaily (July 7, 2011) — Organizations and programs have been set up all over the globe in the hopes of urging people to end prejudice. According to a research article, which will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science, such programs may actually increase prejudices.

Lisa Legault, Jennifer Gutsell and Michael Inzlicht, from the University of Toronto Scarborough, were interested in exploring how people’s everyday environment influences their motivation toward prejudice reduction.

The authors conducted two experiments which looked at the effect of two different types of motivational intervention — a controlled form (telling people what they should do) and a more personal form (explaining why being non-prejudiced is enjoyable and personally valuable).

In experiment one, participants were randomly assigned one of two brochures to read: an autonomy brochure or a controlling brochure. These brochures discussed a new campus initiative to reduce prejudice. A third group was offered no motivational instructions to reduce prejudice. The authors found that, ironically, those who read the controlling brochure later demonstrated more prejudice than those who had not been urged to reduce prejudice. Those who read the brochure designed to support personal motivation showed less prejudice than those in the other two groups.

In experiment two, participants were randomly assigned a questionnaire, designed to stimulate personal or controlling motivation to reduce prejudice. The authors found that those who were exposed to controlling messages regarding prejudice reduction showed significantly more prejudice than those who did not receive any controlling cues.

The authors suggest that when interventions eliminate people’s freedom to value diversity on their own terms, they may actually be creating hostility toward the targets of prejudice.

According to Dr. Legault, “Controlling prejudice reduction practices are tempting because they are quick and easy to implement. They tell people how they should think and behave and stress the negative consequences of failing to think and behave in desirable ways.” Legault continues, “But people need to feel that they are freely choosing to be nonprejudiced, rather than having it forced upon them.”

Legault stresses the need to focus less on the requirement to reduce prejudices and start focusing more on the reasons why diversity and equality are important and beneficial to both majority and minority group members.

Story Source:

The above story is reprinted from materials provided by the Association for Psychological Science, via EurekAlert!, a service of AAAS.

Extreme Weather Linked to Global Warming, Nobel Prize-Winning Scientist Says (Science Daily)

New scientific analysis strengthens the view that record-breaking summer heat, crop-withering drought and other extreme weather events in recent years do, indeed, result from human activity and global warming, Nobel Laureate Mario J. Molina, Ph.D., explains. (Credit: NASA Goddard Space Flight Center image by Reto Stöckli; enhancements by Robert Simmon; data and technical support from the MODIS science teams; additional data from the USGS EROS Data Center, the USGS Flagstaff Field Center and the Defense Meteorological Satellite Program.)

ScienceDaily (Aug. 20, 2012) — New scientific analysis strengthens the view that record-breaking summer heat, crop-withering drought and other extreme weather events in recent years do, indeed, result from human activity and global warming, Nobel Laureate Mario J. Molina, Ph.D., said at a conference in Philadelphia on August 20.

Molina, who shared the 1995 Nobel Prize in Chemistry for helping save the world from the consequences of ozone depletion, presented the keynote address at the 244th National Meeting & Exposition of the American Chemical Society.

“People may not be aware that important changes have occurred in the scientific understanding of the extreme weather events that are in the headlines,” Molina said. “They are now more clearly connected to human activities, such as the release of carbon dioxide ― the main greenhouse gas ― from burning coal and other fossil fuels.”

Molina emphasized that there is no “absolute certainty” that global warming is causing extreme weather events. But he said that scientific insights during the last year or so strengthen the link. Even if the scientific evidence continues to fall short of absolute certainty, the heat, drought, severe storms and other weather extremes may prove beneficial in making the public more aware of global warming and the need for action, said Molina.

“It’s important that people are doing more than just hearing about global warming,” he said. “People may be feeling it, experiencing the impact on food prices, getting a glimpse of what everyday life may be like in the future, unless we as a society take action.”

Molina, who is with the University of California, San Diego, suggested a course of action based on an international agreement like the Montreal Protocol that phased out substances responsible for the depletion of the ozone layer.

“The new agreement should put a price on the emission of greenhouse gases, which would make it more economically favorable for countries to do the right thing. The cost to society of abiding by it would be less than the cost of the climate change damage if society does nothing,” he said.

In the 1970s and 1980s, Molina, F. Sherwood Rowland, Ph.D., and Paul J. Crutzen, Ph.D., established that substances called CFCs in aerosol spray cans and other products could destroy the ozone layer. The ozone layer is crucial to life on Earth, forming a protective shield high in the atmosphere that blocks potentially harmful ultraviolet rays in sunlight. Molina, Rowland and Crutzen shared the Nobel Prize for that research. After a “hole” in that layer over Antarctica was discovered in 1985, scientists established that it was indeed caused by CFCs, and worked together with policymakers and industry representatives around the world to solve the problem. The result was the Montreal Protocol, which phased out the use of CFCs in 1996.

Adopted and implemented by countries around the world, the Montreal Protocol eliminated the major cause of ozone depletion, said Molina, and stands as one of the most successful international agreements. Similar agreements, such as the Kyoto Protocol, have been proposed to address climate change. But Molina said these agreements have largely failed.

Unlike the ozone depletion problem, climate change has become highly politicized and polarizing, he pointed out. Only a small set of substances were involved in ozone depletion, and it was relatively easy to get the small number of stakeholders on the same page. But the climate change topic has exploded. “Climate change is a much more pervasive issue,” he explained. “Fossil fuels, which are at the center of the problem, are so important for the economy, and it affects so many other activities. That makes climate change much more difficult to deal with than the ozone issue.”

In addition to a new international agreement, other things must happen, he said. Scientists need to better communicate the scientific facts underlying climate change. Scientists and engineers also must develop cheap alternative energy sources to reduce dependence on fossil fuels.

Molina said that it’s not certain what will happen to Earth if nothing is done to slow down or halt climate change. “But there is no doubt that the risk is very large, and we could have some consequences that are very damaging, certainly for portions of society,” he said. “It’s not very likely, but there is some possibility that we would have catastrophes.”

Cloud Brightening to Control Global Warming? Geoengineers Propose an Experiment (Science Daily)

A conceptualized image of an unmanned, wind-powered, remotely controlled ship that could be used to implement cloud brightening. (Credit: John McNeill)

ScienceDaily (Aug. 20, 2012) — Even though it sounds like science fiction, researchers are taking a second look at a controversial idea that uses futuristic ships to shoot salt water high into the sky over the oceans, creating clouds that reflect sunlight and thus counter global warming.

University of Washington atmospheric physicist Rob Wood describes a possible way to run an experiment to test the concept on a small scale in a comprehensive paper published this month in the journal Philosophical Transactions of the Royal Society.

The point of the paper — which includes updates on the latest study into what kind of ship would be best to spray the salt water into the sky, how large the water droplets should be and the potential climatological impacts — is to encourage more scientists to consider the idea of marine cloud brightening and even poke holes in it. In the paper, he and a colleague detail an experiment to test the concept.

“What we’re trying to do is make the case that this is a beneficial experiment to do,” Wood said. With enough interest in cloud brightening from the scientific community, funding for an experiment may become possible, he said.

The theory behind so-called marine cloud brightening is that adding particles, in this case sea salt, to the sky over the ocean would form large, long-lived clouds. Clouds appear when water forms around particles. Since there is a limited amount of water in the air, adding more particles creates more, but smaller, droplets.

“It turns out that a greater number of smaller drops has a greater surface area, so it means the clouds reflect a greater amount of light back into space,” Wood said. That creates a cooling effect on Earth.
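A back-of-the-envelope calculation, assuming a fixed amount of liquid water shared evenly among identical spherical droplets (a simplification, not the model used in the paper), shows why this happens: the total droplet surface grows as the cube root of the droplet count.

```python
# Rough illustration: fixed liquid water split among N equal spherical droplets.
# Total cross-sectional area scales as N ** (1/3), so more, smaller droplets
# present more surface to scatter sunlight. Not the paper's cloud model.
import math

def total_cross_section(liquid_water_volume, n_droplets):
    """Total geometric cross-section of N equal spheres holding a fixed volume."""
    droplet_volume = liquid_water_volume / n_droplets
    radius = (3.0 * droplet_volume / (4.0 * math.pi)) ** (1.0 / 3.0)
    return n_droplets * math.pi * radius ** 2

base = total_cross_section(liquid_water_volume=1.0, n_droplets=100)
doubled = total_cross_section(liquid_water_volume=1.0, n_droplets=200)
print(f"doubling the droplet count multiplies the reflecting area by {doubled / base:.2f}")
# prints 1.26, i.e. 2 ** (1/3): same water, roughly a quarter more scattering surface
```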

Marine cloud brightening is part of a broader concept known as geoengineering, which encompasses efforts to use technology to manipulate the environment. Brightening, like other geoengineering proposals, is controversial for its ethical and political ramifications and the uncertainty around its impact. But those aren’t reasons not to study it, Wood said.

“I would rather that responsible scientists test the idea than groups that might have a vested interest in proving its success,” he said. The danger with private organizations experimenting with geoengineering is that “there is an assumption that it’s got to work,” he said.

Wood and his colleagues propose trying a small-scale experiment to test feasibility and begin to study effects. The test should start by deploying sprayers on a ship or barge to ensure that they can inject enough particles of the targeted size to the appropriate elevation, Wood and a colleague wrote in the report. An airplane equipped with sensors would study the physical and chemical characteristics of the particles and how they disperse.

The next step would be to use additional airplanes to study how the cloud develops and how long it remains. The final phase of the experiment would send out five to 10 ships spread out across a 100-kilometer (62-mile) stretch. The resulting clouds would be large enough so that scientists could use satellites to examine them and their ability to reflect light.

Wood said there is very little chance of long-term effects from such an experiment. Based on studies of pollutants, which emit particles that cause a similar reaction in clouds, scientists know that the impact of adding particles to clouds lasts only a few days.

Still, such an experiment would be unusual in the world of climate science, where scientists observe rather than actually try to change the atmosphere.

Wood notes that running the experiment would advance knowledge around how particles like pollutants impact the climate, although the main reason to do it would be to test the geoengineering idea.

A phenomenon that inspired marine cloud brightening is ship trails: clouds that form behind the paths of ships crossing the ocean, similar to the trails that airplanes leave across the sky. Ship trails form around particles released from burning fuel.

But in some cases ship trails make clouds darker. “We don’t really know why that is,” Wood said.

Despite increasing interest from scientists like Wood, there is still strong resistance to cloud brightening.

“It’s a quick-fix idea when really what we need to do is move toward a low-carbon emission economy, which is turning out to be a long process,” Wood said. “I think we ought to know about the possibilities, just in case.”

The authors of the paper are treading cautiously.

“We stress that there would be no justification for deployment of [marine cloud brightening] unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favor of such action,” they wrote in the paper’s summary.

There are 25 authors on the paper, including scientists from University of Leeds, University of Edinburgh and the Pacific Northwest National Laboratory. The lead author is John Latham of the National Center for Atmospheric Research and the University of Manchester, who pioneered the idea of marine cloud brightening.

Wood’s research was supported by the UW College of the Environment Institute.

Journal Reference:

J. Latham, K. Bower, T. Choularton, H. Coe, P. Connolly, G. Cooper, T. Craft, J. Foster, A. Gadian, L. Galbraith, H. Iacovides, D. Johnston, B. Launder, B. Leslie, J. Meyer, A. Neukermans, B. Ormond, B. Parkes, P. Rasch, J. Rush, S. Salter, T. Stevenson, H. Wang, Q. Wang, R. Wood. Marine cloud brightening. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2012; 370 (1974): 4217. DOI: 10.1098/rsta.2012.0086

Scientists point to problems in press coverage of climate change (Fapesp)

Experts gathered in São Paulo to debate the risk management of climate extremes express concern about the difficulties journalists face in dealing with the complexity of the subject (Wikimedia)

August 21, 2012

By Fábio de Castro

Agência FAPESP – In the assessment of experts gathered in São Paulo to discuss the risk management of climate extremes and disasters, adequately managing the impacts of these events requires informing society – including public policymakers – about the findings of climate science.

Researchers, however, are concerned about the difficulties encountered in communicating with society. The complexity of climate studies tends to generate distortions in journalistic coverage of the subject, and the result can be a threat to the public’s trust in science.

The assessment was made by participants in the workshop “Managing the risks of climate extremes and disasters in Central and South America – what can we learn from the IPCC Special Report on extremes?”, held last week in the São Paulo state capital.

The event’s goal was to debate the conclusions of the Special Report on Managing the Risks of Extreme Events and Disasters (SREX) – prepared and recently published by the Intergovernmental Panel on Climate Change (IPCC) – and to discuss options for managing the impacts of climate extremes, especially in South and Central America.

The workshop was organized by FAPESP and the National Institute for Space Research (Inpe), in partnership with the IPCC, the Overseas Development Institute (ODI) and the Climate and Development Knowledge Network (CDKN), both from the United Kingdom, with support from the Climate and Pollution Agency of Norway’s Ministry of Foreign Affairs.

During the event, the topic of communication was debated by authors of the IPCC SREX, specialists in climate extremes, and managers and leaders of disaster-prevention institutions.

According to Vicente Barros, of the Center for Sea and Atmosphere Research at the University of Buenos Aires, the IPCC, of which he is a member, began a restructuring process three years ago that includes a change in communication strategy.

“From 2009 on, the IPCC came under violent attack and we were not prepared for that, because our role was to disseminate the knowledge acquired, not to translate it for the press. We now have a group of journalists who try to do that mediation, but we cannot dilute the information too much, and the final word on how communications are framed always belongs to the executive committee, because the political weight of what the panel says is very great,” Barros said.

Language is a major problem, according to Barros. If it is too complex, it does not reach the public. If it is too simplified, it tends to distort the conclusions and spread views that do not correspond to reality.

“The IPCC deals with very complex problems and we admit that we cannot produce outreach that reaches everyone. That is a problem. I believe communication should remain in the hands of journalists, but it may be necessary to invest in training initiatives for these professionals,” he said.

Fábio Feldman, of the Fórum Paulista de Mudanças Climáticas, expressed concern about scientists’ difficulties in communicating with the public, which, he said, allow “skeptic” researchers – that is, those who deny human influence on climate change events – to gain ever more space in the media and in public debate.

“I view with concern the growing space given to denialists in public debate. The press believes it must always apply the principle of hearing both sides, giving equal space and importance to the different positions in the debate,” he said.

According to Feldman, scientists – especially those linked to the IPCC – should take a more proactive stance in countering the “skeptics” in public debate.

Different positions

For Reynaldo Luiz Victoria, of the coordination of the FAPESP Research Program on Global Climate Change, it is important that the press treat the different positions more equitably.

“There are specific cases in which the press treats issues in an inequitable – and occasionally sensationalist – way, but I think that we, as researchers, have no obligation to react. The press should seek us out to provide the counterpoint and inform the public,” Victoria told Agência FAPESP.

Victoria nevertheless stressed the importance of the “skeptics” also being heard. “Some are serious scientists and deserve equitable treatment. They certainly cannot be ignored, but when they make contestable claims, the press should look for someone who can offer a counterpoint. Journalists need to come to us, not the other way around,” he said.

In general, press coverage of climate change is satisfactory, according to Victoria. “Good newspapers publish accurate articles, and there are very serious journalists producing high-quality material,” he noted.

For Luci Hidalgo Nunes, a professor in the Department of Geography at the University of Campinas (Unicamp), denialists gain ground because polemical discourse often has more media appeal than the complexity of scientific knowledge.

“A scientist may have a well-grounded argument that the public nonetheless finds tedious. Meanwhile, a researcher with poorly structured arguments can deliver a simplified message, therefore attractive to the public, and a polemical one, which makes headlines,” she told Agência FAPESP.

Although good science carries an inherent disadvantage in public debate because of its complexity, Nunes believes it is important that the press remain pluralistic. She published a study analyzing one year of the newspaper O Estado de S. Paulo’s coverage of climate change. According to Nunes, one of its main positive points was giving voice to the different positions.

“I am in favor of the press fulfilling its role and laying out all the parameters, so that there can be a democratic debate. I think this is being done well, and the press itself is open to giving us more space. But we need to speak up to create those opportunities,” she said.

Nunes also considers press coverage of climate change, in general, to have been satisfactory, if uneven. “The subject looms large at certain moments, but it does not stay on the news agenda permanently,” she said.

According to her, the subject stood out especially in 2007, with the publication of the first IPCC report, and in 2012 during Rio+20.

“In 2007 the coverage was intense, but the popularization of the subject also left room for distortions and exaggerations. Sensationalism is bad for science, because it pushes the subject into the headlines for a while, but over the medium term the effect is the opposite: people notice the exaggerations and come to view scientific results in general with distrust,” she said.

EBay bans sale of spells and hexes (CNN)

By Erin Kim @CNNMoneyTech August 16, 2012: 4:27 PM ET

Starting in September, eBay is blocking the sale of potions and other magical goods.

NEW YORK (CNNMoney) — Sorry, love spell vendors: eBay is cracking down on the sale of magical wares.

Beginning in September, the site is banning the sale of “advice, spells, curses, hexing, conjuring, magic, prayers, blessing services, magic potions, [and] healing sessions,” according to a policy update.

The company is also eliminating its category listings for psychic readings and tarot card sessions.

The update is a part of a “multi-year effort…to build trust in the marketplace and support sellers,” eBay (EBAY, Fortune 500) wrote in its company blog.

Has anyone actually been buying magic on eBay? It seems so: The site’s “spells and potions” category currently has more than 6,000 active listings and happy feedback from quite a few satisfied buyers.

“Best spell caster on Ebay,” one customer wrote after a recent purchase.

“Wonderful post-spells communication!” another raved. “We bought 4 spells! Highly Recommend!”

Spells and hexes aside, eBay is rolling out a long list of rule tweaks, as it does several times a year. For example, buyers will now be required to contact sellers before getting eBay involved with any issues regarding a purchase. Sellers will also be subject to a fee for ending an auction earlier than planned.

EBay also banned the sale of “work from home businesses & information,” a category that is often abused by scammers.

EBay isn’t the only online marketplace culling its listings. Etsy, a platform for homemade goods, also recently prohibited the sale of various items, including drug paraphernalia and body parts.

First Published: August 16, 2012: 4:27 PM ET

*   *   *

Etsy blocks sales of drugs and human remains

By Erin Kim @CNNMoneyTech August 10, 2012: 5:55 PM ET

NEW YORK (CNNMoney) — Etsy has become the go-to spot for homemade jewelry, knickknacks and household goods. Apparently, some have also been using the online marketplace to sell everything from drugs to human remains.

Now Etsy is cracking down.

The online marketplace recently revised its policies, excluding from its list of sellable items such products as tobacco, hazardous materials and body parts. (Hair and teeth are still OK).

“Odd as it may sound, we’ve spent long hours over the past several months extensively researching some offbeat and fascinating topics, from issues surrounding the sale of human bones to the corrosive and toxic properties of mercury,” the company wrote on its official blog on Wednesday.

Etsy says the changes are made in order to comply with legal rules and restrictions.

“But beyond that, when it comes right down to it, some things just aren’t in the spirit of Etsy,” the online company wrote. “While we understand that it is possible for certain items to be carefully and legally bought and sold, Etsy is just not the right venue for them.”

The new policy prohibits the sale of human body parts, including but not limited to “things such as skulls, bones, articulated skeletons, bodily fluids, preserved tissues or organs, and other similar products.”

Etsy banned most drug paraphernalia, though the company said it is not explicitly banning the sale of medical drugs. Instead, it’s asking that sellers remove any claims of “cure or relief of a health condition or illness.”

That set off a slew of angry posts from Etsy sellers in the company’s public forums.

“Now I need to change near[ly] a quarter of my listings or remove them,” wrote Etsy user Chrissy-jo, who operates an online store called KindredImages. “How am I going explain the use of a salve or even an aromatherapy eye pillow without making the claim that it aids in healing wounds or it helps relieve migraines?”

Another Etsy user named Irina, who runs PheonixBotanicals, wrote: “As an herbal crafter, I find the idea of being banned from listing traditional uses and folklore of plants quite disheartening.”

Sellers on Etsy operate their own shops, where they vend goods that are usually homemade. The online store plans to reach out to individual sellers to ask them to either remove a problematic listing or make changes to align with the company’s policy.

First Published: August 10, 2012: 4:10 PM ET

Brazil’s spending on disasters grows 15-fold in six years (O Estado de S. Paulo)

JC e-mail 4564, August 17, 2012.

An IPCC report indicates that extreme events, combined with high human exposure to risk situations, can multiply tragedies.

Over the past 30 years, the rising occurrence of natural disasters around the world was responsible for losses that jumped from a few billion dollars in 1980 to more than 200 billion in 2010. In Brazil, in only six years (2004-2010), spending by the three levels of government on rebuilding structures affected by these events rose from US$ 65 million to more than US$ 1 billion – an increase of more than 15-fold.

The figures were cited yesterday during an event presenting the Special Report on Managing the Risks of Extreme Events and Disasters (SREX) of the Intergovernmental Panel on Climate Change (IPCC). The document was prepared precisely because of this already-observed rise in disasters and losses. The warning, however, concerns the future – the expectation is that such situations will occur ever more frequently as a consequence of global warming.

Some of the report’s authors were present yesterday in São Paulo, at an event organized by Fapesp and the National Institute for Space Research (Inpe), to present the results specific to Latin America and the Caribbean to the scientific community and to decision makers. The main conclusion is that preventing natural disasters takes far more than dealing with the climate.

Vulnerability – “There is nothing natural about a natural disaster. It is the conjunction of a natural event with the vulnerability and exposure of populations to critical situations,” says Vicente Barros, of the University of Buenos Aires and one of the report’s coordinators.

According to him, since 1950 there has been an increase in the number of extremely hot days and days with extreme rainfall. Even so, says climatologist Carlos Nobre, a co-author of the report, the decisive factor behind the disasters was considered to be the greater exposure of human beings caused by increasing urban density. In the end, it comes down to an urban planning problem.

Based on existing research, it is not yet possible to say with a high degree of confidence that this increase in extreme events is already a result of climate change. But for the future, the indication is that warming will likely drive them. Situations considered extreme today may become more common – rains or droughts that now occur every 20 years may appear every five years, every two, or even annually. Another tendency is that they may alternate: heavy rain one year, drought the next.

Regardless of the climate, however, the report warns that the risk of disasters will keep rising, since more people will be in vulnerable situations. “That is where the problems will come from. It is a warning for us to think about forms of adaptation. The Northeast had a major drought this year, and what did the government do? It sent food baskets. That way, the population does not adapt,” says researcher José Marengo, of Inpe.

Besides calling for government action, the researchers also drew attention to the need for more regional studies. Confidence about what is most likely to happen, especially in the Amazon, is still not high. One tool for this is the development of regional climate models. One such project, adapted to Brazilian conditions, is being coordinated by Inpe and Fapesp and may be ready within a year.

In the Vale do Ribeira, the Public Defender’s Office defends traditional communities against corruption and the carbon market (Racismo Ambiental)

By racismoambiental, June 24, 2012, 11:45

Tania Pacheco*

“Placed before all these men gathered together, all these women, all these children (be fruitful, multiply and fill the earth, so they had been commanded), whose sweat came not from the work they did not have but from the unbearable agony of not having it, God repented of the evils he had done and permitted, to such a point that, in a burst of contrition, he wished to change his name to a more human one. Speaking to the multitude, he announced: ‘From today on you shall call me Justice.’ And the multitude answered him: ‘Justice we already have, and it does not heed us.’ God said: ‘In that case, I shall take the name of Law.’ And the multitude answered again: ‘Law we already have, and it does not know us.’ And God: ‘In that case, I shall keep the name of Charity, which is a pretty name.’ Said the multitude: ‘We do not need charity; what we want is a Justice that is enforced and a Law that respects us.’” José Saramago (preface to Terra, by Sebastião Salgado).

The passage above was taken from a legal filing: a writ of mandamus with a request for an injunction, filed on June 6 by public defenders Thiago de Luna Cury and Andrew Toshio Hayama, of the 2nd and 3rd Public Defender’s Offices of Registro, São Paulo, respectively, against the mayor of Iporanga, in the Lageado region of the Vale do Ribeira. Its objective: to prevent the municipal authority, following a practice that has become routine in the state, from expelling traditional communities and expropriating vast tracts of land, turning them into Natural Parks to be traded on the carbon market.

To make money at any cost, it does not matter whether those lands are home to traditional communities, quilombolas and peasant farmers. It does not matter whether the right to free, prior and informed consultation stipulated by ILO Convention 169 was respected. It does not even matter whether, had public hearings been held, the communities would have been able to fully understand what was being proposed and to decide whether it was in their interest to abandon their territories, their traditions and their people, since in this type of strict conservation unit no residents may remain. In partnership with sham companies and NGOs, the scheme is assembled, decreed with the stroke of a pen, and the profit is guaranteed and divided among the members of the gangs.

But that is not quite how things went in Iporanga. The Public Defender’s Office acted, and acted for Justice and for the Law, in a manner indignant, cultured, forceful, poetic and, always, very well grounded in the law. And it fell to Judge Raphael Garcia Pinto, of Eldorado, São Paulo, to recognize this in a decision of June 11, 2012.

This blog uncompromisingly defends the “democratization of the justice system.” Both the writ and the decision are examples of exactly that: the practice of democracy by those who work the law. That is why we make a point of sharing them, not only as a tribute to public defenders Thiago de Luna Cury and Andrew Toshio Hayama (and also to Judge Raphael Garcia Pinto), but also as an example to be followed across Brazil, as a way of defending the communities and honoring us all.

To see the writ of mandamus, click HERE. To see the decision, click HERE. Happy reading.

* With information sent by Luciana Zaffalon.

Exotic traits of the ‘God particle’ surprise physicists (Folha de São Paulo)

JC e-mail 4559, August 10, 2012.

A study with Brazilian participation indicates that the Higgs boson may not fit today’s most widely accepted theory. A preliminary analysis hints at still-unknown particles; other scientists urge caution with the data.

The God particle is, it seems, just the way the devil likes it: badly behaved. That is what a preliminary analysis of data collected at the LHC, the world’s largest particle accelerator, indicates.

The work, by Oscar Éboli of the Physics Institute of the University of São Paulo (USP), suggests that the so-called Higgs boson, which would be responsible for giving mass to everything that exists, is not behaving as it should, judging by the theory that predicted its existence, the Standard Model. If confirmed, the particle’s anomalous behavior would be the cue for a new era in physics.

The discovery of the possible boson, announced with great fanfare last month, was celebrated as the completion of a glorious stage in the study of the fundamental particles of matter. Its existence, in short, would explain why the Sun can produce its energy and why creatures like us can exist.

Given its importance for the consistency of the Universe (and drawing an analogy with the biblical story of the Tower of Babel), Nobel Prize-winning physicist Leon Lederman nicknamed the boson the “God particle.”

To analyze the Higgs boson, one must first produce a collision between protons at extremely high speed – the LHC’s primary function. From the high-energy impact, heaps of new particles emerge, among them the Higgs, which quickly “decays,” as physicists say.

Because it is very unstable, the boson breaks apart as the collision energy dissipates, and other particles appear in its place. It is this byproduct that can be detected and indicate the existence of the Higgs boson. This, however, requires a great many impacts before the statistics begin to suggest the presence of the sought-after boson.

The data collected so far are enough to point to the particle’s existence, but its specific characteristics could not yet be determined. “We are still at an early stage of exploring its properties,” says Éboli. “However, there is an indication that the Higgs decays into two photons [particles of light] more often than would be expected in the Standard Model.”
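The comparison Éboli describes is usually expressed as a signal strength: the observed diphoton rate divided by the Standard Model prediction, where a value of 1 means perfect agreement. The sketch below uses invented event counts, not figures from the actual LHC analyses, simply to show the quantity involved and a naive estimate of how significant an excess would be.

```python
# Hypothetical numbers only; not results from ATLAS, CMS or Éboli's analysis.
import math

observed_diphoton_events = 180.0   # invented count attributed to the Higgs signal
sm_expected_events = 110.0         # invented Standard Model prediction

mu = observed_diphoton_events / sm_expected_events
# Naive significance of the excess, treating the prediction as a Poisson mean.
z = (observed_diphoton_events - sm_expected_events) / math.sqrt(sm_expected_events)

print(f"signal strength mu = {mu:.2f} (mu = 1 would match the Standard Model)")
print(f"naive excess significance ~ {z:.1f} sigma")
```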

The results of this preliminary analysis were posted on Arxiv.org, the online repository of physics papers, and covered in the magazine Pesquisa Fapesp.

A welcome surprise – The news excites scientists. “For most physicists, the Standard Model is a good representation of nature, but it is not the final theory,” says Éboli. “If it is indeed confirmed that the Higgs is decaying into two photons more often than expected, that may mean new particles could be within the LHC’s discovery reach.”

It could be the first glimpse of a new “zoo” of elementary building blocks of matter. Such exotic particles were expected to begin appearing at the LHC’s high energies.

All very interesting, but nothing is settled. “It is very serious work, but I think it is still too early to draw any conclusion as to whether or not this is the standard Higgs,” says Sérgio Novaes, a Unesp researcher who takes part in one of the experiments that detected the Higgs boson. “By the end of the year things will be a little clearer,” he reckons.

Indigenous population in Brazil grew 205% in two decades (Agência Brasil)

JC e-mail 4559, August 10, 2012.

In the context of August 9, the International Day of Indigenous Peoples, indigenous leaders staged a protest at the headquarters of the Advocacia Geral da União (AGU), Brazil’s federal attorney general’s office, to call for the suspension of directive 303, which authorizes intervention in indigenous lands without any need to consult the Indians.

Today (the 10th), the Instituto Brasileiro de Geografia e Estatística (IBGE) released 2010 Census data showing that Indians in Brazil number 896,900 people, from 305 ethnic groups, speaking 274 indigenous languages. It is the first time the agency has collected information on the peoples’ ethnicity. The survey also marks the resumption of research on indigenous languages, halted for 60 years.

Based on the 2010 Census data, the IBGE reports that the country’s indigenous population has grown 205% since 1991, when the first survey in the current format was carried out. At the time, Indians numbered 294,000. The figure reached 734,000 in the 2000 Census, a 150% increase compared with 1991.
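
A quick arithmetic check of the growth figures cited above (populations in thousands):

    # Growth rates implied by the census figures quoted in the text.
    pop_1991 = 294.0
    pop_2000 = 734.0
    pop_2010 = 896.9

    growth_1991_2010 = (pop_2010 / pop_1991 - 1) * 100
    growth_1991_2000 = (pop_2000 / pop_1991 - 1) * 100

    print(f"1991-2010 growth: {growth_1991_2010:.0f}%")   # about 205%
    print(f"1991-2000 growth: {growth_1991_2000:.0f}%")   # about 150%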

The survey shows that, of the country’s 896,900 Indians, more than half (63.8%) live in rural areas. The situation is the reverse of 2000, when more than half (52%) lived in urban areas.

In the IBGE’s assessment, the explanation for the growth of the indigenous population may lie in the drop in the fertility rate of women in rural areas, although the 2010 figure has not yet been finalized. Between 1991 and 2000, that rate fell from 6.4 children per woman to 5.8.

Another factor that may explain the rise in the number of Indians is the process of ethnogenesis, in which there is a “reconstruction of indigenous communities” that supposedly no longer existed, explains José Maurício Arruti, professor of anthropology at the Universidade de Campinas (Unicamp).

The IBGE data indicate that most Indians (57.7%) live in the 505 indigenous territories recognized by the government as of December 31, 2010, the survey’s reference period. These areas amount to 12.5% of the national territory, and the largest share lies in the North Region, which also has the largest indigenous population (342,000). In the Southeast Region, by contrast, 84% of the 99,100 Indians live outside their original lands, followed by the Northeast (54%).

To arrive at the total number of Indians, the IBGE added those who declared themselves indigenous (817,900) to the 78,900 people who live in indigenous territories but had not chosen that classification when answering the question on color or race. This group was asked a second question, on whether the respondent considered himself or herself an Indian. The aim was to avoid distortions.

Nilza Pereira, the researcher in charge of the survey, explained that the category “Indian” was invented by the non-indigenous population, which is why some people were confused by the self-declaration and did not call themselves indigenous at first. “For the Indian, he is a Xavante, a Kaiapó, of ‘parda’ color, green or even ‘marrom’ [both shades of brown],” she explained.

The most populous indigenous territory in the country is the Yanomami territory, with 25,700 inhabitants (5% of the total) spread between Amazonas and Roraima. The Tikúna ethnic group (Amazonas) is the most numerous, with 46,000 individuals, 39,300 of them inside the indigenous territory and the rest outside it. Next comes the Guarani Kaiowá ethnic group (Mato Grosso do Sul), with 43,000 Indians, of whom 35,000 are in the indigenous territory and 8,100 live outside.

The 2010 Census also revealed that 37.4% of Indians over 5 years of age speak indigenous languages, despite years of contact with non-Indians. About 120,000 do not speak Portuguese. Peoples considered isolated Indians were not interviewed and are not counted in the 2010 Census, owing to the constraints of the contact policy itself, whose aim is to preserve them.

Rooting out Rumors, Epidemics, and Crime — With Math (Science Daily)

ScienceDaily (Aug. 10, 2012) — A team of EPFL scientists has developed an algorithm that can identify the source of an epidemic or information circulating within a network, a method that could also be used to help with criminal investigations.

Investigators are well aware of how difficult it is to trace an unlawful act to its source. The job was arguably easier with old, Mafia-style criminal organizations, as their hierarchical structures more or less resembled predictable family trees.

In the Internet age, however, the networks used by organized criminals have changed. Innumerable nodes and connections escalate the complexity of these networks, making it ever more difficult to root out the guilty party. EPFL researcher Pedro Pinto of the Audiovisual Communications Laboratory and his colleagues have developed an algorithm that could become a valuable ally for investigators, criminal or otherwise, as long as a network is involved. The team’s research was published August 10, 2012, in the journal Physical Review Letters.

Finding the source of a Facebook rumor

“Using our method, we can find the source of all kinds of things circulating in a network just by ‘listening’ to a limited number of members of that network,” explains Pinto. Suppose you come across a rumor about yourself that has spread on Facebook and been sent to 500 people — your friends, or even friends of your friends. How do you find the person who started the rumor? “By looking at the messages received by just 15-20 of your friends, and taking into account the time factor, our algorithm can trace the path of that information back and find the source,” Pinto adds. This method can also be used to identify the origin of a spam message or a computer virus using only a limited number of sensors within the network.
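
In outline, and much simplified relative to the estimator Pinto and colleagues actually published, the idea can be sketched as follows: assume the rumor travels roughly one hop per unit of time, and pick as the likely source the member whose network distances to the “listened-to” friends best line up with the times at which those friends received the rumor. The network and the times below are hypothetical.

    # Much-simplified sketch of source localization from sparse observers
    # (illustrative only; not the published estimator).
    from collections import deque
    from statistics import pvariance

    # Hypothetical friendship network (adjacency list).
    graph = {
        "ana": ["bob", "carla"],
        "bob": ["ana", "dani", "eva"],
        "carla": ["ana", "dani"],
        "dani": ["bob", "carla", "eva"],
        "eva": ["bob", "dani", "fred"],
        "fred": ["eva"],
    }

    def hop_distances(source):
        """Breadth-first search: hop count from `source` to every node."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            node = queue.popleft()
            for neighbor in graph[node]:
                if neighbor not in dist:
                    dist[neighbor] = dist[node] + 1
                    queue.append(neighbor)
        return dist

    # Times (arbitrary units) at which a few monitored members got the rumor.
    observations = {"carla": 1.0, "eva": 2.0, "fred": 3.0}

    def score(candidate):
        """Lower is better: how consistent are the observed times with the
        distances from this candidate, up to a common start time?"""
        dist = hop_distances(candidate)
        offsets = [observations[o] - dist[o] for o in observations]
        return pvariance(offsets)

    estimated_source = min(graph, key=score)
    print("estimated source:", estimated_source)   # "ana" with these numbers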

Trace the propagation of an epidemic

Out in the real world, the algorithm can be employed to find the primary source of an infectious disease, such as cholera. “We tested our method with data on an epidemic in South Africa provided by EPFL professor Andrea Rinaldo’s Ecohydrology Laboratory,” says Pinto. “By modeling water networks, river networks, and human transport networks, we were able to find the spot where the first cases of infection appeared by monitoring only a small fraction of the villages.”

The method would also be useful in responding to terrorist attacks, such as the 1995 sarin gas attack in the Tokyo subway, in which poisonous gas released in the city’s subterranean tunnels killed 13 people and injured nearly 1,000 more. “Using this algorithm, it wouldn’t be necessary to equip every station with detectors. A sample would be sufficient to rapidly identify the origin of the attack, and action could be taken before it spreads too far,” says Pinto.

Identifying the brains behind a terrorist attack

Computer simulations of the telephone conversations that could have occurred during the terrorist attacks on September 11, 2001, were used to test Pinto’s system. “By reconstructing the message exchange inside the 9/11 terrorist network extracted from publicly released news, our system spit out the names of three potential suspects — one of whom was found to be the mastermind of the attacks, according to the official enquiry.”

The validity of this method thus has been proven a posteriori. But according to Pinto, it could also be used preventatively — for example, to understand an outbreak before it gets out of control. “By carefully selecting points in the network to test, we could more rapidly detect the spread of an epidemic,” he points out. It could also be a valuable tool for advertisers who use viral marketing strategies by leveraging the Internet and social networks to reach customers. For example, this algorithm would allow them to identify the specific Internet blogs that are the most influential for their target audience and to understand how information from these articles spreads throughout the online community.

Populations Survive Despite Many Deleterious Mutations: Evolutionary Model of Muller’s Ratchet Explored (Science Daily)

ScienceDaily (Aug. 10, 2012) — From protozoans to mammals, evolution has created more and more complex structures and better-adapted organisms. This is all the more astonishing as most genetic mutations are deleterious. Especially in small asexual populations that do not recombine their genes, unfavourable mutations can accumulate. This process is known as Muller’s ratchet in evolutionary biology. The ratchet, proposed by the American geneticist Hermann Joseph Muller, predicts that the genome deteriorates irreversibly, leaving populations on a one-way street to extinction.

Equilibrium of mutation and selection processes: A population can be divided into groups of individuals that carry different numbers of deleterious mutations. Groups with few mutations are amplified by selection but lose members to other groups by mutation. Groups with many mutations don’t reproduce as much, but gain members by mutation. (Credit: © Richard Neher/MPI for Developmental Biology)

In collaboration with colleagues from the US, Richard Neher from the Max Planck Institute for Developmental Biology has shown mathematically how Muller’s ratchet operates and he has investigated why populations are not inevitably doomed to extinction despite the continuous influx of deleterious mutations.

The great majority of mutations are deleterious. “Due to selection individuals with more favourable genes reproduce more successfully and deleterious mutations disappear again,” explains the population geneticist Richard Neher, leader of an independent Max Planck research group at the Max Planck Institute for Developmental Biology in Tübingen, Germany. However, in small populations such as an asexually reproducing virus early during infection, the situation is not so clear-cut. “It can then happen by chance, by stochastic processes alone, that deleterious mutations in the viruses accumulate and the mutation-free group of individuals goes extinct,” says Richard Neher. This is known as a click of Muller’s ratchet, which is irreversible — at least in Muller’s model.

Muller published his model on the evolutionary significance of deleterious mutations in 1964. Yet to date a quantitative understanding of the ratchet’s processes was lacking. Richard Neher and Boris Shraiman from the University of California in Santa Barbara have now published a new theoretical study on Muller’s ratchet. They chose a comparably simple model with only deleterious mutations all having the same effect on fitness. The scientists assumed selection against those mutations and analysed how fluctuations in the group of the fittest individuals affected the less fit ones and the whole population. Richard Neher and Boris Shraiman discovered that the key to the understanding of Muller’s ratchet lies in a slow response: If the number of the fittest individuals is reduced, the mean fitness decreases only after a delay. “This delayed feedback accelerates Muller’s ratchet,” Richard Neher comments on the results. It clicks more and more frequently.
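
A minimal simulation makes a “click” of the ratchet easy to picture. The sketch below is a toy, not the model analysed by Neher and Shraiman: it assumes an asexual population in which every deleterious mutation cuts fitness by the same factor, and it records a click whenever the least-loaded class of individuals is lost.

    # Toy Wright-Fisher-style simulation of Muller's ratchet (illustrative
    # only). Assumptions: asexual reproduction, each deleterious mutation
    # multiplies fitness by (1 - s), new mutations arrive at rate u per
    # genome per generation.
    import numpy as np

    rng = np.random.default_rng(0)
    N, s, u, generations = 500, 0.02, 0.1, 2000

    load = np.zeros(N, dtype=int)   # deleterious mutations per individual
    clicks, least_loaded = 0, 0

    for _ in range(generations):
        # Selection: sample parents in proportion to fitness (1 - s)**load.
        fitness = (1.0 - s) ** load
        parents = rng.choice(N, size=N, p=fitness / fitness.sum())
        load = load[parents]
        # Mutation: Poisson-distributed new deleterious mutations.
        load = load + rng.poisson(u, size=N)
        # A "click": the least-loaded class has been lost for good.
        if load.min() > least_loaded:
            clicks += load.min() - least_loaded
            least_loaded = load.min()

    print(f"ratchet clicked {clicks} times; mean load is now {load.mean():.1f}")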

“Our results are valid for a broad range of conditions and parameter values — for a population of viruses as well as a population of tigers.” However, he does not expect to find the model’s conditions one-to-one in nature. “Models are made to understand the essential aspects, to identify the critical processes,” he explains.

In a second study Richard Neher, Boris Shraiman and several other US scientists from the University of California in Santa Barbara and Harvard University in Cambridge investigated how a small asexual population could escape Muller’s ratchet. “Such a population can only stay in a steady state for a long time when beneficial mutations continually compensate for the negative ones that accumulate via Muller’s ratchet,” says Richard Neher. For their model the scientists assumed a steady environment and suggest that there can be a mutation-selection balance in every population. They have calculated the rate of favourable mutations required to maintain the balance. The result was surprising: Even under unfavourable conditions, a comparably small proportion in the range of several percent of positive mutations is sufficient to sustain a population.

These findings could explain the long-term maintenance of mitochondria, the so-called power plants of the cell that have their own genome and divide asexually. By and large, evolution is driven by random events or as Richard Neher says: “Evolutionary dynamics are very stochastic.”

NOAA Raises Hurricane Season Prediction Despite Expected El Niño (Science Daily)

ScienceDaily (Aug. 10, 2012) — This year’s Atlantic hurricane season got off to a busy start, with 6 named storms to date, and may have a busy second half, according to the updated hurricane season outlook issued Aug. 9, 2012 by NOAA’s Climate Prediction Center, a division of the National Weather Service. The updated outlook still indicates a 50 percent chance of a near-normal season, but increases the chance of an above-normal season to 35 percent and decreases the chance of a below-normal season to only 15 percent from the initial outlook issued in May.

Satellite image of Hurricane Ernesto taken on Aug. 7, 2012 in the Gulf of Mexico. (Credit: NOAA)

Across the entire Atlantic Basin for the season — June 1 to November 30 — NOAA’s updated seasonal outlook projects a total (which includes the activity-to-date of tropical storms Alberto, Beryl, Debby, Florence and hurricanes Chris and Ernesto) of:

  • 12 to 17 named storms (top winds of 39 mph or higher), including:
  • 5 to 8 hurricanes (top winds of 74 mph or higher), of which:
  • 2 to 3 could be major hurricanes (Category 3, 4 or 5; winds of at least 111 mph)

The numbers are higher than in the initial outlook in May, which called for 9-15 named storms, 4-8 hurricanes and 1-3 major hurricanes. Based on a 30-year average, a normal Atlantic hurricane season produces 12 named storms, six hurricanes, and three major hurricanes.

“We are increasing the likelihood of an above-normal season because storm-conducive wind patterns and warmer-than-normal sea surface temperatures are now in place in the Atlantic,” said Gerry Bell, Ph.D., lead seasonal hurricane forecaster at the Climate Prediction Center. “These conditions are linked to the ongoing high activity era for Atlantic hurricanes that began in 1995. Also, strong early-season activity is generally indicative of a more active season.”

However, NOAA seasonal climate forecasters also announced today that El Niño will likely develop in August or September.

“El Niño is a competing factor, because it strengthens the vertical wind shear over the Atlantic, which suppresses storm development. However, we don’t expect El Niño’s influence until later in the season,” Bell said.

“We have a long way to go until the end of the season, and we shouldn’t let our guard down,” said Laura Furgione, acting director of NOAA’s National Weather Service. “Hurricanes often bring dangerous inland flooding as we saw a year ago in the Northeast with Hurricane Irene and Tropical Storm Lee. Even people who live hundreds of miles from the coast need to remain vigilant through the remainder of the season.”

“It is never too early to prepare for a hurricane,” said Tim Manning, FEMA’s deputy administrator for protection and national preparedness. “We are in the middle of hurricane season and now is the time to get ready. There are easy steps you can take to get yourself and your family prepared. Visit www.ready.gov to learn more.”

How Computation Can Predict Group Conflict: Fighting Among Captive Pigtailed Macaques Provides Clues (Science Daily)

ScienceDaily (Aug. 13, 2012) — When conflict breaks out in social groups, individuals make strategic decisions about how to behave based on their understanding of alliances and feuds in the group.

Researchers studied fighting among captive pigtailed macaques for clues about behavior and group conflict. (Credit: iStockphoto/Natthaphong Phanthumchinda)

But it’s been challenging to quantify the underlying trends that dictate how individuals make predictions, given they may only have seen a small number of fights or have limited memory.

In a new study, scientists at the Wisconsin Institute for Discovery (WID) at UW-Madison develop a computational approach to determine whether individuals behave predictably. With data from previous fights, the team looked at how much memory individuals in the group would need to make predictions themselves. The analysis proposes a novel estimate of “cognitive burden,” or the minimal amount of information an organism needs to remember to make a prediction.

The research draws from a concept called “sparse coding,” or the brain’s tendency to use fewer visual details and a small number of neurons to stow an image or scene. Previous studies support the idea that neurons in the brain react to a few large details such as the lines, edges and orientations within images rather than many smaller details.

“So what you get is a model where you have to remember fewer things but you still get very high predictive power — that’s what we’re interested in,” says Bryan Daniels, a WID researcher who led the study. “What is the trade-off? What’s the minimum amount of ‘stuff’ an individual has to remember to make good inferences about future events?”

To find out, Daniels — along with WID co-authors Jessica Flack and David Krakauer — drew comparisons from how brains and computers encode information. The results contribute to ongoing discussions about conflict in biological systems and how cognitive organisms understand their environments.

The study, published in the Aug. 13 edition of the Proceedings of the National Academy of Sciences, examined observed bouts of natural fighting in a group of 84 captive pigtailed macaques at the Yerkes National Primate Research Center. By recording individuals’ involvement — or lack thereof — in fights, the group created models that mapped the likelihood any number of individuals would engage in conflict in hypothetical situations.

To confirm the predictive power of the models, the group plugged in other data from the monkey group that was not used to create the models. Then, researchers compared these simulations with what actually happened in the group. One model looked at conflict as combinations of pairs, while another represented fights as sparse combinations of clusters, which proved to be a better tool for predicting fights. From there, by removing information until predictions became worse, Daniels and colleagues calculated the amount of information each individual needed to remember to make the most informed decision whether to fight or flee.

“We know the monkeys are making predictions, but we don’t know how good they are,” says Daniels. “But given this data, we found that the most memory it would take to figure out the regularities is about 1,000 bits of information.”
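
As a toy illustration of that memory-versus-prediction trade-off (simulated data, not the macaque observations or the WID team’s model), one can store each individual’s fight propensity with fewer and fewer bits and watch how out-of-sample prediction changes:

    # Toy illustration of "cognitive burden": shrink the memory used to
    # describe past fights and see when prediction quality starts to drop.
    import numpy as np

    rng = np.random.default_rng(1)
    n_individuals, n_fights = 20, 400

    # Simulated data: each individual has a true propensity to join a fight.
    true_p = rng.beta(1, 6, size=n_individuals)
    fights = rng.random((n_fights, n_individuals)) < true_p   # rows = fights

    train, test = fights[:300], fights[300:]
    p_hat = train.mean(axis=0).clip(0.01, 0.99)   # remembered propensities

    def quantize(p, bits):
        """Round each probability to one of 2**bits levels in (0, 1)."""
        levels = 2 ** bits
        return ((np.floor(p * levels) + 0.5) / levels).clip(0.01, 0.99)

    def mean_log_likelihood(p, data):
        ll = np.sum(data * np.log(p) + (~data) * np.log(1 - p))
        return float(ll) / len(data)

    for bits in (8, 4, 2, 1):
        q = quantize(p_hat, bits)
        print(f"{bits} bits per individual -> held-out log-likelihood "
              f"{mean_log_likelihood(q, test):.2f}")
    # The point where the held-out likelihood starts to fall is a (toy)
    # stand-in for the minimal memory needed to predict well.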

Sparse coding appears to be a strong candidate for explaining the mechanism at play in the monkey group, but the team points out that it is only one possible way to encode conflict.

Because the statistical modeling and computation frameworks can be applied to different natural datasets, the research has the potential to influence other fields of study, including behavioral science, cognition, computation, game theory and machine learning. Such models might also be useful in studying collective behaviors in other complex systems, ranging from neurons to bird flocks.

Future research will seek to find out how individuals’ knowledge of alliances and feuds fine tunes their own decisions and changes the groups’ collective pattern of conflict.

The research was supported by the National Science Foundation, the John Templeton Foundation through the Santa Fe Institute, and UW-Madison.

Should Doctors Treat Lack of Exercise as a Medical Condition? Expert Says ‘Yes’ (Science Daily)

ScienceDaily (Aug. 13, 2012) — A sedentary lifestyle is a common cause of obesity, and excessive body weight and fat in turn are considered catalysts for diabetes, high blood pressure, joint damage and other serious health problems. But what if lack of exercise itself were treated as a medical condition? Mayo Clinic physiologist Michael Joyner, M.D., argues that it should be. His commentary is published this month in The Journal of Physiology.

Physical inactivity affects the health not only of many obese patients, but also people of normal weight, such as workers with desk jobs, patients immobilized for long periods after injuries or surgery, and women on extended bed rest during pregnancies, among others, Dr. Joyner says. Prolonged lack of exercise can cause the body to become deconditioned, with wide-ranging structural and metabolic changes: the heart rate may rise excessively during physical activity, bones and muscles atrophy, physical endurance wane, and blood volume decline.

When deconditioned people try to exercise, they may tire quickly and experience dizziness or other discomfort, then give up trying to exercise and find the problem gets worse rather than better.

“I would argue that physical inactivity is the root cause of many of the common problems that we have,” Dr. Joyner says. “If we were to medicalize it, we could then develop a way, just like we’ve done for addiction, cigarettes and other things, to give people treatments, and lifelong treatments, that focus on behavioral modifications and physical activity. And then we can take public health measures, like we did for smoking, drunken driving and other things, to limit physical inactivity and promote physical activity.”

Several chronic medical conditions are associated with poor capacity to exercise, including fibromyalgia, chronic fatigue syndrome and postural orthostatic tachycardia syndrome, better known as POTS, a syndrome marked by an excessive heart rate and flu-like symptoms when standing or at a given level of exercise. Too often, medication rather than progressive exercise is prescribed, Dr. Joyner says.

Texas Health Presbyterian Hospital Dallas and University of Texas Southwestern Medical Center researchers found that three months of exercise training can reverse or improve many POTS symptoms, Dr. Joyner notes. That study offers hope for such patients and shows that physicians should consider prescribing carefully monitored exercise before medication, he says.

If physical inactivity were treated as a medical condition itself rather than simply a cause or byproduct of other medical conditions, physicians may become more aware of the value of prescribing supported exercise, and more formal rehabilitation programs that include cognitive and behavioral therapy would develop, Dr. Joyner says.

For those who have been sedentary and are trying to get into exercise, Dr. Joyner advises doing it slowly and progressively.

“You just don’t jump right back into it and try to train for a marathon,” he says. “Start off with achievable goals and do it in small bites.”

There’s no need to join a gym or get a personal trainer: build as much activity as possible into daily life. Even walking just 10 minutes three times a day can go a long way toward working up to the 150 minutes a week of moderate physical activity the typical adult needs, Dr. Joyner says.

How Do They Do It? Predictions Are in for Arctic Sea Ice Low Point (Science Daily)

ScienceDaily (Aug. 14, 2012) — It’s become a sport of sorts, predicting the low point of Arctic sea ice each year. Expert scientists with decades of experience do it but so do enthusiasts, whose guesses are gamely included in a monthly predictions roundup collected by Sea Ice Outlook, an effort supported by the U.S. government.

Arctic sea ice, as seen from an ice breaker. (Credit: Bonnie Light, UW)

When averaged, the predictions have come in remarkably close to the mark in the past two years. But the low and high predictions are off by hundreds of thousands of square kilometers.

Researchers are working hard to improve their ability to more accurately predict how much Arctic sea ice will remain at the end of summer. It’s an important exercise because knowing why sea ice declines could help scientists better understand climate change and how sea ice is evolving.

This year, researchers from the University of Washington’s Polar Science Center are the first to include new NASA sea ice thickness data collected by airplane in a prediction.

They expect 4.4 million square kilometers of remaining ice (about 1.7 million square miles), just barely more than the 4.3 million square kilometers in 2007, the lowest year on record for Arctic sea ice. The median of 23 predictions collected by the Sea Ice Outlook and released on Aug. 13 is 4.3 million square kilometers.

“One drawback to making predictions is historically we’ve had very little information about the thickness of the ice in the current year,” said Ron Lindsay, a climatologist at the Polar Science Center, a department in the UW’s Applied Physics Laboratory.

To make their prediction, Lindsay and Jinlun Zhang, an oceanographer in the Polar Science Center, start with a widely used model pioneered by Zhang and known as the Pan-Arctic Ice Ocean Modeling and Assimilation System. That system combines available observations with a model to track sea ice volume, which includes both ice thickness and extent.

But obtaining observations about current-year ice thickness in order to build their short-term prediction is tough. NASA is currently in the process of designing a new satellite that will replace one that used to deliver ice thickness data but has since failed. In the meantime, NASA is running a program called Operation IceBridge that uses airplanes to survey sea ice as well as Arctic ice sheets.

“This is the first year they made a concerted effort to get the data from the aircraft, process it and get it into hands of scientists in a timely manner,” Lindsay said. “In the past, we’ve gotten data from submarines, moorings or satellites but none of that data was available in a timely manner. It took months or even years.”

There’s a shortcoming to the IceBridge data, however: It’s only available through March. The radar used to measure snow depth on the surface of the ice, an important element in the observation system, has trouble accurately gauging the depth once the snow has begun to melt, so the data are only collected through the early spring, before the thaw.

The UW scientists have developed a method for informing their prediction that is starting to be used by others. Researchers have struggled with how best to forecast the weather in the Arctic, which affects ice melt and distribution.

“Jinlun came up with the idea of using the last seven summers. Because the climate is changing so fast, only the recent summers are probably relevant,” Lindsay said.

The result is seven different possibilities of what might happen. “The average of those is our best guess,” Lindsay said.
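
In sketch form, and with hypothetical numbers rather than the Polar Science Center’s model output, the ensemble idea looks like this:

    # Ensemble sketch: one model run per recent summer's weather, then take
    # the mean as the prediction and the spread as a rough uncertainty.
    # The extents below are hypothetical, in millions of square kilometers.
    import statistics

    ensemble = [4.1, 4.3, 4.6, 4.2, 4.5, 4.7, 4.4]   # seven hypothetical runs

    prediction = statistics.mean(ensemble)
    spread = statistics.stdev(ensemble)
    print(f"predicted extent: {prediction:.2f} +/- {spread:.2f} million km^2")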

Despite the progress in making predictions, the researchers say their ability to foretell the future will always be limited. Because they can’t forecast the weather very far in advance and because the ice is strongly affected by winds, predictions made far in advance carry little confidence beyond what the long-term trend already tells us.

“The accuracy of our prediction really depends on time,” Zhang said. “Our June 1 prediction for the Sept. 15 low point has high uncertainty but as we approach the end of June or July, the uncertainty goes down and the accuracy goes up.”

In hindsight, that’s true historically for the average predictions collected by Study of Environmental Arctic Change’s Sea Ice Outlook, a project funded by the National Science Foundation and the National Oceanic and Atmospheric Administration.

While the competitive aspect of the predictions is fun, the researchers aren’t in it to win it.

“Essentially it’s not for prediction but for understanding,” Zhang said. “We do it to improve our understanding of sea ice processes, in terms of how dynamic processes affect the seasonal evolution of sea ice.”

That may not be entirely the same for the enthusiasts who contribute a prediction. One climate blog polls readers in the summer for their best estimate of the sea ice low point. It’s included among the predictions collected by the Sea Ice Outlook, with an asterisk noting it as a “public outlook.”

The National Science Foundation and NASA fund the UW research into the Arctic sea ice low point.

New legislation will give natural disaster prevention a scientific basis, experts say (Fapesp)

A law signed in April will require municipalities to draw up geotechnical charts, a multidisciplinary instrument that will guide the deployment of warning systems and master plans (Valter Campanato/ABr)

08/08/2012

By Fábio de Castro

Agência FAPESP – In January 2011, floods and landslides left about a thousand people dead and 500 missing in the mountain region of Rio de Janeiro state. The tragedy exposed the precariousness of Brazil’s warning systems and was regarded by experts as definitive proof that the country needed to invest in disaster prevention.

The most important outcome of that assessment was Law 12,608, signed in April, which establishes the National Civil Protection and Defense Policy and creates a disaster information and monitoring system, according to experts gathered at the seminar “Caminhos da política nacional de defesa de áreas de risco” (“Paths of the national policy for defending risk areas”), held by the Escola Politécnica of the Universidade de São Paulo (USP) on August 6.

The new law requires municipal governments to invest in urban planning aimed at preventing disasters such as floods and landslides. According to the experts, for the first time disaster prevention can rest on a solid technical and scientific foundation, since the law stipulates that, to do this planning, every municipality will have to draw up a geotechnical chart.

Katia Canil, a researcher at the Environmental Risks Laboratory of the Instituto de Pesquisas Tecnológicas (IPT), said municipal governments will have two years to produce the geotechnical charts that will underpin their master plans, which must include disaster prevention and mitigation measures. Municipalities that fail to present this planning will not receive federal funds for prevention and mitigation works.

“Geotechnical charts are cartographic documents that bring together information on a municipality’s geological and geomorphological characteristics, identifying geological risks and making it easier to set rules for urban occupation. With this instrument made mandatory by the law, we will be able to have disaster prevention strategies drawn up on the basis of technical and scientific knowledge,” Canil told Agência FAPESP.

Brazil’s first geotechnical chart was produced in 1979, for the municipality of Santos (São Paulo state), yet even so the instrument has remained little used in the country. According to Canil, institutionalizing the tool will be an important factor in bringing master plans into line with the geotechnical characteristics of the terrain.

“Few municipalities have a geotechnical chart, because it was not a mandatory instrument. Now that picture should change. But the legislation will generate a great demand for specialists in several fields, because geotechnical charts bring together a range of interdisciplinary data,” said the IPT researcher.

Geotechnical charts compile documents produced by geological and geotechnical field surveys, along with laboratory analyses, with the aim of synthesizing all available knowledge about the physical environment and its relationship with the geological and human processes at work in the area. “And all of this needs to be expressed in language that managers can understand,” Canil said.

Cities will have to organize themselves to produce geotechnical charts, and the technical capacity required is not trivial. “It is not just a matter of overlaying maps. You need experience combined with training in fields such as geology, engineering, geotechnical engineering, cartography, geography, architecture and urban planning,” Canil said. The IPT already offers a training course on producing geotechnical charts.

A major difficulty in producing the charts will be the lack of basic geological mapping in Brazilian municipalities. “Most municipalities do not have primary data, such as geomorphological, soil and geological maps,” Canil said.

National prevention plan

The January 2011 tragedy in Rio de Janeiro’s mountain region was a milestone that changed the course of the debate on disasters, definitively highlighting the central role of prevention, according to Carlos Nobre, secretary of Research and Development Policies and Programs at the Ministry of Science, Technology and Innovation (MCTI).

“That episode was a jolt that shook Brazilian perceptions of major disasters. It became obvious to policymakers and to the public that the prevention axis must be emphasized. It was a milestone that changed our perspective forever: prevention is fundamental,” he said during the event.

According to Nobre, who is also a researcher at the Instituto Nacional de Pesquisas Espaciais (Inpe) and a member of the coordination of the FAPESP Research Program on Global Climate Change, international experience shows that prevention can cut the number of fatalities in natural disasters by up to 90% and reduce material damage by about 35%. “Besides saving lives, the savings in material losses more than pay for all the investment in prevention,” he said.

According to Nobre, engineering will play an increasingly important role in prevention as natural disasters become more extreme as a consequence of climate change.

“The 21st-century engineer will need to be trained in sustainability engineering – a cross-cutting field that will gain more and more ground. Engineering, if well conducted, is central to solving some of today’s main problems,” he said.

According to Nobre, beyond the new legislation, which will require planning based on municipal geotechnical charts, Brazil has several initiatives under way in disaster prevention. One of them will be announced this Wednesday (August 8): the National Plan for Natural Disaster Prevention, which emphasizes works aimed at installing warning systems.

“Large-scale works are needed in Brazil, especially with regard to warning systems. One of the important elements of the new plan is early warning. International experience shows that a warning issued up to two hours before a landslide can save lives,” he said.

According to Nobre, the plan’s initiatives will be consistent with the new legislation. The federal government is expected to invest R$ 4.6 billion over the coming months in disaster prevention initiatives in the states of Rio de Janeiro, Minas Gerais and Santa Catarina.

To apply for federal funds, however, a municipality will have to meet a series of requirements, such as incorporating civil protection and defense actions into municipal planning, identifying and mapping natural disaster risk areas, preventing new occupations in those areas and inspecting the buildings located in them.

According to Nobre, another measure aimed at disaster prevention was the creation of the National Center for Natural Disaster Monitoring and Alerts (Cemaden), under the MCTI, which began operating in December 2011 on the Inpe campus in Cachoeira Paulista (São Paulo state).

“That center already played an important role in weather forecasting, but it was restructured and hired 35 professionals. Cemaden is emerging as an emblem of the new warning systems: a design that brings together geologists, meteorologists and natural disaster specialists to identify vulnerabilities, something rare in the world,” he said.

According to him, this new structure already has a warning system in operation. “It is a system that will still need to be evaluated over time. But so far, since December 2011, more than 100 alerts have been issued. The country will take several years to reduce fatalities to the levels of countries with good prevention systems. But we are on the right track,” Nobre said.

Heatwave turns America’s waterways into rivers of death (The Independent)

Falling water levels are killing fish and harming exports

DAVID USBORNE

SUNDAY 05 AUGUST 2012

The cruel summer heat-wave that continues to scorch agricultural crops across much of the United States and which is prompting comparisons with the severe droughts of the 1930s and 1950s is also leading to record-breaking water temperatures in rivers and streams, including the Mississippi, as well as fast-falling navigation levels.

While in the northern reaches of the Mississippi, near Moline in Illinois, the temperature touched 90 degrees last week – warmer than the Gulf of Mexico around the Florida Keys – towards the river’s southern reaches the US Army Corps of Engineers is dredging around the clock to try to keep barges from grounding as water levels dive.

For scientists the impact of a long, hot summer that has plunged more than two-thirds of the country into drought conditions – sometimes extreme – has been particularly striking in the Great Lakes. According to the Great Lakes Environmental Research Laboratory, all are experiencing unusual spikes in water temperature this year. It is especially the case for Lake Superior, the northernmost, the deepest, and therefore the coolest.

“It’s pretty safe to say that what we’re seeing here is the warmest that we’ve seen in Lake Superior in a century,” said Jay Austin, a professor at the University of Minnesota at Duluth. The average temperature recorded for the lake last week was 68F (20C). That compares with 56F (13C) at this time last year.

It is a boon to shoreline residents who are finding normally chilly waters suddenly inviting for a dip. But the warming of the rivers, in particular, is taking a harsh toll on fish, which are dying in increasingly large numbers. Significant tolls of fresh-water species, from pike to trout, have been reported, most frequently in the Midwest.

“Most problems occur in ponds that are not deep enough for fish to retreat to cooler and more oxygen-rich water,” said Jake Allman of the Missouri Department of Conservation. “Hot water holds less oxygen than cool water. Shallow ponds get warmer than deeper ponds, and with little rain, area ponds are becoming shallower by the day. Evaporation rates are up to 11 inches per month in these conditions.”

In some instances, fish are simply left high and dry as rivers dry up entirely. That is the case with the normally rushing River Platte, which has simply petered out over a 100-mile stretch in Nebraska, large parts of which are now federal disaster areas contending with so-called “exceptional drought” conditions.

“This is the worst I’ve ever seen it, and I’ve been on the river since I was a pup,” Dan Kneifel, owner of Geno’s Bait and Tackle Shop, told TheOmahaChannel.com. “The river was full of fish, and to see them all die is a travesty.”

As water levels in the Mississippi ebb, so barge operators are forced to offload cargo to keep their vessels moving. About 60 per cent of exported US corn is conveyed by the Mississippi, which is now 12ft below normal levels in some stretches. Navigation on the Mississippi has not been so severely threatened since the 1988 drought in the US. Few forget, meanwhile, that last summer towns up and down the Mississippi were battling flooding.

One welcome side-effect, however, is data showing that the so-called “dead zone” in the Gulf of Mexico around the Mississippi estuary is far less extensive this summer because the lack of rain and the slow running of the water has led to much less nitrate being washed off farmland and into the system than in normal years. The phenomenon occurs because the nitrates feed blooms of algae in Gulf waters which then decompose, stripping the water of oxygen.

Chronic 2000-04 drought, worst in 800 years, may be the ‘new normal’ (Oregon State Univ)

Public release date: 29-Jul-2012

By Beverly Law

Oregon State University

CORVALLIS, Ore. – The chronic drought that hit western North America from 2000 to 2004 left dying forests and depleted river basins in its wake and was the strongest in 800 years, scientists have concluded, but they say those conditions will become the “new normal” for most of the coming century.

Such climatic extremes have increased as a result of global warming, a group of 10 researchers reported today in Nature Geoscience. And as bad as conditions were during the 2000-04 drought, they may eventually be seen as the good old days.

Climate models and precipitation projections indicate this period will actually be closer to the “wet end” of a drier hydroclimate during the last half of the 21st century, scientists said.

Aside from its impact on forests, crops, rivers and water tables, the drought also cut carbon sequestration by an average of 51 percent in a massive region of the western United States, Canada and Mexico, although some areas were hit much harder than others. As vegetation withered, this released more carbon dioxide into the atmosphere, with the effect of amplifying global warming.

“Climatic extremes such as this will cause more large-scale droughts and forest mortality, and the ability of vegetation to sequester carbon is going to decline,” said Beverly Law, a co-author of the study, professor of global change biology and terrestrial systems science at Oregon State University, and former science director of AmeriFlux, an ecosystem observation network.

“During this drought, carbon sequestration from this region was reduced by half,” Law said. “That’s a huge drop. And if global carbon emissions don’t come down, the future will be even worse.”

This research was supported by the National Science Foundation, NASA, U.S. Department of Energy, and other agencies. The lead author was Christopher Schwalm at Northern Arizona University. Other collaborators were from the University of Colorado, University of California at Berkeley, University of British Columbia, San Diego State University, and other institutions.

It’s not clear whether or not the current drought in the Midwest, now being called one of the worst since the Dust Bowl, is related to these same forces, Law said. This study did not address that, and there are some climate mechanisms in western North America that affect that region more than other parts of the country.

But in the West, this multi-year drought was unlike anything seen in many centuries, based on tree ring data. The last two periods with drought events of similar severity were in the Middle Ages, from 977-981 and 1146-1151. The 2000-04 drought affected precipitation, soil moisture, river levels, crops, forests and grasslands.

Ordinarily, Law said, the land sink in North America is able to sequester the equivalent of about 30 percent of the carbon emitted into the atmosphere by the use of fossil fuels in the same region. However, based on projected changes in precipitation and drought severity, scientists said that this carbon sink, at least in western North America, could disappear by the end of the century.

“Areas that are already dry in the West are expected to get drier,” Law said. “We expect more extremes. And it’s these extreme periods that can really cause ecosystem damage, lead to climate-induced mortality of forests, and may cause some areas to convert from forest into shrublands or grassland.”

During the 2000-04 drought, runoff in the upper Colorado River basin was cut in half. Crop productivity in much of the West fell 5 percent. The productivity of forests and grasslands declined, along with snowpacks. Evapotranspiration decreased the most in evergreen needleleaf forests, about 33 percent.

The effects are driven by human-caused increases in temperature, with associated lower soil moisture and decreased runoff in all major water basins of the western U.S., researchers said in the study.

Although regional precipitation patterns are difficult to forecast, researchers in this report said that climate models are underestimating the extent and severity of drought, compared to actual observations. They say the situation will continue to worsen, and that 80 of the 95 years from 2006 to 2100 will have precipitation levels as low as, or lower than, this “turn of the century” drought from 2000-04.

“Towards the latter half of the 21st century the precipitation regime associated with the turn of the century drought will represent an outlier of extreme wetness,” the scientists wrote in this study.

These long-term trends are consistent with a 21st century “megadrought,” they said.

Need an Expert? Try the Crowd (Science Daily)

ScienceDaily (Aug. 14, 2012) — “It’s potentially a new way to do science.”

In 1714, the British government held a contest. They offered a large cash prize to anyone who could solve the vexing “longitude problem” — how to determine a ship’s east/west position on the open ocean — since none of their naval experts had been able to do so.

Lots of people gave it a try. One of them, a self-educated carpenter named John Harrison, invented the marine chronometer — a rugged and highly precise clock — that did the trick. For the first time, sailors could accurately determine their location at sea.

A centuries-old problem was solved. And, arguably, crowdsourcing was born.

Crowdsourcing is basically what it sounds like: posing a question or asking for help from a large group of people. Coined as a term in 2006, crowdsourcing has taken off in the internet era. Think of Wikipedia, and its thousands of unpaid contributors, now vastly larger than the Encyclopedia Britannica.

Crowdsourcing has allowed many problems to be solved that would be impossible for experts alone. Astronomers rely on an army of volunteers to scan for new galaxies. At climateprediction.net, citizens have linked their home computers to yield more than a hundred million hours of climate modeling; it’s the world’s largest forecasting experiment.

But what if experts didn’t simply ask the crowd to donate time or answer questions? What if the crowd was asked to decide what questions to ask in the first place?

Could the crowd itself be the expert?

That’s what a team at the University of Vermont decided to explore — and the answer seems to be yes.

Prediction from the people

Josh Bongard and Paul Hines, professors in UVM’s College of Engineering and Mathematical Sciences, and their students, set out to discover whether volunteers who visited two different websites could pose, refine, and answer each other’s questions — questions that could effectively predict the volunteers’ body weight and home electricity use.

The experiment, the first of its kind, was a success: the self-directed questions and answers by visitors to the websites led to computer models that effectively predict users’ monthly electricity consumption and body mass index.

Their results, “Crowdsourcing Predictors of Behavioral Outcomes,” were published in a recent edition of IEEE Transactions: Systems, Man and Cybernetics, a journal of the Institute of Electrical and Electronics Engineers.

“It’s proof of concept that a crowd actually can come up with good questions that lead to good hypotheses,” says Bongard, an expert on machine science.

In other words, the wisdom of the crowd can be harnessed to determine which variables to study, the UVM project shows — and at the same time provide a pool of data by responding to the questions they ask of each other.

“The result is a crowdsourced predictive model,” the Vermont scientists write.
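
In outline, and with made-up data rather than the responses collected on the UVM websites, the pipeline looks something like this: volunteers’ answers become candidate predictors, and each crowd-posed question is ranked by how much of the outcome it explains on its own before the best ones are combined into a model.

    # Sketch of a crowdsourced-predictor pipeline (illustrative only).
    import numpy as np

    rng = np.random.default_rng(2)
    n_people = 200

    # Hypothetical answers to three crowd-posed questions, scaled to 0-1.
    answers = {
        "thinks of self as overweight": rng.random(n_people),
        "meals eaten per day": rng.random(n_people),
        "minutes of exercise per week": rng.random(n_people),
    }

    # Hypothetical outcome, loosely driven by the first question plus noise.
    weight = (70 + 25 * answers["thinks of self as overweight"]
              + rng.normal(0, 5, n_people))

    def r_squared(x, y):
        """Share of the outcome's variance a single question explains."""
        return float(np.corrcoef(x, y)[0, 1] ** 2)

    ranking = sorted(answers, key=lambda q: r_squared(answers[q], weight),
                     reverse=True)
    for question in ranking:
        print(f"{question}: R^2 = {r_squared(answers[question], weight):.2f}")
    # The top-ranked questions feed a combined predictive model; new questions
    # posed by visitors enter the same ranking.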

Unexpected angles

Some of the questions the volunteers posed were obvious. For example, on the website dedicated to exploring body weight, visitors came up with the question: “Do you think of yourself as overweight?” And, no surprise, that proved to be the question with the most power to predict people’s body weight.

But some questions posed by the volunteers were less obvious. “We had some eye-openers,” Bongard says. “How often do you masturbate a month?” might not be the first question asked by weight-loss experts, but it proved to be the second-most-predictive question of the volunteer’s self-reported weights — more predictive than “how often do you eat during a day?”

“Sometimes the general public has intuition about stuff that experts miss — there’s a long literature on this,” Hines says.

“It’s those people who are very underweight or very overweight who might have an explanation for why they’re at these extremes — and some of those explanations might not be a simple combination of diet and exercise,” says Bongard. “There might be other things that experts missed.”

Cause and correlation

The researchers are quick to note that the variables revealed by the evolving Q&A on the experimental websites are simply correlated to outcomes — body weight and electricity use — not necessarily the cause.

“We’re not arguing that this study is actually predictive of the causes,” says Hines, “but improvements to this method may lead in that direction.”

Nor do the scientists make claim to being experts on body weight or to be providing recommendations on health or diet (though Hines is an expert on electricity, and the EnergyMinder site he and his students developed for this project has a larger aim to help citizens understand and reduce their household energy use.)

“We’re simply investigating the question: could you involve participants in the hypothesis-generation part of the scientific process?” Bongard says. “Our paper is a demonstration of this methodology.”

“Going forward, this approach may allow us to involve the public in deciding what it is that is interesting to study,” says Hines. “It’s potentially a new way to do science.”

And there are many reasons why this new approach might be helpful. In addition to forces that experts might simply not know about — “can we elicit unexpected predictors that an expert would not have come up with sitting in his office?” Hines asks — experts often have deeply held biases.

Faster discoveries

But the UVM team primarily sees their new approach as potentially helping to accelerate the process of scientific discovery. The need for expert involvement — in shaping, say, what questions to ask on a survey or what variable to change to optimize an engineering design — “can become a bottleneck to new insights,” the scientists write.

“We’re looking for an experimental platform where, instead of waiting to read a journal article every year about what’s been learned about obesity,” Bongard says, “a research site could be changing and updating new findings constantly as people add their questions and insights.”

The goal: “exponential rises,” the UVM scientists write, in the discovery of what causes behaviors and patterns — probably driven by the people who care about them the most. For example, “it might be smokers or people suffering from various diseases,” says Bongard. The team thinks this new approach to science could “mirror the exponential growth found in other online collaborative communities,” they write.

“We’re all problem-solving animals,” says Bongard, “so can we exploit that? Instead of just exploiting the cycles of your computer or your ability to say ‘yes’ or ‘no’ on a survey — can we exploit your creative brain?”

Global Warming’s Terrifying New Math (Rolling Stone)

Three simple numbers that add up to global catastrophe – and that make clear who the real enemy is

by: Bill McKibben

Illustration by Edel Rodriguez

If the pictures of those towering wildfires in Colorado haven’t convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average, the odds of which occurring by simple chance were about one in 3.7 x 10^99 – a denominator considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the “largest temperature departure from average of any season on record.” The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet’s history.

Not that our leaders seemed to notice. Last month the world’s nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn’t even attend. It was “a ghost of the glad, confident meeting 20 years ago,” the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls “once thronged by multitudes.” Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I’ve spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we’re losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world’s nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the “most important gathering since the Second World War, given what is at stake.” As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: “This is our chance. If we miss it, it could take years before we get a new and better one. If ever.”

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving “Copenhagen Accord” that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. “Copenhagen is a crime scene tonight,” an angry Greenpeace official declared, “with the guilty men and women fleeing to the airport.” Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized “the scientific view that the increase in global temperature should be below two degrees Celsius.” And in the very next paragraph, it declared that “we agree that deep cuts in global emissions are required… so as to hold the increase in global temperature below two degrees Celsius.” By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8, and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we’ve raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. “Any number much above one degree involves a gamble,” writes Kerry Emanuel of MIT, a leading authority on hurricanes, “and the odds become less and less favorable as the temperature goes up.” Thomas Lovejoy, once the World Bank’s chief biodiversity adviser, puts it like this: “If we’re seeing what we’re seeing today at 0.8 degrees Celsius, two degrees is simply too much.” NASA scientist James Hansen, the planet’s most prominent climatologist, is even blunter: “The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster.” At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: “Some countries will flat-out disappear.” When delegates from developing nations were warned that two degrees would represent a “suicide pact” for drought-stricken Africa, many of them started chanting, “One degree, one Africa.”

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it’s fair to say that it’s the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world’s carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can’t raise the temperature more than two degrees Celsius – it’s become the bottomest of bottom lines. Two degrees.

The Second Number: 565 Gigatons

Scientists estimate that humans can pour roughly 565 more gigatons of carbon dioxide into the atmosphere by midcentury and still have some reasonable hope of staying below two degrees. (“Reasonable,” in this case, means four chances in five, or somewhat worse odds than playing Russian roulette with a six-shooter.)

This idea of a global “carbon budget” emerged about a decade ago, as scientists began to calculate how much oil, coal and gas could still safely be burned. Since we’ve increased the Earth’s temperature by 0.8 degrees so far, we’re currently less than halfway to the target. But, in fact, computer models calculate that even if we stopped increasing CO2 now, the temperature would likely still rise another 0.8 degrees, as previously released carbon continues to overheat the atmosphere. That means we’re already three-quarters of the way to the two-degree target.

How good are these numbers? No one is insisting that they’re exact, but few dispute that they’re generally right. The 565-gigaton figure was derived from one of the most sophisticated computer-simulation models that have been built by climate scientists around the world over the past few decades. And the number is being further confirmed by the latest climate-simulation models currently being finalized in advance of the next report by the Intergovernmental Panel on Climate Change. “Looking at them as they come in, they hardly differ at all,” says Tom Wigley, an Australian climatologist at the National Center for Atmospheric Research. “There’s maybe 40 models in the data set now, compared with 20 before. But so far the numbers are pretty much the same. We’re just fine-tuning things. I don’t think much has changed over the last decade.” William Collins, a senior climate scientist at the Lawrence Berkeley National Laboratory, agrees. “I think the results of this round of simulations will be quite similar,” he says. “We’re not getting any free lunch from additional understanding of the climate system.”

We’re not getting any free lunch from the world’s economies, either. With only a single year’s lull in 2009 at the height of the financial crisis, we’ve continued to pour record amounts of carbon into the atmosphere, year after year. In late May, the International Energy Agency published its latest figures – CO2 emissions last year rose to 31.6 gigatons, up 3.2 percent from the year before. America had a warm winter and converted more coal-fired power plants to natural gas, so its emissions fell slightly; China kept booming, so its carbon output (which recently surpassed the U.S.) rose 9.3 percent; the Japanese shut down their fleet of nukes post-Fukushima, so their emissions edged up 2.4 percent. “There have been efforts to use more renewable energy and improve energy efficiency,” said Corinne Le Quéré, who runs England’s Tyndall Centre for Climate Change Research. “But what this shows is that so far the effects have been marginal.” In fact, study after study predicts that carbon emissions will keep growing by roughly three percent a year – and at that rate, we’ll blow through our 565-gigaton allowance in 16 years, around the time today’s preschoolers will be graduating from high school. “The new data provide further evidence that the door to a two-degree trajectory is about to close,” said Fatih Birol, the IEA’s chief economist. In fact, he continued, “When I look at this data, the trend is perfectly in line with a temperature increase of about six degrees.” That’s almost 11 degrees Fahrenheit, which would create a planet straight out of science fiction.
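The “16 years” figure follows from a simple compound-growth calculation. Here is a minimal sketch, assuming the IEA’s 31.6-gigaton annual figure and a flat three percent growth rate (both round numbers taken from the paragraph above, so the exact year of exhaustion shifts with the assumptions):

```python
# Rough back-of-the-envelope check of the "16 years" claim above, using only
# the round numbers quoted in the article: annual emissions of 31.6 gigatons
# of CO2, growing about 3 percent a year, against a 565-gigaton budget.

budget_gt = 565.0    # remaining carbon budget, gigatons of CO2
annual_gt = 31.6     # latest annual global emissions, gigatons of CO2
growth = 0.03        # assumed yearly growth in emissions

cumulative = 0.0
years = 0
while cumulative < budget_gt:
    cumulative += annual_gt   # add this year's emissions to the running total
    annual_gt *= 1 + growth   # emissions grow ~3 percent the following year
    years += 1

print(f"Budget used up after roughly {years} years ({cumulative:.0f} Gt emitted)")
# With these round numbers the budget runs out in roughly 15-16 years,
# consistent with the article's estimate.
```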

So, new data in hand, everyone at the Rio conference renewed their ritual calls for serious international action to move us back to a two-degree trajectory. The charade will continue in November, when the next Conference of the Parties (COP) of the U.N. Framework Convention on Climate Change convenes in Qatar. This will be COP 18 – COP 1 was held in Berlin in 1995, and since then the process has accomplished essentially nothing. Even scientists, who are notoriously reluctant to speak out, are slowly overcoming their natural preference to simply provide data. “The message has been consistent for close to 30 years now,” Collins says with a wry laugh, “and we have the instrumentation and the computer power required to present the evidence in detail. If we choose to continue on our present course of action, it should be done with a full evaluation of the evidence the scientific community has presented.” He pauses, suddenly conscious of being on the record. “I should say, a fuller evaluation of the evidence.”

So far, though, such calls have had little effect. We’re in the same position we’ve been in for a quarter-century: scientific warning followed by political inaction. Among scientists speaking off the record, disgusted candor is the rule. One senior scientist told me, “You know those new cigarette packs, where governments make them put a picture of someone with a hole in their throats? Gas pumps should have something like that.”

The Third Number: 2,795 Gigatons

This number is the scariest of all – one that, for the first time, meshes the political and scientific dimensions of our dilemma. It was highlighted last summer by the Carbon Tracker Initiative, a team of London financial analysts and environmentalists who published a report in an effort to educate investors about the possible risks that climate change poses to their stock portfolios. The number describes the amount of carbon already contained in the proven coal and oil and gas reserves of the fossil-fuel companies, and the countries (think Venezuela or Kuwait) that act like fossil-fuel companies. In short, it’s the fossil fuel we’re currently planning to burn. And the key point is that this new number – 2,795 – is higher than 565. Five times higher.

The Carbon Tracker Initiative – led by James Leaton, an environmentalist who served as an adviser at the accounting giant PricewaterhouseCoopers – combed through proprietary databases to figure out how much oil, gas and coal the world’s major energy companies hold in reserve. The numbers aren’t perfect – they don’t fully reflect the recent surge in unconventional energy sources like shale gas, and they don’t accurately reflect coal reserves, which are subject to less stringent reporting requirements than oil and gas. But for the biggest companies, the figures are quite exact: If you burned everything in the inventories of Russia’s Lukoil and America’s ExxonMobil, for instance, which lead the list of oil and gas companies, each would release more than 40 gigatons of carbon dioxide into the atmosphere.

Which is exactly why this new number, 2,795 gigatons, is such a big deal. Think of two degrees Celsius as the legal drinking limit – equivalent to the 0.08 blood-alcohol level below which you might get away with driving home. The 565 gigatons is how many drinks you could have and still stay below that limit – the six beers, say, you might consume in an evening. And the 2,795 gigatons? That’s the three 12-packs the fossil-fuel industry has on the table, already opened and ready to pour.

We have five times as much oil and coal and gas on the books as climate scientists think is safe to burn. We’d have to keep 80 percent of those reserves locked away underground to avoid that fate. Before we knew those numbers, our fate had been likely. Now, barring some massive intervention, it seems certain.

Yes, this coal and gas and oil is still technically in the soil. But it’s already economically aboveground – it’s figured into share prices, companies are borrowing money against it, nations are basing their budgets on the presumed returns from their patrimony. It explains why the big fossil-fuel companies have fought so hard to prevent the regulation of carbon dioxide – those reserves are their primary asset, the holding that gives their companies their value. It’s why they’ve worked so hard these past years to figure out how to unlock the oil in Canada’s tar sands, or how to drill miles beneath the sea, or how to frack the Appalachians.

If you told Exxon or Lukoil that, in order to avoid wrecking the climate, they couldn’t pump out their reserves, the value of their companies would plummet. John Fullerton, a former managing director at JP Morgan who now runs the Capital Institute, calculates that at today’s market value, those 2,795 gigatons of carbon emissions are worth about $27 trillion. Which is to say, if you paid attention to the scientists and kept 80 percent of it underground, you’d be writing off $20 trillion in assets. The numbers aren’t exact, of course, but that carbon bubble makes the housing bubble look small by comparison. It won’t necessarily burst – we might well burn all that carbon, in which case investors will do fine. But if we do, the planet will crater. You can have a healthy fossil-fuel balance sheet, or a relatively healthy planet – but now that we know the numbers, it looks like you can’t have both. Do the math: 2,795 is five times 565. That’s how the story ends.
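The ratios in this section are simple enough to verify directly. A minimal sketch, using only the round figures quoted above (2,795 and 565 gigatons, and Fullerton’s $27 trillion valuation), so the outputs are approximations rather than precise estimates:

```python
# Back-of-the-envelope check of the ratios quoted above, using only the
# article's round numbers; all figures are rough estimates.

reserves_gt = 2795.0    # carbon in proven fossil-fuel reserves, gigatons of CO2
budget_gt = 565.0       # remaining two-degree carbon budget, gigatons of CO2
reserves_value = 27.0   # approximate market value of those reserves, trillion dollars

ratio = reserves_gt / budget_gt             # roughly five times the budget
burnable = budget_gt / reserves_gt          # only ~20 percent of reserves fit in the budget
stranded = 1 - burnable                     # ~80 percent would have to stay underground
stranded_value = stranded * reserves_value  # about $21-22 trillion, which the article rounds to $20 trillion

print(f"Reserves are {ratio:.1f}x the budget; {stranded:.0%} stranded, "
      f"worth about ${stranded_value:.0f} trillion")
```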

So far, as I said at the start, environmental efforts to tackle global warming have failed. The planet’s emissions of carbon dioxide continue to soar, especially as developing countries emulate (and supplant) the industries of the West. Even in rich countries, small reductions in emissions offer no sign of the real break with the status quo we’d need to upend the iron logic of these three numbers. Germany is one of the only big countries that has actually tried hard to change its energy mix; on one sunny Saturday in late May, that northern-latitude nation generated nearly half its power from solar panels within its borders. That’s a small miracle – and it demonstrates that we have the technology to solve our problems. But we lack the will. So far, Germany’s the exception; the rule is ever more carbon.

This record of failure means we know a lot about what strategies don’t work. Green groups, for instance, have spent a lot of time trying to change individual lifestyles: the iconic twisty light bulb has been installed by the millions, but so has a new generation of energy-sucking flatscreen TVs. Most of us are fundamentally ambivalent about going green: We like cheap flights to warm places, and we’re certainly not going to give them up if everyone else is still taking them. Since all of us are in some way the beneficiaries of cheap fossil fuel, tackling climate change has been like trying to build a movement against yourself – it’s as if the gay-rights movement had to be constructed entirely from evangelical preachers, or the abolition movement from slaveholders.

People perceive – correctly – that their individual actions will not make a decisive difference in the atmospheric concentration of CO2; by 2010, a poll found that “while recycling is widespread in America and 73 percent of those polled are paying bills online in order to save paper,” only four percent had reduced their utility use and only three percent had purchased hybrid cars. Given a hundred years, you could conceivably change lifestyles enough to matter – but time is precisely what we lack.

A more efficient method, of course, would be to work through the political system, and environmentalists have tried that, too, with the same limited success. They’ve patiently lobbied leaders, trying to convince them of our peril and assuming that politicians would heed the warnings. Sometimes it has seemed to work. Barack Obama, for instance, campaigned more aggressively about climate change than any president before him – the night he won the nomination, he told supporters that his election would mark the moment “the rise of the oceans began to slow and the planet began to heal.” And he has achieved one significant change: a steady increase in the fuel efficiency mandated for automobiles. It’s the kind of measure, adopted a quarter-century ago, that would have helped enormously. But in light of the numbers I’ve just described, it’s obviously a very small start indeed.

At this point, effective action would require actually keeping most of the carbon the fossil-fuel industry wants to burn safely in the soil, not just changing slightly the speed at which it’s burned. And there the president, apparently haunted by the still-echoing cry of “Drill, baby, drill,” has gone out of his way to frack and mine. His secretary of interior, for instance, opened up a huge swath of the Powder River Basin in Wyoming for coal extraction: The total basin contains some 67.5 gigatons worth of carbon (or more than 10 percent of the available atmospheric space). He’s doing the same thing with Arctic and offshore drilling; in fact, as he explained on the stump in March, “You have my word that we will keep drilling everywhere we can… That’s a commitment that I make.” The next day, in a yard full of oil pipe in Cushing, Oklahoma, the president promised to work on wind and solar energy but, at the same time, to speed up fossil-fuel development: “Producing more oil and gas here at home has been, and will continue to be, a critical part of an all-of-the-above energy strategy.” That is, he’s committed to finding even more stock to add to the 2,795-gigaton inventory of unburned carbon.

Sometimes the irony is almost Borat-scale obvious: In early June, Secretary of State Hillary Clinton traveled on a Norwegian research trawler to see firsthand the growing damage from climate change. “Many of the predictions about warming in the Arctic are being surpassed by the actual data,” she said, describing the sight as “sobering.” But the discussions she traveled to Scandinavia to have with other foreign ministers were mostly about how to make sure Western nations get their share of the estimated $9 trillion in oil (that’s more than 90 billion barrels, or 37 gigatons of carbon) that will become accessible as the Arctic ice melts. Last month, the Obama administration indicated that it would give Shell permission to start drilling in sections of the Arctic.

Almost every government with deposits of hydrocarbons straddles the same divide. Canada, for instance, is a liberal democracy renowned for its internationalism – no wonder, then, that it signed on to the Kyoto treaty, promising to cut its carbon emissions substantially by 2012. But the rising price of oil suddenly made the tar sands of Alberta economically attractive – and since, as NASA climatologist James Hansen pointed out in May, they contain as much as 240 gigatons of carbon (or almost half of the available space if we take the 565 limit seriously), that meant Canada’s commitment to Kyoto was nonsense. In December, the Canadian government withdrew from the treaty before it faced fines for failing to meet its commitments.

The same kind of hypocrisy applies across the ideological board: In his speech to the Copenhagen conference, Venezuela’s Hugo Chavez quoted Rosa Luxemburg, Jean-Jacques Rousseau and “Christ the Redeemer,” insisting that “climate change is undoubtedly the most devastating environmental problem of this century.” But the next spring, in the Simon Bolivar Hall of the state-run oil company, he signed an agreement with a consortium of international players to develop the vast Orinoco tar sands as “the most significant engine for a comprehensive development of the entire territory and Venezuelan population.” The Orinoco deposits are larger than Alberta’s – taken together, they’d fill up the whole available atmospheric space.

So: the paths we have tried in tackling global warming have so far produced only gradual, halting shifts. A rapid, transformative change would require building a movement, and movements require enemies. As John F. Kennedy put it, “The civil rights movement should thank God for Bull Connor. He’s helped it as much as Abraham Lincoln.” And enemies are what climate change has lacked.

But what all these climate numbers make painfully, usefully clear is that the planet does indeed have an enemy – one far more committed to action than governments or individuals. Given this hard math, we need to view the fossil-fuel industry in a new light. It has become a rogue industry, reckless like no other force on Earth. It is Public Enemy Number One to the survival of our planetary civilization. “Lots of companies do rotten things in the course of their business – pay terrible wages, make people work in sweatshops – and we pressure them to change those practices,” says veteran anti-corporate leader Naomi Klein, who is at work on a book about the climate crisis. “But these numbers make clear that with the fossil-fuel industry, wrecking the planet is their business model. It’s what they do.”

According to the Carbon Tracker report, if Exxon burns its current reserves, it would use up more than seven percent of the available atmospheric space between us and the risk of two degrees. BP is just behind, followed by the Russian firm Gazprom, then Chevron, ConocoPhillips and Shell, each of which would fill between three and four percent. Taken together, just these six firms, of the 200 listed in the Carbon Tracker report, would use up more than a quarter of the remaining two-degree budget. Severstal, the Russian mining giant, leads the list of coal companies, followed by firms like BHP Billiton and Peabody. The numbers are simply staggering – this industry, and this industry alone, holds the power to change the physics and chemistry of our planet, and they’re planning to use it.

They’re clearly cognizant of global warming – they employ some of the world’s best scientists, after all, and they’re bidding on all those oil leases made possible by the staggering melt of Arctic ice. And yet they relentlessly search for more hydrocarbons – in early March, Exxon CEO Rex Tillerson told Wall Street analysts that the company plans to spend $37 billion a year through 2016 (about $100 million a day) searching for yet more oil and gas.

There’s not a more reckless man on the planet than Tillerson. Late last month, on the same day the Colorado fires reached their height, he told a New York audience that global warming is real, but dismissed it as an “engineering problem” that has “engineering solutions.” Such as? “Changes to weather patterns that move crop-production areas around – we’ll adapt to that.” This in a week when Kentucky farmers were reporting that corn kernels were “aborting” in record heat, threatening a spike in global food prices. “The fear factor that people want to throw out there to say, ‘We just have to stop this,’ I do not accept,” Tillerson said. Of course not – if he did accept it, he’d have to keep his reserves in the ground. Which would cost him money. It’s not an engineering problem, in other words – it’s a greed problem.

You could argue that this is simply in the nature of these companies – that having found a profitable vein, they’re compelled to keep mining it, more like efficient automatons than people with free will. But as the Supreme Court has made clear, they are people of a sort. In fact, thanks to the size of its bankroll, the fossil-fuel industry has far more free will than the rest of us. These companies don’t simply exist in a world whose hungers they fulfill – they help create the boundaries of that world.

Left to our own devices, citizens might decide to regulate carbon and stop short of the brink; according to a recent poll, nearly two-thirds of Americans would back an international agreement that cut carbon emissions 90 percent by 2050. But we aren’t left to our own devices. The Koch brothers, for instance, have a combined wealth of $50 billion, meaning they trail only Bill Gates on the list of richest Americans. They’ve made most of their money in hydrocarbons, they know any system to regulate carbon would cut those profits, and they reportedly plan to lavish as much as $200 million on this year’s elections. In 2009, for the first time, the U.S. Chamber of Commerce surpassed both the Republican and Democratic National Committees on political spending; the following year, more than 90 percent of the Chamber’s cash went to GOP candidates, many of whom deny the existence of global warming. Not long ago, the Chamber even filed a brief with the EPA urging the agency not to regulate carbon – should the world’s scientists turn out to be right and the planet heats up, the Chamber advised, “populations can acclimatize to warmer climates via a range of behavioral, physiological and technological adaptations.” As radical goes, demanding that we change our physiology seems right up there.

Environmentalists, understandably, have been loath to make the fossil-fuel industry their enemy, respecting its political power and hoping instead to convince these giants that they should turn away from coal, oil and gas and transform themselves more broadly into “energy companies.” Sometimes that strategy appeared to be working – emphasis on appeared. Around the turn of the century, for instance, BP made a brief attempt to restyle itself as “Beyond Petroleum,” adapting a logo that looked like the sun and sticking solar panels on some of its gas stations. But its investments in alternative energy were never more than a tiny fraction of its budget for hydrocarbon exploration, and after a few years, many of those were wound down as new CEOs insisted on returning to the company’s “core business.” In December, BP finally closed its solar division. Shell shut down its solar and wind efforts in 2009. The five biggest oil companies have made more than $1 trillion in profits since the millennium – there’s simply too much money to be made on oil and gas and coal to go chasing after zephyrs and sunbeams.

Much of that profit stems from a single historical accident: Alone among businesses, the fossil-fuel industry is allowed to dump its main waste, carbon dioxide, for free. Nobody else gets that break – if you own a restaurant, you have to pay someone to cart away your trash, since piling it in the street would breed rats. But the fossil-fuel industry is different, and for sound historical reasons: Until a quarter-century ago, almost no one knew that CO2 was dangerous. But now that we understand that carbon is heating the planet and acidifying the oceans, its price becomes the central issue.

If you put a price on carbon, through a direct tax or other methods, it would enlist markets in the fight against global warming. Once Exxon has to pay for the damage its carbon is doing to the atmosphere, the price of its products would rise. Consumers would get a strong signal to use less fossil fuel – every time they stopped at the pump, they’d be reminded that you don’t need a semimilitary vehicle to go to the grocery store. The economic playing field would now be a level one for nonpolluting energy sources. And you could do it all without bankrupting citizens – a so-called “fee-and-dividend” scheme would put a hefty tax on coal and gas and oil, then simply divide up the proceeds, sending everyone in the country a check each month for their share of the added costs of carbon. By switching to cleaner energy sources, most people would actually come out ahead.
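To make the mechanism concrete, here is a minimal sketch of the fee-and-dividend idea with entirely made-up numbers (the $50-per-ton fee and the household emission figures are illustrative assumptions, not anything proposed in the article): every household pays in proportion to the carbon it burns, the pooled fees are returned as identical checks, and anyone burning less than the average comes out ahead.

```python
# Illustrative sketch of a fee-and-dividend scheme. The fee level and the
# household emission figures below are made-up numbers for demonstration only.

fee_per_ton = 50.0   # hypothetical carbon fee, dollars per ton of CO2

# hypothetical annual household carbon footprints, tons of CO2
households = {"light user": 8.0, "average user": 16.0, "heavy user": 32.0}

total_fees = fee_per_ton * sum(households.values())
dividend = total_fees / len(households)    # every household gets the same check

for name, tons in households.items():
    paid = fee_per_ton * tons
    net = dividend - paid                  # positive means the household comes out ahead
    print(f"{name}: pays ${paid:,.0f}, receives ${dividend:,.0f}, nets {net:+,.0f}")
```

Because actual fossil-fuel use is skewed toward the heaviest users, a uniform dividend leaves the typical household ahead, which is the point of the “most people would actually come out ahead” claim above.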

There’s only one problem: Putting a price on carbon would reduce the profitability of the fossil-fuel industry. After all, the answer to the question “How high should the price of carbon be?” is “High enough to keep those carbon reserves that would take us past two degrees safely in the ground.” The higher the price on carbon, the more of those reserves would be worthless. The question, in the end, is whether the industry will succeed in its fight to keep its special pollution break alive past the point of climate catastrophe, or whether, in the economists’ parlance, we’ll make it internalize those externalities.

It’s not clear, of course, that the power of the fossil-fuel industry can be broken. The U.K. analysts who wrote the Carbon Tracker report and drew attention to these numbers had a relatively modest goal – they simply wanted to remind investors that climate change poses a very real risk to the stock prices of energy companies. Say something so big finally happens (a giant hurricane swamps Manhattan, a megadrought wipes out Midwest agriculture) that even the political power of the industry is inadequate to restrain legislators, who manage to regulate carbon. Suddenly those Chevron reserves would be a lot less valuable, and the stock would tank. Given that risk, the Carbon Tracker report warned investors to lessen their exposure, hedge it with some big plays in alternative energy.

“The regular process of economic evolution is that businesses are left with stranded assets all the time,” says Nick Robins, who runs HSBC’s Climate Change Centre. “Think of film cameras, or typewriters. The question is not whether this will happen. It will. Pension systems have been hit by the dot-com and credit crunch. They’ll be hit by this.” Still, it hasn’t been easy to convince investors, who have shared in the oil industry’s record profits. “The reason you get bubbles,” sighs Leaton, “is that everyone thinks they’re the best analyst – that they’ll go to the edge of the cliff and then jump back when everyone else goes over.”

So pure self-interest probably won’t spark a transformative challenge to fossil fuel. But moral outrage just might – and that’s the real meaning of this new math. It could, plausibly, give rise to a real movement.

Once, in recent corporate history, anger forced an industry to make basic changes. That was the campaign in the 1980s demanding divestment from companies doing business in South Africa. It rose first on college campuses and then spread to municipal and state governments; 155 campuses eventually divested, and by the end of the decade, more than 80 cities, 25 states and 19 counties had taken some form of binding economic action against companies connected to the apartheid regime. “The end of apartheid stands as one of the crowning accomplishments of the past century,” as Archbishop Desmond Tutu put it, “but we would not have succeeded without the help of international pressure,” especially from “the divestment movement of the 1980s.”

The fossil-fuel industry is obviously a tougher opponent, and even if you could force the hand of particular companies, you’d still have to figure out a strategy for dealing with all the sovereign nations that, in effect, act as fossil-fuel companies. But the link for college students is even more obvious in this case. If their college’s endowment portfolio has fossil-fuel stock, then their educations are being subsidized by investments that guarantee they won’t have much of a planet on which to make use of their degree. (The same logic applies to the world’s largest investors, pension funds, which are also theoretically interested in the future – that’s when their members will “enjoy their retirement.”) “Given the severity of the climate crisis, a comparable demand that our institutions dump stock from companies that are destroying the planet would not only be appropriate but effective,” says Bob Massie, a former anti-apartheid activist who helped found the Investor Network on Climate Risk. “The message is simple: We have had enough. We must sever the ties with those who profit from climate change – now.”

Movements rarely have predictable outcomes. But any campaign that weakens the fossil-fuel industry’s political standing clearly increases the chances of retiring its special breaks. Consider President Obama’s signal achievement in the climate fight, the large increase he won in mileage requirements for cars. Scientists, environmentalists and engineers had advocated such policies for decades, but until Detroit came under severe financial pressure, it was politically powerful enough to fend them off. If people come to understand the cold, mathematical truth – that the fossil-fuel industry is systematically undermining the planet’s physical systems – it might weaken it enough to matter politically. Exxon and their ilk might drop their opposition to a fee-and-dividend solution; they might even decide to become true energy companies, this time for real.

Even if such a campaign is possible, however, we may have waited too long to start it. To make a real difference – to keep us under a temperature increase of two degrees – you’d need to change carbon pricing in Washington, and then use that victory to leverage similar shifts around the world. At this point, what happens in the U.S. is most important for how it will influence China and India, where emissions are growing fastest. (In early June, researchers concluded that China has probably under-reported its emissions by up to 20 percent.) The three numbers I’ve described are daunting – they may define an essentially impossible future. But at least they provide intellectual clarity about the greatest challenge humans have ever faced. We know how much we can burn, and we know who’s planning to burn more. Climate change operates on a geological scale and time frame, but it’s not an impersonal force of nature; the more carefully you do the math, the more thoroughly you realize that this is, at bottom, a moral issue; we have met the enemy and they is Shell.

Meanwhile the tide of numbers continues. The week after the Rio conference limped to its conclusion, Arctic sea ice hit the lowest level ever recorded for that date. Last month, on a single weekend, Tropical Storm Debby dumped more than 20 inches of rain on Florida – the earliest the season’s fourth-named cyclone has ever arrived. At the same time, the largest fire in New Mexico history burned on, and the most destructive fire in Colorado’s annals claimed 346 homes in Colorado Springs – breaking a record set the week before in Fort Collins. This month, scientists issued a new study concluding that global warming has dramatically increased the likelihood of severe heat and drought – days after a heat wave across the Plains and Midwest broke records that had stood since the Dust Bowl, threatening this year’s harvest. You want a big number? In the course of this month, a quadrillion kernels of corn need to pollinate across the grain belt, something they can’t do if temperatures remain off the charts. Just like us, our crops are adapted to the Holocene, the 11,000-year period of climatic stability we’re now leaving… in the dust.

This story is from the August 2nd, 2012 issue of Rolling Stone.