Tag archive: Prediction

How Computation Can Predict Group Conflict: Fighting Among Captive Pigtailed Macaques Provides Clues (Science Daily)

ScienceDaily (Aug. 13, 2012) — When conflict breaks out in social groups, individuals make strategic decisions about how to behave based on their understanding of alliances and feuds in the group.

Researchers studied fighting among captive pigtailed macaques for clues about behavior and group conflict. (Credit: iStockphoto/Natthaphong Phanthumchinda)

But it’s been challenging to quantify the underlying trends that dictate how individuals make predictions, given they may only have seen a small number of fights or have limited memory.

In a new study, scientists at the Wisconsin Institute for Discovery (WID) at UW-Madison developed a computational approach to determine whether individuals behave predictably. Using data from previous fights, the team asked how much memory individuals in the group would need in order to make such predictions themselves. The analysis yields a novel estimate of “cognitive burden,” the minimal amount of information an organism needs to remember to make a prediction.

The research draws on a concept called “sparse coding,” the brain’s tendency to use a few visual features and a small number of neurons to store an image or scene. Previous studies support the idea that neurons in the brain respond to a few large-scale features, such as the lines, edges and orientations within images, rather than to many smaller details.

“So what you get is a model where you have to remember fewer things but you still get very high predictive power — that’s what we’re interested in,” says Bryan Daniels, a WID researcher who led the study. “What is the trade-off? What’s the minimum amount of ‘stuff’ an individual has to remember to make good inferences about future events?”

To find out, Daniels — along with WID co-authors Jessica Flack and David Krakauer — drew comparisons from how brains and computers encode information. The results contribute to ongoing discussions about conflict in biological systems and how cognitive organisms understand their environments.

The study, published in the Aug. 13 edition of the Proceedings of the National Academy of Sciences, examined observed bouts of natural fighting in a group of 84 captive pigtailed macaques at the Yerkes National Primate Research Center. By recording individuals’ involvement — or lack thereof — in fights, the group created models that mapped the likelihood any number of individuals would engage in conflict in hypothetical situations.

To confirm the predictive power of the models, the group plugged in held-out data from the monkey group that had not been used to build the models, then compared these simulations with what actually happened in the group. One model treated conflict as combinations of pairs, while another represented fights as sparse combinations of clusters; the latter proved the better tool for predicting fights. From there, by removing information until predictions became worse, Daniels and colleagues calculated the amount of information each individual needed to remember to make the most informed decision about whether to fight or flee.

“We know the monkeys are making predictions, but we don’t know how good they are,” says Daniels. “But given this data, we found that the most memory it would take to figure out the regularities is about 1,000 bits of information.”
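
How such an estimate might be computed is easier to see in miniature. The sketch below is not the authors’ analysis, model, or data; it is a hedged illustration in Python, with entirely synthetic fight records, of the two ideas the article describes: representing bouts as sparse combinations of a few cluster patterns (via off-the-shelf dictionary learning) and shrinking the amount of remembered structure until predictions degrade.

```python
# Illustrative sketch only -- not the PNAS analysis. Binary "who joined
# this fight" vectors are encoded as sparse combinations of learned
# cluster patterns; shrinking the dictionary mimics reducing memory.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
n_individuals, n_fights, n_true_clusters = 48, 400, 6

# Synthetic ground truth: each fight activates one or two "alliance" clusters.
clusters = rng.random((n_true_clusters, n_individuals)) < 0.15
fights = np.zeros((n_fights, n_individuals))
for f in range(n_fights):
    active = rng.choice(n_true_clusters, size=rng.integers(1, 3), replace=False)
    fights[f] = np.clip(clusters[active].sum(axis=0), 0, 1)

def reconstruction_error(n_components):
    """Fit a sparse code of a given size and return the mean squared error."""
    model = MiniBatchDictionaryLearning(
        n_components=n_components,
        transform_algorithm="omp",
        transform_n_nonzero_coefs=2,
        random_state=0,
    )
    codes = model.fit_transform(fights)
    return np.mean((fights - codes @ model.components_) ** 2)

# Remove "memory" (dictionary size) until predictions get worse.
for k in (12, 8, 6, 4, 2):
    print(f"{k:2d} remembered patterns -> error {reconstruction_error(k):.3f}")
```

The error should stay low until the dictionary shrinks below the number of real clusters, which is the spirit of the “minimum amount of stuff to remember” question; the paper’s actual models and its 1,000-bit figure come from a far more careful information-theoretic treatment.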

Sparse coding appears to be a strong candidate for explaining the mechanism at play in the monkey group, but the team points out that it is only one possible way to encode conflict.

Because the statistical modeling and computation frameworks can be applied to different natural datasets, the research has the potential to influence other fields of study, including behavioral science, cognition, computation, game theory and machine learning. Such models might also be useful in studying collective behaviors in other complex systems, ranging from neurons to bird flocks.

Future research will seek to determine how individuals’ knowledge of alliances and feuds fine-tunes their own decisions and changes the group’s collective pattern of conflict.

The research was supported by the National Science Foundation, the John Templeton Foundation through the Santa Fe Institute, and UW-Madison.

Why Are People Overconfident So Often? It’s All About Social Status (Science Daily)

ScienceDaily (Aug. 13, 2012) — Researchers have long known that people are very frequently overconfident — that they tend to believe they are more physically talented, socially adept, and skilled at their job than they actually are. For example, 94% of college professors think they do above-average work (which is nearly impossible, statistically speaking). But this overconfidence can also have detrimental effects on their performance and decision-making. So why, in light of these negative consequences, is overconfidence still so pervasive?

The lure of social status promotes overconfidence, explains Haas School Associate Professor Cameron Anderson. He co-authored a new study, “A Status-Enhancement Account of Overconfidence,” with Sebastien Brion, assistant professor of managing people in organizations at the IESE Business School, University of Navarra, and with Haas School colleagues Don Moore, associate professor of management, and Jessica A. Kennedy, now a post-doctoral fellow at the Wharton School of Business. The study is forthcoming in the Journal of Personality and Social Psychology.

“Our studies found that overconfidence helped people attain social status. People who believed they were better than others, even when they weren’t, were given a higher place in the social ladder. And the motive to attain higher social status thus spurred overconfidence,” says Anderson, the Lorraine Tyson Mitchell Chair in Leadership and Communication II at the Haas School.

Social status is the respect, prominence, and influence individuals enjoy in the eyes of others. Within work groups, for example, higher-status individuals tend to be more admired and listened to, and to have more sway over the group’s discussions and decisions. These “alphas” of the group have more clout and prestige than other members. Anderson says these research findings are important because they help shed light on a longstanding puzzle: why overconfidence is so common, in spite of its risks. His findings suggest that falsely believing one is better than others has profound social benefits for the individual.

Moreover, these findings suggest one reason why in organizational settings, incompetent people are so often promoted over their more competent peers. “In organizations, people are very easily swayed by others’ confidence even when that confidence is unjustified,” says Anderson. “Displays of confidence are given an inordinate amount of weight.”

The studies suggest that organizations would benefit from taking individuals’ confidence with a grain of salt. Yes, confidence can be a sign of a person’s actual abilities, but it is often not a very good sign. Many individuals are confident in their abilities even though they lack true skills or competence.

The authors conducted six experiments to examine why people become overconfident and how overconfidence translates into higher social status. For example:

In Study 2, the researchers examined 242 MBA students in their project teams and asked them to look over a list of historical names, historical events, and books and poems, and then to identify which ones they knew or recognized. Terms included Maximilien Robespierre, Lusitania, Wounded Knee, Pygmalion, and Doctor Faustus. Unbeknownst to the participants, some of the names were made up. These so-called “foils” included Bonnie Prince Lorenzo, Queen Shaddock, Galileo Lovano, Murphy’s Last Ride, and Windemere Wild. The researchers deemed those who picked the most foils the most overly confident because they believed they were more knowledgeable than they actually were. In a survey at the end of the semester, those same overly confident individuals (who said they had recognized the most foils) achieved the highest social status within their groups.
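
The scoring behind that finding is simple enough to sketch. The snippet below is a hypothetical illustration, not the study’s materials or data: it treats the share of made-up “foil” items a respondent claims to recognize as an overconfidence score and correlates it with fabricated peer status ratings.

```python
# Hypothetical scoring sketch for the foil procedure described above.
# All numbers and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_students, n_foils = 242, 10

# 1 = "I recognize this item"; each row is one student's answers on the
# nonexistent "foil" items only (endorsing them is the overclaiming signal).
foil_claims = rng.random((n_students, n_foils)) < rng.uniform(0.0, 0.5, (n_students, 1))

overclaiming = foil_claims.mean(axis=1)        # share of fake items "recognized"
# Fabricated end-of-semester peer status ratings, loosely tied to overclaiming.
status = 3 + 2 * overclaiming + rng.normal(0, 0.5, n_students)

r = np.corrcoef(overclaiming, status)[0, 1]
print(f"foil endorsement vs. peer-rated status: r = {r:.2f}")
```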

It is important to note that group members did not think of their high status peers as overconfident, but simply that they were terrific. “This overconfidence did not come across as narcissistic,” explains Anderson. “The most overconfident people were considered the most beloved.”

Study 4 sought to discover the types of behaviors that make overconfident people appear so wonderful (even when they were not). Behaviors such as body language, vocal tone, and rate of participation were captured on video as groups worked together in a laboratory setting. These videos revealed that overconfident individuals spoke more often, spoke with a confident vocal tone, provided more information and answers, and appeared calm and relaxed as they worked with their peers. In fact, overconfident individuals were more convincing in their displays of ability than individuals who were actually highly competent.

“These big participators were not obnoxious, they didn’t say, ‘I’m really good at this.’ Instead, their behavior was much more subtle. They simply participated more and exhibited more comfort with the task — even though they were no more competent than anyone else,” says Anderson.

Two final studies found that it is the “desire” for status that encourages people to be more overconfident. For example, in Study 6, participants read one of two stories and were asked to imagine themselves as the protagonist in the story. The first story was a simple, bland narrative of losing then finding one’s keys. The second story asked the reader to imagine him/herself getting a new job with a prestigious company. The job had many opportunities to obtain higher status, including a promotion, a bonus, and a fast track to the top. Those participants who read the new job scenario rated their desire for status much higher than those who read the story of the lost keys.

After they were finished reading, participants were asked to rate themselves on a number of competencies such as critical thinking skills, intelligence, and the ability to work in teams. Those who had read the new job story (which stimulated their desire for status) rated their skills and talent much higher than did the first group. Their desire for status amplified their overconfidence.

De-emphasizing the natural tendency toward overconfidence may prove difficult, but Prof. Anderson hopes this research will give people an incentive to look for more objective indices of ability and merit in others, instead of overvaluing unsubstantiated confidence.

How Do They Do It? Predictions Are in for Arctic Sea Ice Low Point (Science Daily)

ScienceDaily (Aug. 14, 2012) — It’s become a sport of sorts, predicting the low point of Arctic sea ice each year. Expert scientists with decades of experience do it but so do enthusiasts, whose guesses are gamely included in a monthly predictions roundup collected by Sea Ice Outlook, an effort supported by the U.S. government.

Arctic sea ice, as seen from an ice breaker. (Credit: Bonnie Light, UW)

When averaged, the predictions have come in remarkably close to the mark in the past two years. But the low and high predictions are off by hundreds of thousands of square kilometers.

Researchers are working hard to improve their ability to more accurately predict how much Arctic sea ice will remain at the end of summer. It’s an important exercise because knowing why sea ice declines could help scientists better understand climate change and how sea ice is evolving.

This year, researchers from the University of Washington’s Polar Science Center are the first to include new NASA sea ice thickness data collected by airplane in a prediction.

They expect 4.4 million square kilometers of remaining ice (about 1.7 million square miles), just barely more than the 4.3 million square kilometers in 2007, the lowest year on record for Arctic sea ice. The median of 23 predictions collected by the Sea Ice Outlook and released on Aug. 13 is 4.3 million square kilometers.

“One drawback to making predictions is historically we’ve had very little information about the thickness of the ice in the current year,” said Ron Lindsay, a climatologist at the Polar Science Center, a department in the UW’s Applied Physics Laboratory.

To make their prediction, Lindsay and Jinlun Zhang, an oceanographer in the Polar Science Center, start with a widely used model pioneered by Zhang and known as the Pan-Arctic Ice Ocean Modeling and Assimilation System. That system combines available observations with a model to track sea ice volume, which includes both ice thickness and extent.

But obtaining observations about current-year ice thickness in order to build their short-term prediction is tough. NASA is currently in the process of designing a new satellite that will replace one that used to deliver ice thickness data but has since failed. In the meantime, NASA is running a program called Operation IceBridge that uses airplanes to survey sea ice as well as Arctic ice sheets.

“This is the first year they made a concerted effort to get the data from the aircraft, process it and get it into the hands of scientists in a timely manner,” Lindsay said. “In the past, we’ve gotten data from submarines, moorings or satellites, but none of that data was available in a timely manner. It took months or even years.”

There’s a shortcoming to the IceBridge data, however: it’s only available through March. The radar used to measure snow depth on the surface of the ice, an important element in the observation system, has trouble gauging the depth accurately once the snow has begun to melt, so the data are collected only through early spring, before the thaw.

The UW scientists have developed a method for informing their prediction that is starting to be used by others. Researchers have struggled with how best to forecast the weather in the Arctic, which affects ice melt and distribution.

“Jinlun came up with the idea of using the last seven summers. Because the climate is changing so fast, only the recent summers are probably relevant,” Lindsay said.

The result is seven different possibilities of what might happen. “The average of those is our best guess,” Lindsay said.
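
In outline, that ensemble trick is easy to picture. The toy sketch below is not PIOMAS or real forcing data; every number is an invented placeholder. It only shows the arithmetic the researchers describe: run the same starting state forward under each of the last seven summers’ weather and take the average as the best guess.

```python
# Toy version of the seven-summer ensemble described above; all values
# are hypothetical placeholders, not real ice or weather data.
import numpy as np

june_extent = 11.0  # hypothetical starting ice extent, million square km

# Hypothetical summer melt (million sq km) the model would produce under
# each of the last seven summers' weather.
melt_by_summer = np.array([6.4, 6.8, 6.9, 6.5, 7.1, 6.6, 6.7])

projections = june_extent - melt_by_summer       # one September projection per summer
best_guess = projections.mean()
spread = projections.std(ddof=1)

print("September projections:", np.round(projections, 2))
print(f"best guess: {best_guess:.2f} +/- {spread:.2f} million square km")
```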

Despite the progress in making predictions, the researchers say their ability to foretell the future will always be limited. Because they can’t forecast the weather very far in advance, and because the ice is strongly affected by winds, they have little confidence in predictions made far ahead of time beyond what the long-term trend suggests.

“The accuracy of our prediction really depends on time,” Zhang said. “Our June 1 prediction for the Sept. 15 low point has high uncertainty but as we approach the end of June or July, the uncertainty goes down and the accuracy goes up.”

In hindsight, that’s been true historically for the average predictions collected by the Study of Environmental Arctic Change’s Sea Ice Outlook, a project funded by the National Science Foundation and the National Oceanic and Atmospheric Administration.

While the competitive aspect of the predictions is fun, the researchers aren’t in it to win it.

“Essentially it’s not for prediction but for understanding,” Zhang said. “We do it to improve our understanding of sea ice processes, in terms of how dynamic processes affect the seasonal evolution of sea ice.”

That may not be entirely the same for the enthusiasts who contribute a prediction. One climate blog polls readers in the summer for their best estimate of the sea ice low point. It’s included among the predictions collected by the Sea Ice Outlook, with an asterisk noting it as a “public outlook.”

The National Science Foundation and NASA fund the UW research into the Arctic sea ice low point.

New legislation will give natural disaster prevention a scientific basis, experts say (Fapesp)

A law signed in April will require municipalities to produce geotechnical charts, a multidisciplinary instrument that will guide the rollout of warning systems and master plans (Valter Campanato/ABr)

08/08/2012

By Fábio de Castro

Agência FAPESP – In January 2011, floods and landslides left roughly a thousand people dead and 500 missing in the Região Serrana (mountain region) of Rio de Janeiro state. The tragedy exposed the precariousness of Brazil’s warning systems and was regarded by experts as definitive proof that the country needed to invest in disaster prevention.

The most important outgrowth of that assessment was Law 12,608, signed in April, which establishes the National Policy for Protection and Civil Defense and creates a disaster information and monitoring system, according to experts gathered at the seminar “Caminhos da política nacional de defesa de áreas de risco” (Paths of the national policy for the defense of risk areas), held by the Polytechnic School of the University of São Paulo (USP) on August 6.

The new law requires municipal governments to invest in urban planning aimed at preventing disasters such as floods and landslides. According to the experts, disaster prevention can now, for the first time, rest on a solid technical and scientific foundation, since the law stipulates that, in order to do this planning, every municipal government will have to produce a geotechnical chart of its territory.

Katia Canil, a researcher at the Environmental Risk Laboratory of the Institute for Technological Research (IPT), said that municipal governments will have two years to produce the geotechnical charts underpinning their master plans, which must include disaster prevention and mitigation measures. Municipalities that fail to present this planning will not receive federal funds for prevention and mitigation works.

“Geotechnical charts are cartographic documents that bring together information on a municipality’s geological and geomorphological characteristics, identifying geological hazards and making it easier to set rules for urban occupation. With this instrument made mandatory by the law, we will be able to design disaster prevention strategies based on technical and scientific knowledge,” Canil told Agência FAPESP.

Brazil’s first geotechnical chart was produced in 1979 for the municipality of Santos (SP), yet the instrument has remained little used in the country. According to Canil, institutionalizing the tool will be an important step toward aligning master plans with the geotechnical characteristics of the terrain.

“Few municipalities have a geotechnical chart, because it was not a mandatory instrument. That picture should now change. But the legislation will create a large demand for specialists in several fields, because geotechnical charts integrate a wide range of interdisciplinary data,” the IPT researcher said.

Geotechnical charts bring together documents produced from geological and geotechnical field surveys, along with laboratory analyses, with the goal of synthesizing all available knowledge about the physical environment and its relationship to the geological and human processes at work in the area. “And all of this needs to be expressed in language that managers can understand,” Canil said.

Cities will have to organize themselves to produce geotechnical charts, and the technical capacity required is not trivial. “It is not just a matter of overlaying maps. You need experience combined with training in fields such as geology, engineering, geotechnical engineering, cartography, geography, architecture and urban planning,” Canil said. IPT already offers a training course on producing geotechnical charts.

A major obstacle to producing the charts will be the lack of basic geological mapping in Brazilian municipalities. “Most municipalities have no primary data, such as geomorphological, pedological and geological maps,” Canil said.

National prevention plan

The January 2011 tragedy in Rio de Janeiro’s Região Serrana was a milestone that changed the course of the debate on disasters, definitively placing prevention at its center, according to Carlos Nobre, Secretary of Research and Development Policies and Programs at the Ministry of Science, Technology and Innovation (MCTI).

“That episode was a jolt that shook Brazil’s perception of major disasters. It became obvious to managers and to the public that the emphasis must be on prevention. It was a milestone that changed our perspective forever: prevention is fundamental,” he said during the event.

According to Nobre, who is also a researcher at the National Institute for Space Research (Inpe) and a member of the steering committee of the FAPESP Research Program on Global Climate Change, international experience shows that prevention can cut the number of fatalities in natural disasters by up to 90% and reduce material damage by about 35%. “Besides saving lives, the savings in material losses more than pay for all the investment in prevention,” he said.

According to Nobre, engineering will play an increasingly important role in prevention as natural disasters become more extreme as a consequence of climate change.

“The 21st-century engineer will need to be trained in sustainability engineering, a cross-cutting field that will gain more and more ground. Engineering, if well conducted, is central to solving some of today’s main problems,” he said.

According to Nobre, besides the new legislation, which will require planning based on municipal geotechnical charts, Brazil has several other disaster prevention initiatives. One of them will be announced this Wednesday (August 8): the National Plan for Natural Disaster Prevention, which emphasizes works aimed at installing warning systems.

“There are large-scale works needed in Brazil, especially when it comes to warning systems. One of the important elements of the new plan is early warning. International experience shows that a warning issued up to two hours before a landslide can save lives,” he said.

According to Nobre, the plan’s initiatives will be consistent with the new legislation. The federal government is expected to invest R$ 4.6 billion over the coming months in disaster prevention initiatives in the states of Rio de Janeiro, Minas Gerais and Santa Catarina.

But to apply for federal funds, a municipality will have to meet a series of requirements, such as incorporating protection and civil defense actions into municipal planning, identifying and mapping areas at risk of natural disasters, preventing new occupation of those areas, and inspecting the buildings already in them.

According to Nobre, another disaster prevention measure was the creation of the National Center for Natural Disaster Monitoring and Early Warning (Cemaden), under the MCTI, which began operating in December 2011 on the Inpe campus in Cachoeira Paulista (SP).

“That center already played an important role in weather forecasting, but it was restructured and hired 35 professionals. Cemaden is emerging as an emblem of the new warning systems: a design that brings together geologists, meteorologists and natural disaster specialists to identify vulnerabilities, something rare anywhere in the world,” he said.

According to him, the new structure already has a warning system up and running. “It is a system that will still need to be evaluated over time. But since December 2011, more than 100 warnings have already been issued. It will take the country several years to reduce fatalities to the levels of countries with good prevention systems. But we are on the right track,” Nobre said.

Heatwave turns America’s waterways into rivers of death (The Independent)

Falling water levels are killing fish and harming exports

DAVID USBORNE

SUNDAY 05 AUGUST 2012

The cruel summer heatwave that continues to scorch agricultural crops across much of the United States, prompting comparisons with the severe droughts of the 1930s and 1950s, is also leading to record-breaking water temperatures in rivers and streams, including the Mississippi, as well as fast-falling navigation levels.

In the northern reaches of the Mississippi, near Moline in Illinois, the water temperature touched 90 degrees last week – warmer than the Gulf of Mexico around the Florida Keys – while towards the river’s southern reaches the US Army Corps of Engineers is dredging around the clock to keep barges from grounding as water levels dive.

For scientists the impact of a long, hot summer that has plunged more than two-thirds of the country into drought conditions – sometimes extreme – has been particularly striking in the Great Lakes. According to the Great Lakes Environmental Research Laboratory, all are experiencing unusual spikes in water temperature this year. It is especially the case for Lake Superior, the northernmost, the deepest, and therefore the coolest.

“It’s pretty safe to say that what we’re seeing here is the warmest that we’ve seen in Lake Superior in a century,” said Jay Austin, a professor at the University of Minnesota at Duluth. The average temperature recorded for the lake last week was 68F (20C). That compares with 56F (13C) at this time last year.

It is a boon to shoreline residents who are finding normally chilly waters suddenly inviting for a dip. But the warming of the rivers, in particular, is taking a harsh toll on fish, which are dying in increasingly large numbers. Significant tolls of fresh-water species, from pike to trout, have been reported, most frequently in the Midwest.

“Most problems occur in ponds that are not deep enough for fish to retreat to cooler and more oxygen-rich water,” said Jake Allman of the Missouri Department of Conservation. “Hot water holds less oxygen than cool water. Shallow ponds get warmer than deeper ponds, and with little rain, area ponds are becoming shallower by the day. Evaporation rates are up to 11 inches per month in these conditions.”

In some instances, fish are simply left high and dry as rivers dry up entirely. That is the case of the normally rushing Platte River, which has simply petered out over a 100-mile stretch in Nebraska, large parts of which are now federal disaster areas contending with so-called “exceptional drought” conditions.

“This is the worst I’ve ever seen it, and I’ve been on the river since I was a pup,” Dan Kneifel, owner of Geno’s Bait and Tackle Shop, told TheOmahaChannel.com. “The river was full of fish, and to see them all die is a travesty.”

As water levels in the Mississippi ebb, so barge operators are forced to offload cargo to keep their vessels moving. About 60 per cent of exported US corn is conveyed by the Mississippi, which is now 12ft below normal levels in some stretches. Navigation on the Mississippi has not been so severely threatened since the 1988 drought in the US. Few forget, meanwhile, that last summer towns up and down the Mississippi were battling flooding.

One welcome side-effect, however, is data showing that the so-called “dead zone” in the Gulf of Mexico around the Mississippi estuary is far less extensive this summer, because the lack of rain and the slow running of the water have led to much less nitrate being washed off farmland and into the system than in normal years. The phenomenon occurs because the nitrates feed blooms of algae in Gulf waters, which then decompose, stripping the water of oxygen.

Chronic 2000-04 drought, worst in 800 years, may be the ‘new normal’ (Oregon State Univ)

Public release date: 29-Jul-2012

By Beverly Law

Oregon State University

CORVALLIS, Ore. – The chronic drought that hit western North America from 2000 to 2004 left dying forests and depleted river basins in its wake and was the strongest in 800 years, scientists have concluded, but they say those conditions will become the “new normal” for most of the coming century.

Such climatic extremes have increased as a result of global warming, a group of 10 researchers reported today in Nature Geoscience. And as bad as conditions were during the 2000-04 drought, they may eventually be seen as the good old days.

Climate models and precipitation projections indicate this period will actually be closer to the “wet end” of a drier hydroclimate during the last half of the 21st century, scientists said.

Aside from its impact on forests, crops, rivers and water tables, the drought also cut carbon sequestration by an average of 51 percent in a massive region of the western United States, Canada and Mexico, although some areas were hit much harder than others. As vegetation withered, this released more carbon dioxide into the atmosphere, with the effect of amplifying global warming.

“Climatic extremes such as this will cause more large-scale droughts and forest mortality, and the ability of vegetation to sequester carbon is going to decline,” said Beverly Law, a co-author of the study, professor of global change biology and terrestrial systems science at Oregon State University, and former science director of AmeriFlux, an ecosystem observation network.

“During this drought, carbon sequestration from this region was reduced by half,” Law said. “That’s a huge drop. And if global carbon emissions don’t come down, the future will be even worse.”

This research was supported by the National Science Foundation, NASA, U.S. Department of Energy, and other agencies. The lead author was Christopher Schwalm at Northern Arizona University. Other collaborators were from the University of Colorado, University of California at Berkeley, University of British Columbia, San Diego State University, and other institutions.

It’s not clear whether or not the current drought in the Midwest, now being called one of the worst since the Dust Bowl, is related to these same forces, Law said. This study did not address that, and there are some climate mechanisms in western North America that affect that region more than other parts of the country.

But in the West, this multi-year drought was unlike anything seen in many centuries, based on tree ring data. The last two periods with drought events of similar severity were in the Middle Ages, from 977-981 and 1146-1151. The 2000-04 drought affected precipitation, soil moisture, river levels, crops, forests and grasslands.

Ordinarily, Law said, the land sink in North America is able to sequester the equivalent of about 30 percent of the carbon emitted into the atmosphere by the use of fossil fuels in the same region. However, based on projected changes in precipitation and drought severity, scientists said that this carbon sink, at least in western North America, could disappear by the end of the century.

“Areas that are already dry in the West are expected to get drier,” Law said. “We expect more extremes. And it’s these extreme periods that can really cause ecosystem damage, lead to climate-induced mortality of forests, and may cause some areas to convert from forest into shrublands or grassland.”

During the 2000-04 drought, runoff in the upper Colorado River basin was cut in half. Crop productivity in much of the West fell 5 percent. The productivity of forests and grasslands declined, along with snowpacks. Evapotranspiration decreased the most in evergreen needleleaf forests, about 33 percent.

The effects are driven by human-caused increases in temperature, with associated lower soil moisture and decreased runoff in all major water basins of the western U.S., researchers said in the study.

Although regional precipitation patterns are difficult to forecast, the researchers report that climate models are underestimating the extent and severity of drought compared with actual observations. They say the situation will continue to worsen, and that 80 of the 95 years from 2006 to 2100 will have precipitation levels as low as, or lower than, this “turn of the century” drought of 2000-04.

“Towards the latter half of the 21st century the precipitation regime associated with the turn of the century drought will represent an outlier of extreme wetness,” the scientists wrote in this study.

These long-term trends are consistent with a 21st century “megadrought,” they said.

Need an Expert? Try the Crowd (Science Daily)

ScienceDaily (Aug. 14, 2012) — “It’s potentially a new way to do science.”

In 1714, the British government held a contest. They offered a large cash prize to anyone who could solve the vexing “longitude problem” — how to determine a ship’s east/west position on the open ocean — since none of their naval experts had been able to do so.

Lots of people gave it a try. One of them, a self-educated carpenter named John Harrison, invented the marine chronometer — a rugged and highly precise clock — that did the trick. For the first time, sailors could accurately determine their location at sea.

A centuries-old problem was solved. And, arguably, crowdsourcing was born.

Crowdsourcing is basically what it sounds like: posing a question or asking for help from a large group of people. Coined as a term in 2006, crowdsourcing has taken off in the internet era. Think of Wikipedia, and its thousands of unpaid contributors, now vastly larger than the Encyclopedia Britannica.

Crowdsourcing has allowed many problems to be solved that would be impossible for experts alone. Astronomers rely on an army of volunteers to scan for new galaxies. At climateprediction.net, citizens have linked their home computers to yield more than a hundred million hours of climate modeling; it’s the world’s largest forecasting experiment.

But what if experts didn’t simply ask the crowd to donate time or answer questions? What if the crowd was asked to decide what questions to ask in the first place?

Could the crowd itself be the expert?

That’s what a team at the University of Vermont decided to explore — and the answer seems to be yes.

Prediction from the people

Josh Bongard and Paul Hines, professors in UVM’s College of Engineering and Mathematical Sciences, and their students set out to discover whether volunteers who visited two different websites could pose, refine, and answer each other’s questions — questions that could effectively predict the volunteers’ body weight and home electricity use.

The experiment, the first of its kind, was a success: the self-directed questions and answers by visitors to the websites led to computer models that effectively predict users’ monthly electricity consumption and body mass index.

Their results, “Crowdsourcing Predictors of Behavioral Outcomes,” were published in a recent edition of IEEE Transactions on Systems, Man, and Cybernetics, a journal of the Institute of Electrical and Electronics Engineers.

“It’s proof of concept that a crowd actually can come up with good questions that lead to good hypotheses,” says Bongard, an expert on machine science.

In other words, the wisdom of the crowd can be harnessed to determine which variables to study, the UVM project shows — and at the same time provide a pool of data by responding to the questions they ask of each other.

“The result is a crowdsourced predictive model,” the Vermont scientists write.
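
What a “crowdsourced predictive model” means mechanically can be sketched in a few lines. The example below is a hedged stand-in, not the UVM team’s code or data: crowd-posed questions become candidate predictors, responses become training data, and a sparse regression ranks which questions carry predictive power. All question labels and numbers are fabricated.

```python
# Conceptual sketch of a crowdsourced predictive model (fabricated data).
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
questions = ["thinks of self as overweight", "meals eaten out per week",
             "hours of sleep", "walks to work", "sodas per day"]
n_respondents = 300

X = rng.normal(size=(n_respondents, len(questions)))   # standardized answers
true_w = np.array([4.0, 1.5, 0.0, -1.0, 0.8])          # hidden "ground truth"
y = 27 + X @ true_w + rng.normal(0, 2, n_respondents)  # outcome, e.g. body mass index

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LassoCV(cv=5).fit(X_tr, y_tr)

print(f"held-out R^2: {model.score(X_te, y_te):.2f}")
# Rank the crowd's questions by how strongly they predict the outcome.
for q, w in sorted(zip(questions, model.coef_), key=lambda t: -abs(t[1])):
    print(f"{q:30s} weight {w:+.2f}")
```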

Unexpected angles

Some of the questions the volunteers posed were obvious. For example, on the website dedicated to exploring body weight, visitors came up with the question: “Do you think of yourself as overweight?” And, no surprise, that proved to be the question with the most power to predict people’s body weight.

But some questions posed by the volunteers were less obvious. “We had some eye-openers,” Bongard says. “How often do you masturbate a month?” might not be the first question asked by weight-loss experts, but it proved to be the second-most-predictive question of the volunteer’s self-reported weights — more predictive than “how often do you eat during a day?”

“Sometimes the general public has intuition about stuff that experts miss — there’s a long literature on this,” Hines says.

“It’s those people who are very underweight or very overweight who might have an explanation for why they’re at these extremes — and some of those explanations might not be a simple combination of diet and exercise,” says Bongard. “There might be other things that experts missed.”

Cause and correlation

The researchers are quick to note that the variables revealed by the evolving Q&A on the experimental websites are simply correlated to outcomes — body weight and electricity use — not necessarily the cause.

“We’re not arguing that this study is actually predictive of the causes,” says Hines, “but improvements to this method may lead in that direction.”

Nor do the scientists claim to be experts on body weight or to be providing recommendations on health or diet (though Hines is an expert on electricity, and the EnergyMinder site he and his students developed for this project has the larger aim of helping citizens understand and reduce their household energy use).

“We’re simply investigating the question: could you involve participants in the hypothesis-generation part of the scientific process?” Bongard says. “Our paper is a demonstration of this methodology.”

“Going forward, this approach may allow us to involve the public in deciding what it is that is interesting to study,” says Hines. “It’s potentially a new way to do science.”

And there are many reasons why this new approach might be helpful. In addition to forces that experts might simply not know about — “can we elicit unexpected predictors that an expert would not have come up with sitting in his office?” Hines asks — experts often have deeply held biases.

Faster discoveries

But the UVM team primarily sees their new approach as potentially helping to accelerate the process of scientific discovery. The need for expert involvement — in shaping, say, what questions to ask on a survey or what variable to change to optimize an engineering design — “can become a bottleneck to new insights,” the scientists write.

“We’re looking for an experimental platform where, instead of waiting to read a journal article every year about what’s been learned about obesity,” Bongard says, “a research site could be changing and updating new findings constantly as people add their questions and insights.”

The goal: “exponential rises,” the UVM scientists write, in the discovery of what causes behaviors and patterns — probably driven by the people who care about them the most. For example, “it might be smokers or people suffering from various diseases,” says Bongard. The team thinks this new approach to science could “mirror the exponential growth found in other online collaborative communities,” they write.

“We’re all problem-solving animals,” says Bongard, “so can we exploit that? Instead of just exploiting the cycles of your computer or your ability to say ‘yes’ or ‘no’ on a survey — can we exploit your creative brain?”

Global Warming’s Terrifying New Math (Rolling Stone)

Three simple numbers that add up to global catastrophe – and that make clear who the real enemy is

by: Bill McKibben

Illustration by Edel Rodriguez

If the pictures of those towering wildfires in Colorado haven’t convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average, the odds of which occurring by simple chance were about one in 3.7 × 10⁹⁹, a number considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the “largest temperature departure from average of any season on record.” The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet’s history.

Not that our leaders seemed to notice. Last month the world’s nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn’t even attend. It was “a ghost of the glad, confident meeting 20 years ago,” the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls “once thronged by multitudes.” Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I’ve spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we’re losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world’s nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the “most important gathering since the Second World War, given what is at stake.” As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: “This is our chance. If we miss it, it could take years before we get a new and better one. If ever.”

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving “Copenhagen Accord” that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. “Copenhagen is a crime scene tonight,” an angry Greenpeace official declared, “with the guilty men and women fleeing to the airport.” Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized “the scientific view that the increase in global temperature should be below two degrees Celsius.” And in the very next paragraph, it declared that “we agree that deep cuts in global emissions are required… so as to hold the increase in global temperature below two degrees Celsius.” By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8, and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we’ve raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. “Any number much above one degree involves a gamble,” writes Kerry Emanuel of MIT, a leading authority on hurricanes, “and the odds become less and less favorable as the temperature goes up.” Thomas Lovejoy, once the World Bank’s chief biodiversity adviser, puts it like this: “If we’re seeing what we’re seeing today at 0.8 degrees Celsius, two degrees is simply too much.” NASA scientist James Hansen, the planet’s most prominent climatologist, is even blunter: “The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster.” At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: “Some countries will flat-out disappear.” When delegates from developing nations were warned that two degrees would represent a “suicide pact” for drought-stricken Africa, many of them started chanting, “One degree, one Africa.”

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it’s fair to say that it’s the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world’s carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can’t raise the temperature more than two degrees Celsius – it’s become the bottomest of bottom lines. Two degrees.

The Second Number: 565 Gigatons

Scientists estimate that humans can pour roughly 565 more gigatons of carbon dioxide into the atmosphere by midcentury and still have some reasonable hope of staying below two degrees. (“Reasonable,” in this case, means four chances in five, or somewhat worse odds than playing Russian roulette with a six-shooter.)

This idea of a global “carbon budget” emerged about a decade ago, as scientists began to calculate how much oil, coal and gas could still safely be burned. Since we’ve increased the Earth’s temperature by 0.8 degrees so far, we’re currently less than halfway to the target. But, in fact, computer models calculate that even if we stopped increasing CO2 now, the temperature would likely still rise another 0.8 degrees, as previously released carbon continues to overheat the atmosphere. That means we’re already three-quarters of the way to the two-degree target.

How good are these numbers? No one is insisting that they’re exact, but few dispute that they’re generally right. The 565-gigaton figure was derived from one of the most sophisticated computer-simulation models that have been built by climate scientists around the world over the past few decades. And the number is being further confirmed by the latest climate-simulation models currently being finalized in advance of the next report by the Intergovernmental Panel on Climate Change. “Looking at them as they come in, they hardly differ at all,” says Tom Wigley, an Australian climatologist at the National Center for Atmospheric Research. “There’s maybe 40 models in the data set now, compared with 20 before. But so far the numbers are pretty much the same. We’re just fine-tuning things. I don’t think much has changed over the last decade.” William Collins, a senior climate scientist at the Lawrence Berkeley National Laboratory, agrees. “I think the results of this round of simulations will be quite similar,” he says. “We’re not getting any free lunch from additional understanding of the climate system.”

We’re not getting any free lunch from the world’s economies, either. With only a single year’s lull in 2009 at the height of the financial crisis, we’ve continued to pour record amounts of carbon into the atmosphere, year after year. In late May, the International Energy Agency published its latest figures – CO2 emissions last year rose to 31.6 gigatons, up 3.2 percent from the year before. America had a warm winter and converted more coal-fired power plants to natural gas, so its emissions fell slightly; China kept booming, so its carbon output (which recently surpassed the U.S.) rose 9.3 percent; the Japanese shut down their fleet of nukes post-Fukushima, so their emissions edged up 2.4 percent. “There have been efforts to use more renewable energy and improve energy efficiency,” said Corinne Le Quéré, who runs England’s Tyndall Centre for Climate Change Research. “But what this shows is that so far the effects have been marginal.” In fact, study after study predicts that carbon emissions will keep growing by roughly three percent a year – and at that rate, we’ll blow through our 565-gigaton allowance in 16 years, around the time today’s preschoolers will be graduating from high school. “The new data provide further evidence that the door to a two-degree trajectory is about to close,” said Fatih Birol, the IEA’s chief economist. In fact, he continued, “When I look at this data, the trend is perfectly in line with a temperature increase of about six degrees.” That’s almost 11 degrees Fahrenheit, which would create a planet straight out of science fiction.
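
The 16-year figure is straightforward compounding, and the rough arithmetic can be checked in a few lines. The sketch below is a back-of-the-envelope check, not the IEA’s or anyone’s official projection; depending on the baseline year chosen, it lands at roughly 15 to 16 years.

```python
# Back-of-the-envelope check: how long does ~3% annual emissions growth
# take to use up the 565-gigaton allowance, starting near 31.6 Gt/year?
annual = 31.6        # gigatons of CO2 emitted in the latest year (article figure)
growth = 0.03        # assumed annual growth rate
budget = 565.0       # remaining allowance for a reasonable shot at 2 degrees C

cumulative, years = 0.0, 0
while cumulative < budget:
    years += 1
    annual *= 1 + growth
    cumulative += annual

print(f"~{years} years of ~3% growth exhaust the {budget:.0f}-gigaton allowance")
```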

So, new data in hand, everyone at the Rio conference renewed their ritual calls for serious international action to move us back to a two-degree trajectory. The charade will continue in November, when the next Conference of the Parties (COP) of the U.N. Framework Convention on Climate Change convenes in Qatar. This will be COP 18 – COP 1 was held in Berlin in 1995, and since then the process has accomplished essentially nothing. Even scientists, who are notoriously reluctant to speak out, are slowly overcoming their natural preference to simply provide data. “The message has been consistent for close to 30 years now,” Collins says with a wry laugh, “and we have the instrumentation and the computer power required to present the evidence in detail. If we choose to continue on our present course of action, it should be done with a full evaluation of the evidence the scientific community has presented.” He pauses, suddenly conscious of being on the record. “I should say, a fuller evaluation of the evidence.”

So far, though, such calls have had little effect. We’re in the same position we’ve been in for a quarter-century: scientific warning followed by political inaction. Among scientists speaking off the record, disgusted candor is the rule. One senior scientist told me, “You know those new cigarette packs, where governments make them put a picture of someone with a hole in their throats? Gas pumps should have something like that.”

The Third Number: 2,795 Gigatons

This number is the scariest of all – one that, for the first time, meshes the political and scientific dimensions of our dilemma. It was highlighted last summer by the Carbon Tracker Initiative, a team of London financial analysts and environmentalists who published a report in an effort to educate investors about the possible risks that climate change poses to their stock portfolios. The number describes the amount of carbon already contained in the proven coal and oil and gas reserves of the fossil-fuel companies, and the countries (think Venezuela or Kuwait) that act like fossil-fuel companies. In short, it’s the fossil fuel we’re currently planning to burn. And the key point is that this new number – 2,795 – is higher than 565. Five times higher.

The Carbon Tracker Initiative – led by James Leaton, an environmentalist who served as an adviser at the accounting giant PricewaterhouseCoopers – combed through proprietary databases to figure out how much oil, gas and coal the world’s major energy companies hold in reserve. The numbers aren’t perfect – they don’t fully reflect the recent surge in unconventional energy sources like shale gas, and they don’t accurately reflect coal reserves, which are subject to less stringent reporting requirements than oil and gas. But for the biggest companies, the figures are quite exact: If you burned everything in the inventories of Russia’s Lukoil and America’s ExxonMobil, for instance, which lead the list of oil and gas companies, each would release more than 40 gigatons of carbon dioxide into the atmosphere.

Which is exactly why this new number, 2,795 gigatons, is such a big deal. Think of two degrees Celsius as the legal drinking limit – equivalent to the 0.08 blood-alcohol level below which you might get away with driving home. The 565 gigatons is how many drinks you could have and still stay below that limit – the six beers, say, you might consume in an evening. And the 2,795 gigatons? That’s the three 12-packs the fossil-fuel industry has on the table, already opened and ready to pour.

We have five times as much oil and coal and gas on the books as climate scientists think is safe to burn. We’d have to keep 80 percent of those reserves locked away underground to avoid that fate. Before we knew those numbers, our fate had been likely. Now, barring some massive intervention, it seems certain.

Yes, this coal and gas and oil is still technically in the soil. But it’s already economically aboveground – it’s figured into share prices, companies are borrowing money against it, nations are basing their budgets on the presumed returns from their patrimony. It explains why the big fossil-fuel companies have fought so hard to prevent the regulation of carbon dioxide – those reserves are their primary asset, the holding that gives their companies their value. It’s why they’ve worked so hard these past years to figure out how to unlock the oil in Canada’s tar sands, or how to drill miles beneath the sea, or how to frack the Appalachians.

If you told Exxon or Lukoil that, in order to avoid wrecking the climate, they couldn’t pump out their reserves, the value of their companies would plummet. John Fullerton, a former managing director at JP Morgan who now runs the Capital Institute, calculates that at today’s market value, those 2,795 gigatons of carbon emissions are worth about $27 trillion. Which is to say, if you paid attention to the scientists and kept 80 percent of it underground, you’d be writing off $20 trillion in assets. The numbers aren’t exact, of course, but that carbon bubble makes the housing bubble look small by comparison. It won’t necessarily burst – we might well burn all that carbon, in which case investors will do fine. But if we do, the planet will crater. You can have a healthy fossil-fuel balance sheet, or a relatively healthy planet – but now that we know the numbers, it looks like you can’t have both. Do the math: 2,795 is five times 565. That’s how the story ends.
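
Spelled out, the math in that last line is one division and one multiplication. The short check below uses only the figures quoted above; the straight multiplication gives a bit over $21 trillion, which the article rounds down to $20 trillion.

```python
# The "do the math" line, spelled out with the article's own figures.
reserves_gt = 2795.0   # carbon in proven fossil-fuel reserves (gigatons of CO2)
budget_gt = 565.0      # remaining two-degree allowance (gigatons of CO2)
market_value = 27.0    # trillion dollars, per the Capital Institute estimate above

ratio = reserves_gt / budget_gt              # ~4.9, i.e. "five times"
stranded_share = 1 - budget_gt / reserves_gt # ~80% that would stay underground
write_off = stranded_share * market_value    # ~$21.6 trillion at today's prices

print(f"reserves / budget = {ratio:.1f}")
print(f"share to leave underground = {stranded_share:.0%}")
print(f"implied write-off = ${write_off:.1f} trillion of ${market_value:.0f} trillion")
```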

So far, as I said at the start, environmental efforts to tackle global warming have failed. The planet’s emissions of carbon dioxide continue to soar, especially as developing countries emulate (and supplant) the industries of the West. Even in rich countries, small reductions in emissions offer no sign of the real break with the status quo we’d need to upend the iron logic of these three numbers. Germany is one of the only big countries that has actually tried hard to change its energy mix; on one sunny Saturday in late May, that northern-latitude nation generated nearly half its power from solar panels within its borders. That’s a small miracle – and it demonstrates that we have the technology to solve our problems. But we lack the will. So far, Germany’s the exception; the rule is ever more carbon.

This record of failure means we know a lot about what strategies don’t work. Green groups, for instance, have spent a lot of time trying to change individual lifestyles: the iconic twisty light bulb has been installed by the millions, but so have a new generation of energy-sucking flatscreen TVs. Most of us are fundamentally ambivalent about going green: We like cheap flights to warm places, and we’re certainly not going to give them up if everyone else is still taking them. Since all of us are in some way the beneficiaries of cheap fossil fuel, tackling climate change has been like trying to build a movement against yourself – it’s as if the gay-rights movement had to be constructed entirely from evangelical preachers, or the abolition movement from slaveholders.

People perceive – correctly – that their individual actions will not make a decisive difference in the atmospheric concentration of CO2; by 2010, a poll found that “while recycling is widespread in America and 73 percent of those polled are paying bills online in order to save paper,” only four percent had reduced their utility use and only three percent had purchased hybrid cars. Given a hundred years, you could conceivably change lifestyles enough to matter – but time is precisely what we lack.

A more efficient method, of course, would be to work through the political system, and environmentalists have tried that, too, with the same limited success. They’ve patiently lobbied leaders, trying to convince them of our peril and assuming that politicians would heed the warnings. Sometimes it has seemed to work. Barack Obama, for instance, campaigned more aggressively about climate change than any president before him – the night he won the nomination, he told supporters that his election would mark the moment “the rise of the oceans began to slow and the planet began to heal.” And he has achieved one significant change: a steady increase in the fuel efficiency mandated for automobiles. It’s the kind of measure, adopted a quarter-century ago, that would have helped enormously. But in light of the numbers I’ve just described, it’s obviously a very small start indeed.

At this point, effective action would require actually keeping most of the carbon the fossil-fuel industry wants to burn safely in the soil, not just changing slightly the speed at which it’s burned. And there the president, apparently haunted by the still-echoing cry of “Drill, baby, drill,” has gone out of his way to frack and mine. His secretary of interior, for instance, opened up a huge swath of the Powder River Basin in Wyoming for coal extraction: The total basin contains some 67.5 gigatons worth of carbon (or more than 10 percent of the available atmospheric space). He’s doing the same thing with Arctic and offshore drilling; in fact, as he explained on the stump in March, “You have my word that we will keep drilling everywhere we can… That’s a commitment that I make.” The next day, in a yard full of oil pipe in Cushing, Oklahoma, the president promised to work on wind and solar energy but, at the same time, to speed up fossil-fuel development: “Producing more oil and gas here at home has been, and will continue to be, a critical part of an all-of-the-above energy strategy.” That is, he’s committed to finding even more stock to add to the 2,795-gigaton inventory of unburned carbon.

Sometimes the irony is almost Borat-scale obvious: In early June, Secretary of State Hillary Clinton traveled on a Norwegian research trawler to see firsthand the growing damage from climate change. “Many of the predictions about warming in the Arctic are being surpassed by the actual data,” she said, describing the sight as “sobering.” But the discussions she traveled to Scandinavia to have with other foreign ministers were mostly about how to make sure Western nations get their share of the estimated $9 trillion in oil (that’s more than 90 billion barrels, or 37 gigatons of carbon) that will become accessible as the Arctic ice melts. Last month, the Obama administration indicated that it would give Shell permission to start drilling in sections of the Arctic.

Almost every government with deposits of hydrocarbons straddles the same divide. Canada, for instance, is a liberal democracy renowned for its internationalism – no wonder, then, that it signed on to the Kyoto treaty, promising to cut its carbon emissions substantially by 2012. But the rising price of oil suddenly made the tar sands of Alberta economically attractive – and since, as NASA climatologist James Hansen pointed out in May, they contain as much as 240 gigatons of carbon (or almost half of the available space if we take the 565 limit seriously), that meant Canada’s commitment to Kyoto was nonsense. In December, the Canadian government withdrew from the treaty before it faced fines for failing to meet its commitments.

The same kind of hypocrisy applies across the ideological board: In his speech to the Copenhagen conference, Venezuela’s Hugo Chavez quoted Rosa Luxemburg, Jean-Jacques Rousseau and “Christ the Redeemer,” insisting that “climate change is undoubtedly the most devastating environmental problem of this century.” But the next spring, in the Simon Bolivar Hall of the state-run oil company, he signed an agreement with a consortium of international players to develop the vast Orinoco tar sands as “the most significant engine for a comprehensive development of the entire territory and Venezuelan population.” The Orinoco deposits are larger than Alberta’s – taken together, they’d fill up the whole available atmospheric space.

So: the approaches we have tried so far to tackle global warming have produced only gradual, halting shifts. A rapid, transformative change would require building a movement, and movements require enemies. As John F. Kennedy put it, “The civil rights movement should thank God for Bull Connor. He’s helped it as much as Abraham Lincoln.” And enemies are what climate change has lacked.


But what all these climate numbers make painfully, usefully clear is that the planet does indeed have an enemy – one far more committed to action than governments or individuals. Given this hard math, we need to view the fossil-fuel industry in a new light. It has become a rogue industry, reckless like no other force on Earth. It is Public Enemy Number One to the survival of our planetary civilization. “Lots of companies do rotten things in the course of their business – pay terrible wages, make people work in sweatshops – and we pressure them to change those practices,” says veteran anti-corporate leader Naomi Klein, who is at work on a book about the climate crisis. “But these numbers make clear that with the fossil-fuel industry, wrecking the planet is their business model. It’s what they do.”

According to the Carbon Tracker report, if Exxon burns its current reserves, it would use up more than seven percent of the available atmospheric space between us and the risk of two degrees. BP is just behind, followed by the Russian firm Gazprom, then Chevron, ConocoPhillips and Shell, each of which would fill between three and four percent. Taken together, just these six firms, of the 200 listed in the Carbon Tracker report, would use up more than a quarter of the remaining two-degree budget. Severstal, the Russian mining giant, leads the list of coal companies, followed by firms like BHP Billiton and Peabody. The numbers are simply staggering – this industry, and this industry alone, holds the power to change the physics and chemistry of our planet, and they’re planning to use it.

They’re clearly cognizant of global warming – they employ some of the world’s best scientists, after all, and they’re bidding on all those oil leases made possible by the staggering melt of Arctic ice. And yet they relentlessly search for more hydrocarbons – in early March, Exxon CEO Rex Tillerson told Wall Street analysts that the company plans to spend $37 billion a year through 2016 (about $100 million a day) searching for yet more oil and gas.

There’s not a more reckless man on the planet than Tillerson. Late last month, on the same day the Colorado fires reached their height, he told a New York audience that global warming is real, but dismissed it as an “engineering problem” that has “engineering solutions.” Such as? “Changes to weather patterns that move crop-production areas around – we’ll adapt to that.” This in a week when Kentucky farmers were reporting that corn kernels were “aborting” in record heat, threatening a spike in global food prices. “The fear factor that people want to throw out there to say, ‘We just have to stop this,’ I do not accept,” Tillerson said. Of course not – if he did accept it, he’d have to keep his reserves in the ground. Which would cost him money. It’s not an engineering problem, in other words – it’s a greed problem.

You could argue that this is simply in the nature of these companies – that having found a profitable vein, they’re compelled to keep mining it, more like efficient automatons than people with free will. But as the Supreme Court has made clear, they are people of a sort. In fact, thanks to the size of its bankroll, the fossil-fuel industry has far more free will than the rest of us. These companies don’t simply exist in a world whose hungers they fulfill – they help create the boundaries of that world.

Left to our own devices, citizens might decide to regulate carbon and stop short of the brink; according to a recent poll, nearly two-thirds of Americans would back an international agreement that cut carbon emissions 90 percent by 2050. But we aren’t left to our own devices. The Koch brothers, for instance, have a combined wealth of $50 billion, meaning they trail only Bill Gates on the list of richest Americans. They’ve made most of their money in hydrocarbons, they know any system to regulate carbon would cut those profits, and they reportedly plan to lavish as much as $200 million on this year’s elections. In 2009, for the first time, the U.S. Chamber of Commerce surpassed both the Republican and Democratic National Committees on political spending; the following year, more than 90 percent of the Chamber’s cash went to GOP candidates, many of whom deny the existence of global warming. Not long ago, the Chamber even filed a brief with the EPA urging the agency not to regulate carbon – should the world’s scientists turn out to be right and the planet heats up, the Chamber advised, “populations can acclimatize to warmer climates via a range of behavioral, physiological and technological adaptations.” As radical goes, demanding that we change our physiology seems right up there.

Environmentalists, understandably, have been loath to make the fossil-fuel industry their enemy, respecting its political power and hoping instead to convince these giants that they should turn away from coal, oil and gas and transform themselves more broadly into “energy companies.” Sometimes that strategy appeared to be working – emphasis on appeared. Around the turn of the century, for instance, BP made a brief attempt to restyle itself as “Beyond Petroleum,” adapting a logo that looked like the sun and sticking solar panels on some of its gas stations. But its investments in alternative energy were never more than a tiny fraction of its budget for hydrocarbon exploration, and after a few years, many of those were wound down as new CEOs insisted on returning to the company’s “core business.” In December, BP finally closed its solar division. Shell shut down its solar and wind efforts in 2009. The five biggest oil companies have made more than $1 trillion in profits since the millennium – there’s simply too much money to be made on oil and gas and coal to go chasing after zephyrs and sunbeams.

Much of that profit stems from a single historical accident: Alone among businesses, the fossil-fuel industry is allowed to dump its main waste, carbon dioxide, for free. Nobody else gets that break – if you own a restaurant, you have to pay someone to cart away your trash, since piling it in the street would breed rats. But the fossil-fuel industry is different, and for sound historical reasons: Until a quarter-century ago, almost no one knew that CO2 was dangerous. But now that we understand that carbon is heating the planet and acidifying the oceans, its price becomes the central issue.

If you put a price on carbon, through a direct tax or other methods, it would enlist markets in the fight against global warming. Once Exxon has to pay for the damage its carbon is doing to the atmosphere, the price of its products would rise. Consumers would get a strong signal to use less fossil fuel – every time they stopped at the pump, they’d be reminded that you don’t need a semimilitary vehicle to go to the grocery store. The economic playing field would now be a level one for nonpolluting energy sources. And you could do it all without bankrupting citizens – a so-called “fee-and-dividend” scheme would put a hefty tax on coal and gas and oil, then simply divide up the proceeds, sending everyone in the country a check each month for their share of the added costs of carbon. By switching to cleaner energy sources, most people would actually come out ahead.
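
As a rough illustration of the mechanics, a minimal sketch of how fee-and-dividend washes out for below-average emitters; the fee level and household emissions below are invented for the example, not taken from any actual proposal.

```python
# Toy fee-and-dividend arithmetic with made-up numbers: a per-ton carbon fee
# is collected, and the proceeds are returned as equal checks, so households
# that emit less than average come out ahead.
FEE_PER_TON = 50.0                                   # assumed carbon fee, $/ton
emissions = {"A": 25.0, "B": 15.0, "C": 10.0, "D": 6.0}   # assumed tons/yr per household

revenue = FEE_PER_TON * sum(emissions.values())
dividend = revenue / len(emissions)                  # equal share back to everyone

for name, tons in emissions.items():
    net = dividend - FEE_PER_TON * tons              # positive = comes out ahead
    print(f"household {name}: pays {FEE_PER_TON * tons:7.2f}, "
          f"receives {dividend:7.2f}, net {net:+8.2f}")
```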

There’s only one problem: Putting a price on carbon would reduce the profitability of the fossil-fuel industry. After all, the answer to the question “How high should the price of carbon be?” is “High enough to keep those carbon reserves that would take us past two degrees safely in the ground.” The higher the price on carbon, the more of those reserves would be worthless. The fight, in the end, is about whether the industry will succeed in its fight to keep its special pollution break alive past the point of climate catastrophe, or whether, in the economists’ parlance, we’ll make them internalize those externalities.

It’s not clear, of course, that the power of the fossil-fuel industry can be broken. The U.K. analysts who wrote the Carbon Tracker report and drew attention to these numbers had a relatively modest goal – they simply wanted to remind investors that climate change poses a very real risk to the stock prices of energy companies. Say something so big finally happens (a giant hurricane swamps Manhattan, a megadrought wipes out Midwest agriculture) that even the political power of the industry is inadequate to restrain legislators, who manage to regulate carbon. Suddenly those Chevron reserves would be a lot less valuable, and the stock would tank. Given that risk, the Carbon Tracker report warned investors to lessen their exposure, hedge it with some big plays in alternative energy.

“The regular process of economic evolution is that businesses are left with stranded assets all the time,” says Nick Robins, who runs HSBC’s Climate Change Centre. “Think of film cameras, or typewriters. The question is not whether this will happen. It will. Pension systems have been hit by the dot-com and credit crunch. They’ll be hit by this.” Still, it hasn’t been easy to convince investors, who have shared in the oil industry’s record profits. “The reason you get bubbles,” sighs Leaton, “is that everyone thinks they’re the best analyst – that they’ll go to the edge of the cliff and then jump back when everyone else goes over.”

So pure self-interest probably won’t spark a transformative challenge to fossil fuel. But moral outrage just might – and that’s the real meaning of this new math. It could, plausibly, give rise to a real movement.

Once, in recent corporate history, anger forced an industry to make basic changes. That was the campaign in the 1980s demanding divestment from companies doing business in South Africa. It rose first on college campuses and then spread to municipal and state governments; 155 campuses eventually divested, and by the end of the decade, more than 80 cities, 25 states and 19 counties had taken some form of binding economic action against companies connected to the apartheid regime. “The end of apartheid stands as one of the crowning accomplishments of the past century,” as Archbishop Desmond Tutu put it, “but we would not have succeeded without the help of international pressure,” especially from “the divestment movement of the 1980s.”

The fossil-fuel industry is obviously a tougher opponent, and even if you could force the hand of particular companies, you’d still have to figure out a strategy for dealing with all the sovereign nations that, in effect, act as fossil-fuel companies. But the link for college students is even more obvious in this case. If their college’s endowment portfolio has fossil-fuel stock, then their educations are being subsidized by investments that guarantee they won’t have much of a planet on which to make use of their degree. (The same logic applies to the world’s largest investors, pension funds, which are also theoretically interested in the future – that’s when their members will “enjoy their retirement.”) “Given the severity of the climate crisis, a comparable demand that our institutions dump stock from companies that are destroying the planet would not only be appropriate but effective,” says Bob Massie, a former anti-apartheid activist who helped found the Investor Network on Climate Risk. “The message is simple: We have had enough. We must sever the ties with those who profit from climate change – now.”

Movements rarely have predictable outcomes. But any campaign that weakens the fossil-fuel industry’s political standing clearly increases the chances of retiring its special breaks. Consider President Obama’s signal achievement in the climate fight, the large increase he won in mileage requirements for cars. Scientists, environmentalists and engineers had advocated such policies for decades, but until Detroit came under severe financial pressure, it was politically powerful enough to fend them off. If people come to understand the cold, mathematical truth – that the fossil-fuel industry is systematically undermining the planet’s physical systems – it might weaken it enough to matter politically. Exxon and their ilk might drop their opposition to a fee-and-dividend solution; they might even decide to become true energy companies, this time for real.

Even if such a campaign is possible, however, we may have waited too long to start it. To make a real difference – to keep us under a temperature increase of two degrees – you’d need to change carbon pricing in Washington, and then use that victory to leverage similar shifts around the world. At this point, what happens in the U.S. is most important for how it will influence China and India, where emissions are growing fastest. (In early June, researchers concluded that China has probably under-reported its emissions by up to 20 percent.) The three numbers I’ve described are daunting – they may define an essentially impossible future. But at least they provide intellectual clarity about the greatest challenge humans have ever faced. We know how much we can burn, and we know who’s planning to burn more. Climate change operates on a geological scale and time frame, but it’s not an impersonal force of nature; the more carefully you do the math, the more thoroughly you realize that this is, at bottom, a moral issue; we have met the enemy and they is Shell.

Meanwhile the tide of numbers continues. The week after the Rio conference limped to its conclusion, Arctic sea ice hit the lowest level ever recorded for that date. Last month, on a single weekend, Tropical Storm Debby dumped more than 20 inches of rain on Florida – the earliest the season’s fourth-named cyclone has ever arrived. At the same time, the largest fire in New Mexico history burned on, and the most destructive fire in Colorado’s annals claimed 346 homes in Colorado Springs – breaking a record set the week before in Fort Collins. This month, scientists issued a new study concluding that global warming has dramatically increased the likelihood of severe heat and drought – days after a heat wave across the Plains and Midwest broke records that had stood since the Dust Bowl, threatening this year’s harvest. You want a big number? In the course of this month, a quadrillion kernels of corn need to pollinate across the grain belt, something they can’t do if temperatures remain off the charts. Just like us, our crops are adapted to the Holocene, the 11,000-year period of climatic stability we’re now leaving… in the dust.

This story is from the August 2nd, 2012 issue of Rolling Stone.

Climate models that predict more droughts win further scientific support (Washington Post)

The drought of 2012: It has been more than a half-century since a drought this extensive hit the United States, NOAA reported July 16. The effects are growing and may cost the U.S. economy $50 billion.

By Hristio Boytchev, Published: August 13

The United States will suffer a series of severe droughts in the next two decades, according to a new study published in the journal Nature Climate Change. Moreover, global warming will play an increasingly important role in their frequency and severity, claims Aiguo Dai, the study’s author.

His findings bolster conclusions from climate models used by researchers around the globe that have predicted severe and widespread droughts in coming decades over many land areas. Those models had been questioned because they did not fully reflect actual drought patterns when they were applied to conditions in the past. However, using a statistical method with data about sea surface temperatures, Dai, a climate researcher at the federally funded National Center for Atmospheric Research, found that the model accurately portrayed historic climate events.
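
The validation step Dai describes amounts to checking a model’s hindcast against the observed record before trusting its projections. A toy sketch of that idea, with entirely invented numbers, might look like this:

```python
# Illustrative only: a toy comparison between a "modeled" and an "observed"
# drought index over the historical period. The data are invented; the point
# is that a good hindcast lends a model's projections more credibility.
import numpy as np

years = np.arange(1950, 2011)
modeled = np.sin((years - 1950) / 8.0)        # stands in for the model's hindcast
observed = modeled + np.random.default_rng(0).normal(0, 0.3, years.size)

correlation = np.corrcoef(observed, modeled)[0, 1]
print(f"hindcast correlation 1950-2010: {correlation:.2f}")
```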

“We can now be more confident that the models are correct,” Dai said, “but unfortunately, their predictions are dire.”

In the United States, the main culprit currently is a cold cycle in the surface temperature of the eastern Pacific Ocean. It decreases precipitation, especially over the western part of the country. “We had a similar situation in the Dust Bowl era of the 1930s,” said Dai, who works at the research center’s headquarters in Boulder, Colo.

While current models cannot predict the severity of a drought in a given year, they can assess its probability. “Considering the current trend, I was not surprised by the 2012 drought,” Dai said.

The Pacific cycle is expected to last for the next one or two decades, bringing more aridity. On top of that comes climate change. “Global warming has a subtle effect on drought at the moment,” Dai said, “but by the end of the cold cycle, global warming might take over and continue to cause dryness.”

While the variations in sea temperatures primarily influence precipitation, global warming is expected to bring droughts by increasing evaporation over land. Additionally, Dai predicts more dryness in South America, Southern Europe and Africa.

“The similarity between the observed droughts and the projections from climate models here is striking,” said Peter Cox, a professor of climate system dynamics at Britain’s University of Exeter, who was not involved in Dai’s research. He added that he agrees with the latest models’ suggestion that increasing drought is consistent with man-made climate change.

Computer program mimics human evolution (Fapesp)

Software developed at USP in São Carlos creates and selects programs that generate Decision Trees, tools capable of making predictions. The research won awards in the United States at the world’s largest evolutionary computation event (Wikimedia)

16/08/2012

By Karina Toledo

Agência FAPESP – Decision Trees are computational tools that give machines the ability to make predictions based on the analysis of historical data. The technique can, for example, support medical diagnosis or the risk analysis of financial investments.

But to obtain the best prediction, you need the best Decision Tree-generating program. To reach that goal, researchers at the Instituto de Ciências Matemáticas e de Computação (ICMC) of the Universidade de São Paulo (USP), in São Carlos, drew inspiration from Charles Darwin’s theory of evolution.

“We developed an evolutionary algorithm, that is, one that mimics the process of human evolution to generate solutions,” said Rodrigo Coelho Barros, a doctoral student at the Laboratório de Computação Bioinspirada (BioCom) at ICMC and a FAPESP fellow.

Evolutionary computation, Barros explained, is one of several bio-inspired techniques, that is, techniques that look to nature for solutions to computational problems. “It is remarkable how nature finds solutions to extremely complicated problems. There is no doubt that we need to learn from it,” Barros said.

According to Barros, the software developed during his doctorate is able to automatically create Decision Tree-generating programs. To do so, it performs random crossovers between the code of existing programs, generating “children.”

“These ‘children’ may eventually undergo mutations and evolve. After a while, the evolved Decision Tree-generating programs are expected to become better and better, and our algorithm selects the best of them all,” Barros said.

But while the process of natural selection in the human species takes hundreds or even thousands of years, in computing it lasts only a few hours, depending on the problem to be solved. “We set one hundred generations as the limit of the evolutionary process,” Barros said.
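
To make the idea concrete, here is a deliberately simplified sketch of the same crossover–mutation–selection loop, capped at one hundred generations as in the article. It only evolves hyperparameters of scikit-learn’s standard decision-tree learner, whereas the ICMC system evolves entire tree-induction algorithms, so treat it as an illustration of the principle rather than the published method.

```python
# Illustrative sketch only: a toy genetic search over decision-tree settings,
# showing the crossover / mutation / selection loop with a 100-generation cap.
import random
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

def random_individual():
    # one candidate = one set of decision-tree settings
    return {"criterion": random.choice(["gini", "entropy"]),
            "max_depth": random.randint(1, 10),
            "min_samples_split": random.randint(2, 20)}

def fitness(ind):
    # cross-validated accuracy of the tree built with these settings
    clf = DecisionTreeClassifier(random_state=0, **ind)
    return cross_val_score(clf, X, y, cv=5).mean()

def crossover(a, b):
    # each setting is inherited at random from one of the two parents
    return {k: random.choice([a[k], b[k]]) for k in a}

def mutate(ind, rate=0.2):
    child = dict(ind)
    if random.random() < rate:
        child["max_depth"] = random.randint(1, 10)
    return child

population = [random_individual() for _ in range(20)]
for generation in range(100):               # 100-generation limit, as in the article
    population.sort(key=fitness, reverse=True)
    parents = population[:10]               # keep the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print("best settings:", best, "accuracy:", round(fitness(best), 3))
```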

Artificial intelligence

In Computer Science, a heuristic is the name given to a system’s ability to innovate and develop techniques to reach a given goal.

The software developed by Barros belongs to the field of hyper-heuristics, a recent topic in evolutionary computation whose goal is the automatic generation of heuristics customized for a given application or set of applications.

“It is a preliminary step toward the great goal of artificial intelligence: creating machines capable of developing solutions to problems without being explicitly programmed to do so,” Barros explained.

The work gave rise to the paper “A Hyper-Heuristic Evolutionary Algorithm for Automatically Designing Decision-Tree Algorithms,” which received awards in three categories at the Genetic and Evolutionary Computation Conference (GECCO), the world’s largest evolutionary computation event, held in July in Philadelphia, United States.

In addition to Barros, the paper’s authors are professors André Carlos Ponce de Leon Ferreira de Carvalho, the research advisor at ICMC, Márcio Porto Basgalupp, of the Universidade Federal de São Paulo (Unifesp), and Alex Freitas, of the University of Kent, in the United Kingdom, who took on the role of co-advisor.

The authors were invited to submit the paper to the Evolutionary Computation Journal, published by the Massachusetts Institute of Technology (MIT). “The work will still go through review but, since it was submitted by invitation, it has a good chance of being accepted,” Barros said.

The research, which is expected to be completed only in 2013, also gave rise to a paper published by invitation in the Journal of the Brazilian Computer Society after being chosen as the best paper at the 2011 Encontro Nacional de Inteligência Artificial.

Another paper, presented at the 11th International Conference on Intelligent Systems Design and Applications, held in Spain in 2011, earned an invitation for publication in the journal Neurocomputing.

Search Technology That Can Gauge Opinion and Predict the Future (Science Daily)

ScienceDaily (Aug. 16, 2012) — Inspired by a system for categorising books proposed by an Indian librarian more than 50 years ago, a team of EU-funded researchers have developed a new kind of internet search that takes into account factors such as opinion, bias, context, time and location. The new technology, which could soon be in use commercially, can display trends in public opinion about a topic, company or person over time — and it can even be used to predict the future.

‘Do a search for the word “climate” on Google or another search engine and what you will get back is basically a list of results featuring that word: there’s no categorisation, no specific order, no context. Current search engines do not take into account the dimensions of diversity: factors such as when the information was published, if there is a bias toward one opinion or another inherent in the content and structure, who published it and when,’ explains Fausto Giunchiglia, a professor of computer science at the University of Trento in Italy.

But can search technology be made to identify and embrace diversity? Can a search engine tell you, for example, how public opinion about climate change has changed over the last decade? Or how hot the weather will be a century from now, by aggregating current and past estimates from different sources?

It seems that it can, thanks to a pioneering combination of modern science and a decades-old classification method, brought together by European researchers in the LivingKnowledge project. Supported by EUR 4.8 million in funding from the European Commission, the LivingKnowledge team, coordinated by Prof. Giunchiglia, adopted a multidisciplinary approach to developing new search technology, drawing on fields as diverse as computer science, social science, semiotics and library science.

Indeed, the so-called father of library science, Sirkali Ramamrita Ranganathan, an Indian librarian, served as a source of inspiration for the researchers. In the 1920s and 1930s, Ranganathan developed the first major analytico-synthetic, or faceted, classification system. Using this approach, objects — books, in the case of Ranganathan; web and database content, in the case of the LivingKnowledge team — are assigned multiple characteristics and attributes (facets), enabling the classification to be ordered in multiple ways, rather than in a single, predetermined, taxonomic order. Using the system, an article about the effects on agriculture of climate change written in Norway in 1990 might be classified as ‘Geography; Climate; Climate change; Agriculture; Research; Norway; 1990.’
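
In data-structure terms, faceted classification simply means that each item carries several independent labels that can be filtered or sorted in any combination. The following is a minimal sketch of that idea, not the LivingKnowledge code; the documents and facet names are invented.

```python
# Minimal faceted-classification sketch: each document carries several facets,
# so the same collection can be filtered or ordered along any dimension
# rather than a single fixed taxonomy.
from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    facets: dict = field(default_factory=dict)   # e.g. topic, place, time, opinion

docs = [
    Document("Effects of climate change on agriculture",
             {"topic": "Climate change", "domain": "Agriculture",
              "place": "Norway", "year": 1990, "opinion": "neutral"}),
    Document("Editorial: warming is overstated",
             {"topic": "Climate change", "domain": "Media",
              "place": "USA", "year": 2005, "opinion": "negative"}),
]

def search(documents, **facet_filters):
    """Return documents whose facets match every requested value."""
    return [d for d in documents
            if all(d.facets.get(k) == v for k, v in facet_filters.items())]

print([d.title for d in search(docs, topic="Climate change", place="Norway")])
print(sorted((d.facets["year"], d.title) for d in docs))   # order by the time facet
```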

In order to understand the classification system better and implement it in search engine technology, the LivingKnowledge researchers turned to the Indian Statistical Institute, a project partner, which uses faceted classification on a daily basis.

‘Using their knowledge we were able to turn Ranganathan’s pseudo-algorithm into a computer algorithm and the computer scientists were able to use it to mine data from the web, extract its meaning and context, assign facets to it, and use these to structure the information based on the dimensions of diversity,’ Prof. Giunchiglia says.

Researchers at the University of Pavia in Italy, another partner, drew on their expertise in extracting meaning from web content — not just from text and multimedia content, but also from the way the information is structured and laid out — in order to infer bias and opinions, adding another facet to the data.

‘We are able to identify the bias of authors on a certain subject and whether their opinions are positive or negative,’ the LivingKnowledge coordinator says. ‘Facts are facts, but any information about an event, or on any subject, is often surrounded by opinions and bias.’

From libraries of the 1930s to space travel in 2034…

The technology was implemented in a testbed, now available as open source software, and used for trials based around two intriguing application scenarios.

Working with Austrian social research institute SORA, the team used the LivingKnowledge system to identify social trends and monitor public opinion in both quantitative and qualitative terms. Used for media content analysis, the system could help a company understand the impact of a new advertising campaign, showing how it has affected brand recognition over time and which social groups have been most receptive. Alternatively, a government might use the system to gauge public opinion about a new policy, or a politician could use it to respond in the most publicly acceptable way to a rival candidate’s claims.

With Barcelona Media, a non-profit research foundation supported by Yahoo!, and with the Netherlands-based Internet Memory Foundation, the LivingKnowledge team looked not only at current and past trends, but extrapolated them and drew on forecasts extracted from existing data to try to predict the future. Their Future Predictor application is able to make searches based on questions such as ‘What will oil prices be in 2050?’ or ‘How much will global temperatures rise over the next 100 years?’ and find relevant information and forecasts from today’s web. For example, a search for the year 2034 turns up ‘space travel’ as the most relevant topic indexed in today’s news.
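
One crude way to picture the Future Predictor idea is to index mentions of future years found in today’s text against the statements around them, so a query for a year returns what current documents say about it. The snippet below is an invented toy along those lines, not the project’s software.

```python
# Toy illustration (an assumption, not the project's Future Predictor):
# scan text for future-year mentions and index the sentences that contain
# them, so a query for a year returns today's forecasts about it.
import re
from collections import defaultdict

snippets = [
    "Analysts expect commercial space travel to be routine by 2034.",
    "Models project global temperatures could rise 2 degrees by 2100.",
    "Some forecasts put oil prices far higher in 2050.",
]

index = defaultdict(list)
for text in snippets:
    for year in re.findall(r"\b(20[2-9]\d|21\d\d)\b", text):
        index[int(year)].append(text)

print(index[2034])   # -> the space-travel snippet
```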

‘More immediately, this application scenario provides functionality for detecting trends even before these trends become apparent in daily events — based on integrated search and navigation capabilities for finding diverse, multi-dimensional information depending on content, bias and time,’ Prof. Giunchiglia explains.

Several of the project partners have plans to implement the technology commercially, and the project coordinator intends to set up a non-profit foundation to build on the LivingKnowledge results at a time when demand for this sort of technology is only likely to increase.

As Prof. Giunchiglia points out, Google fundamentally changed the world by providing everyone with access to much of the world’s information, but it did it for people: currently only humans can understand the meaning of all that data, so much so that information overload is a common problem. As we move into a ‘big data’ age in which information about everything and anything is available at the touch of a button, the meaning of that information needs to be understandable not just by humans but also by machines, so quantity must come combined with quality. The LivingKnowledge approach addresses that problem.

‘When we started the project, no one was talking about big data. Now everyone is and there is increasing interest in this sort of technology,’ Prof. Giunchiglia says. ‘The future will be all about big data — we can’t say whether it will be good or bad, but it will certainly be different.’

Global Warming Causes More Extreme Shifts of the Southern Hemisphere’s Largest Rain Band, Study Suggests (Science Daily)

ScienceDaily (Aug. 16, 2012) — The changes will result from the South Pacific rain band responding to greenhouse warming. The South Pacific rain band is the largest and most persistent rain band of the Southern Hemisphere, spanning the Pacific from south of the Equator south-eastward to French Polynesia.

Infrared satellite image obtained with the Geostationary Meteorological Satellite-5. (Credit: NOAA)

Occasionally, the rain band moves northwards towards the Equator by 1000 kilometres, inducing extreme climate events.

The international study, led by CSIRO oceanographer Dr Wenju Cai, focuses on how the frequency of such movement may change in the future. The study finds the frequency will almost double in the next 100 years, with a corresponding intensification of the rain band.

Dr Cai and colleagues turned to the extensive archives of general circulation models submitted for the fourth and fifth IPCC Assessments and found that increases in greenhouse gases are projected to enhance equatorial Pacific warming. In turn, and in spite of disagreement about the future of El Niño events, this warming leads to the increased frequency of extreme excursions of the rain band.

During moderate El Niño events with warming in the equatorial eastern Pacific, the rain band moves north-eastward by 300 kilometres. Countries located within the band’s normal position, such as Vanuatu, Samoa and the southern Cook Islands, experience forest fires and droughts as well as increased frequency of tropical cyclones, whereas countries to which the rain band moves experience extreme floods.

“During extreme El Niño events, such as 1982/83 and 1997/98, the band moved northward by up to 1000 kilometres. The shift brings more severe extremes, including cyclones to regions such as French Polynesia that are not accustomed to such events,” said Dr Cai, a scientist at the Wealth from Oceans Flagship.

“Understanding changes in the frequency of these events as the climate changes proceed is therefore of broad scientific and socio-economic interest.”

A central issue for community adaptation in Australia and across the Pacific is understanding how the warming atmosphere and oceans will influence the intensity and frequency of extreme events. The impact associated with the observed extreme excursions includes massive droughts, severe food shortage, and coral reef mortality through thermally-induced coral bleaching across the South Pacific.

The paper, “More extreme swings of the South Pacific Convergence Zone due to greenhouse warming,” was co-authored by Australian scientists Dr Simon Borlace, Mr Tim Cowan from CSIRO and Drs Scott Power and Jo Brown, two Bureau of Meteorology scientists at the Centre for Australian Weather and Climate Research, who were joined by French, US, UK, and Cook Island scientists.

The research effort from Australian scientists was supported by the Australian Climate Change Science Program, the CSIRO Office of Chief Executive Science Leader program, and the Pacific-Australia Climate Change Science and Adaptation Planning Program.

Organisms Cope With Environmental Uncertainty by Guessing the Future (Science Daily)

ScienceDaily (Aug. 16, 2012) — In uncertain environments, organisms not only react to signals, but also use molecular processes to make guesses about the future, according to a study by Markus Arnoldini et al. from ETH Zurich and Eawag, the Swiss Federal Institute of Aquatic Science and Technology. The authors report in PLoS Computational Biology that if environmental signals are unreliable, organisms are expected to evolve the ability to take random decisions about adapting to cope with adverse situations.

Most organisms live in ever-changing environments, and are at times exposed to adverse conditions that are not preceded by any signal. Examples for such conditions include exposure to chemicals or UV light, sudden weather changes or infections by pathogens. Organisms can adapt to withstand the harmful effects of these stresses. Previous experimental work with microorganisms has reported variability in stress responses between genetically identical individuals. The results of the present study suggest that this variation emerges because individual organisms take random decisions, and such variation is beneficial because it helps organisms to reduce the metabolic costs of protection without compromising the overall benefits.
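
The intuition can be reproduced with a toy “bet-hedging” model: over many uncertain generations, a population in which a random fraction of cells mounts a costly stress response can out-grow both the always-protect and never-protect strategies. All numbers below are assumptions chosen for illustration, not values from the paper.

```python
# Illustrative bet-hedging sketch with assumed numbers, not the paper's model.
import math
import random

P_STRESS = 0.3          # probability a generation is hostile (assumption)
GROWTH_PROTECTED = 1.3  # protected cells grow slowly but survive stress
GROWTH_GOOD = 2.0       # unprotected cells in a benign environment
GROWTH_STRESSED = 0.1   # unprotected cells hit by stress

def long_run_growth(p_protect, generations=50_000, seed=0):
    """Average log growth rate when a fraction p_protect of cells protect."""
    rng = random.Random(seed)
    total_log = 0.0
    for _ in range(generations):
        stressed = rng.random() < P_STRESS
        unprotected_growth = GROWTH_STRESSED if stressed else GROWTH_GOOD
        multiplier = p_protect * GROWTH_PROTECTED + (1 - p_protect) * unprotected_growth
        total_log += math.log(multiplier)
    return total_log / generations

for p in (0.0, 0.5, 0.8, 1.0):
    print(f"fraction protecting {p:.1f}: long-run growth {long_run_growth(p):+.3f}")
# With these assumed costs, an intermediate fraction (around 0.8) beats both
# "never protect" and "always protect".
```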

The theoretical results of this study can help to understand why genetically identical organisms often express different traits, an observation that is not explained by the conventional notion of nature and nurture. Future experiments will reveal whether the predictions made by the mathematical model are met in natural systems.

Nelson Rodrigues and the “Sobrenatural de Almeida” (Portal Entretextos)

11.07.2012

Miguel Carqueija

A master of the “mainstream” also ventured into fantastic territory.

Nelson Rodrigues, whose centenary is being celebrated in 2012, was not only a playwright and short-story writer; he also produced sports columns. For a long time he kept a column in the Rio newspaper “O Globo” — and in those days that daily, now in decline, had good columnists — which changed its name over time, though its main title was “À sombra das chuteiras imortais” (other titles used were “A batalha” and “Os bandeirinhas também são anjos”).
Nelson had a sui generis style, easily recognizable even if he had not signed his pieces. A die-hard Fluminense fan, he was shamelessly partial in his columns. And I, who rooted for Fluminense, read them with pleasure.
An interesting detail is that Nelson, with utter cheek, liked to “prophesy” Fluminense’s victory in the Rio championship of the time. Football in those days was very regional. And, of course, the prophecy came true whenever the club won the championship.
One year, during what looked like a streak of bad luck, Nelson wrote that the supernatural was persecuting Fluminense. Days later the columnist published a “letter” he claimed to have received, which went more or less like this: “On such-and-such a day you said the supernatural is persecuting Fluminense. Well, I am the Supernatural, and I assure you that this is not true, etc.” The “character” closed the missive by guaranteeing that the tricolor would win its next match, and signed it: “Sobrenatural de Almeida.”
Sunday came and Fluminense lost. Outraged, Nelson accused the Sobrenatural de Almeida of having lied shamelessly. Thus began the war of the Fluminense supporters, led by Nelson Rodrigues, against the sinister Sobrenatural de Almeida.
It may seem strange nowadays, to anyone who never knew the charisma of the columnist and playwright who died in 1980, but the fact is that the Sobrenatural de Almeida was, for a while, a genuine craze in the city. Sports reporters talked about him. One match was accompanied by strong winds, which even deflected a ball headed for the goal. “It’s the Sobrenatural de Almeida!” shouted the radio announcer.
A new match came and Fluminense won. Nelson celebrated the victory over the enemy, who had supposedly slunk away from the Maracanã in melancholy. Later, however, for reasons that now escape me, the championship was suspended for a while. Nelson Rodrigues then “received” a phone call from the Sobrenatural de Almeida, who claimed responsibility for the interruption of the championship.
Over time the columnist gave more and more details about the mysterious figure, who in caricatures appeared dressed in black, as “frightening” as Zé do Caixão. According to Nelson, the Supernatural had had his days of glory but now, poor thing, lived in Irajá and rode the Central railway trains. That is why he would even arrive late at the Maracanã, only then beginning to interfere.
This Sobrenatural de Almeida fever lasted weeks, months, but eventually wore thin and Nelson finally stopped mentioning him. Still, in a way, it was the journalist’s contribution to our fantastic literature.

Calgary hail storm: Cloud seeding credited for sparing city from worse disaster (The Calgary Herald)

‘The storm was a monster,’ says weather modification company

BY THANDI FLETCHER, CALGARY HERALD AUGUST 14, 2012

Paul Newell captured dramatic images in the Bearspaw area of northwest Calgary just before the start of the hailstorm on Sunday, Aug. 12, 2012. Photograph by: Paul Newell (reader photo)

A ferocious storm that hammered parts of Calgary with hail stones larger than golf balls late Sunday, causing millions of dollars worth of damage, could have been much worse if cloud-seeding planes hadn’t attempted to calm it down.

“The storm was a monster,” said Terry Krauss, project director of the Alberta Severe Weather Management Society, which contracts American-based company Weather Modification Inc. to seed severe weather clouds in Alberta’s skies. The society is funded by a group of insurance companies with a goal of reducing hail damage claims.

Before the storm hit, Krauss said, the company sent all four of its cloud-seeding aircraft into the thick and swirling black clouds. The planes flew for more than 12 hours, shooting silver iodide, a chemical agent that helps limit the size of hail stones, at the top and base of the clouds, until midnight.

But despite the heavy seeding, golf-ball-sized hail stones pelted parts of Calgary late Sunday night, causing widespread damage to cars and homes.

“This one was a beast. It took everything we threw at it and still was able to wreak some havoc,” said Krauss. “I believe if we hadn’t seeded, it would have even been worse.”

Northeast Calgary, where hail stones measured between five and six centimetres, was hit hardest by the storm, said Environment Canada meteorologist John Paul Craig. Other parts of the city saw toonie-sized hail from a second storm system, he said.

Craig said Sunday’s storm was worse than Calgary’s last major hailstorm, which saw four-centimetre hail stones, in July 2010.

“These hail stones were just a little bit bigger,” he said.

At Royal Oak Audi in the city’s northwest, broken glass from smashed windows littered the lot Monday morning. Of the 85 new and used cars on the lot, general manager Murray Dorren said not a single car was spared from the storm.

“It’s devastating — that’s probably the best word I can come up with,” he said. “It’s unbelievable that Mother Nature can do this much damage in a very short time. I think it probably took a matter of 10 minutes and there’s millions of dollars worth of damage.”

Dorren estimated the damage at about $2 million. Across the lot, the dinged-up vehicles looked like dimpled golf balls from the repetitive pounding of the sizable stones. Some windows and sunroofs were shattered, while others were pierced by the heavy hail.

“They look like bullet holes right through the windscreen,” salesman Nick Berkland said of the damage.

Insurance companies and brokers were inundated with calls all day as customers tried to file claims on their wrecked cars and homes.

Ron Biggs, claims director for Intact Insurance, said it’s too early to tell how many claims the hail event will spur, although he said they received about two to three times their normal call volume on Monday.

Biggs said the level of damage so far appears to be similar to the July 2010 hailstorm, when Intact received about 12,000 hail damage claims.

Chief operating officer Bruce Rabik of Rogers Insurance, which insures several car dealerships in Calgary, said the damage is extensive.

“It’s certainly a bad one,” he said. “We’ve had one dealership, which they estimate 600 damaged cars. A couple other dealerships with 200 damaged cars each.”

Rabik said claims adjusters are overwhelmed with the volume of claims. He urged customers to be patient as it may take a day or two as insurance workers make their way to each home.

Shredded leaves, twigs and broken branches blanketed pathways along the Bow and Elbow rivers as city crews worked to clear them, said Calgary parks pathway lead Duane Sutherland.

“This was the worst that I’ve seen,” said Sutherland.

Once daylight broke Monday, Royal Oak resident Satya Mudlair inspected the exterior of his home, which was riddled with damage. “Lots of holes in the siding, window damage to the two bedroom windows, and the roof a little bit,” he said.

The apple tree in his backyard has also lost about half its apples, he said. Fortunately, his car was parked inside the garage and was spared any dents.

Mudlair said his insurance company told him it would take two or three weeks before the damage would be repaired. “There’s a big pile of names ahead of me,” he said.

Mudlair’s wife, Nirmalla, had just fallen asleep when she was awoken by the sound of hail stones hitting the roof.

“It was very bad. It was like, thump, thump,” she described the pelting sound. “We got scared and I kept running from room to room.”

Cloud-seeding expert Krauss said Calgary has experienced more severe weather than usual this year, although Sunday’s storm was by far the worst.

“It has been a very stormy year,” he said.

© Copyright (c) The Calgary Herald

Post Normal Science: Deadlines (Climate Etc.)

Posted on August 3, 2012

by Steven Mosher

Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.

Ravetz describes a post normal situation by the following criteria:

  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required

The difference between Kuhnian normal science, or the behavior of those doing science under normal conditions, and post normal science is best illustrated by example. Consider the recent discovery of the Higgs boson. Facts were uncertain – they always are to a degree; no values were in conflict; the stakes were not high; and immediate action was not required. What we see in that situation is those doing science acting as we expect them to, according to our vague ideal of science. Because facts are uncertain, they listen to various conflicting theories. They try to put those theories to a test. They face a shared uncertainty and in good faith accept the questions and doubts of others interested in the same field. Their participation in politics is limited to asking for money. Because values are not in conflict, no theorist takes the time to investigate his opponent’s views on evolution or smoking or taxation. Because the field of personal values is never in play, personal attacks are minimized. Personal pride may be at stake, but values rarely are. The stakes for humanity in the discovery of the Higgs are low: at least no one argues that our future depends upon the outcome. No scientist straps himself to the collider and demands that it be shut down. And finally, immediate action is not required; under no theory is the settling of the uncertainty so important as to rush the result. In normal science, according to Kuhn, we can view the behavior of those doing science as puzzle solving. The details of a paradigm are filled out slowly and deliberately.

The situation in climate science is close to the polar opposite of this. That is not meant as, and should not be construed as, a criticism of climate science or its claims. The simple point is this: in a PNS situation, the behavior of those doing science changes. To be sure, much of their behavior remains the same. They formulate theories, they collect data, and they test their theories against the data. They don’t stop doing what we notionally describe as science. But, as foreshadowed above in the description of how high-energy particle physicists behave, one can see how that behavior changes in a PNS situation. There is uncertainty, but the good faith that exists in normal science – the faith that other people are asking questions because they actually want the answer – is gone. Asking questions, raising doubts, asking to see proof becomes suspect in and of itself. And those doing science are faced with a question that science cannot answer: does this person really want the answer, or are they a merchant of doubt? Such a question never gets asked in normal science. Normal science doesn’t ask this question because science cannot answer it.

Because values are in conflict, the behavior of those doing science changes. In normal science no one would care if Higgs was a Christian or an atheist. No one would care if he voted liberal or conservative; but because two different value systems are in conflict in climate science, the behavior of those doing science changes. They investigate each other. They question motives. They form tribes. And because the stakes are high, the behavior of those doing science changes as well. They protest; they take money from lobby groups on both sides; and, worst of all, they perform horrendous raps on YouTube. In short, they become human, while those around them canonize or demonize them and their findings become iconized or branded as hoaxes.

This brings us to the last aspect of a PNS situation: immediate action is required. This is perhaps the most contentious aspect of PNS; in fact, I would argue it is the defining characteristic. In all PNS situations it is almost always the case that one side sees the need for action, given the truth of their theory, while the doubters must of necessity see no need for immediate action. They must see no need for immediate action because their values are at risk and because the stakes are high. Another way to put this is as follows. When you are in a PNS situation, all sides must deny it. Those demanding immediate action deny it by claiming more certainty* than is present; those refusing immediate action do so by increasing demands for certainty. This leads to a centralization and valorization of the topic of uncertainty, and epistemology becomes a topic of discussion for those doing science. That is decidedly not normal science.

The demand for immediate action, however, is broader than simply a demand that society change. In a PNS situation the behavior of those doing science changes. One of the clearest signs that you are in PNS is the change in behavior around deadlines. Normal science has no deadline. In normal science, the puzzle is solved when it is solved. In normal science there may be a deadline to shut down the collider for maintenance. Nobody rushes the report to keep the collider running longer than it should. And if a good result is found, the schedules can be changed to accommodate the science. Broadly speaking, science drives the schedule; the schedule doesn’t drive the science.

The climategate mails are instructive here. As one reads through the mails it’s clear that the behavior of those doing science is not what one would call disinterested, patient puzzle solving. Human beings acting in a situation where values are in conflict and stakes are high will engage in behavior that they might not otherwise. Those changes are most evident in situations surrounding deadlines. The point here is not to rehash The Crutape Letters but rather to look again at one incident (there are others, notably around congressional hearings) where deadlines came into play. The deadline in question was the deadline for submitting papers for consideration. As covered in The Crutape Letters and in The Hockey Stick Illusion, the actions taken by those doing science around the “Jesus paper” are instructive. In fact, were I to rewrite The Crutape Letters I would do it from the perspective of PNS, focusing on how the behavior of those doing science deviated from the ideals of openness, transparency and letting truth come on its own good time.

Climategate is about FOIA. There were two critical paths for FOIA: one sought data, the other sought the emails of scientists. Not quite normal. Not normal in that data is usually shared; not normal in that we normally respect the privacy of those doing science. But this is PNS, and all bets are off. Values and practices from other fields, such as business and government, are imported into the culture of science: data hoarding is defended using IP and confidentiality agreements; demanding private mail is defended using values imported from performing business for the public. In short, one sign that a science is post normal is the attempt to import values and procedures from related disciplines. Put another way, PNS poses the question of governance: who runs science, and how should they run it?

The “Jesus paper” in a nutshell can be explained as follows. McIntyre and McKitrick had a paper published at the beginning of 2005. That paper needed to be rebutted in order to make Briffa’s job of writing chapter 6 easier. However, there was a deadline in play. Papers had to be accepted by a date certain. At one point Stephen Schneider suggested the creation of a new category, a novelty – “provisionally accepted” – so that the “Jesus paper” could make the deadline. McIntyre covers the issue here. One need not re-adjudicate whether or not the IPCC rules were broken. And further, these rules have nothing whatsoever to do with the truth of the claims in that paper. This is not about the truth of the science. What is important is the importation of the concept of a deadline into the search for truth. What is important is that the behavior of those doing science changes. Truth suddenly cares about a date. Immediate action is required. In this case immediate action is taken to see to it that the paper makes it into the chapter. Normal science takes no notice of deadlines. In PNS, deadlines matter.

Last week we saw another example of deadlines and high stakes changing the behavior of those doing science. The backstory here explains it. It appears to me that the behavior of those involved changed from what I have known it to be. It changed because they perceived that immediate action was required. A deadline had to be met. Again, as with the Jesus paper, the facts surrounding the release do not go to the truth of the claims. In normal science, a rushed claim might very well get the same treatment as an unrushed claim: it will be evaluated on its merits. In PNS, either the rush to meet an IPCC deadline – as in the case of the Jesus paper – or the rush to be ready for Congress – as in the Watts case – is enough for some to doubt the science. What has been testified to in Congress by Christy, a co-author, may very well be true. But in this high-stakes arena, where facts are uncertain and values are in conflict, the behavior of those doing science can and does change. Not all their behavior changes. They still observe and test and report. But the manner in which they do that changes. Results are rushed and data is held in secret. Deadlines change everything. Normal science doesn’t operate this way; if it does, quality can suffer. And yet the demand for more certainty than is needed – the bad-faith game of delaying action by asking questions – precludes a naïve return to science without deadlines.

The solution that Ravetz suggests is extended peer review and a recognition of the importance of quality. In truth, the way out of a PNS situation is not that simple. The first step out of a PNS situation is the recognition that one is in the situation to begin with. Today, few people embroiled in this debate would admit that the situation has changed how they would normally behave. An admission that this isn’t working is a cultural crisis for science. No one has the standing to describe how one should conduct science in a PNS situation. No one has the standing to chart the path out of a PNS situation. The best we can do is describe what we see. Today, I observe that deadlines change the behavior of those doing science. We see that in climategate; we see that in the events of the past week. That doesn’t entail anything about the truth of science performed under pressure. But it should make us pause and consider whether truth will be found any faster by rushing the results and hiding the data.

*I circulated a copy of this to Michael Tobis to get his reaction. MT took issue with this characterization. MT, I believe, originated the argument that our uncertainty is a reason for action. It is true that while certainty about the science has been the dominant piece of the rhetoric, there has been a second thread of rhetoric that bases action in the uncertainty about sensitivity. I would call this certainty shifting. While the uncertainty about the facts of sensitivity is accepted in this line of argument, the certainty is shifted to certainty about values and certainty about impacts. In short, the argument becomes that while we are uncertain about sensitivity, the certainty we have about large impacts and trans-generational obligations necessitates action.

Scientists struggle with limits – and risks – of advocacy (eenews.net)

Monday, July 9, 2012

Paul Voosen, E&E reporter

Jon Krosnick has seen the frustration etched into the faces of climate scientists.

For 15 years, Krosnick has charted the rising public belief in global warming. Yet, as the field’s implications became clearer, action has remained elusive. Science seemed to hit the limits of its influence. It is a result that has prompted some researchers to cross their world’s no man’s land — from advice to activism.

As Krosnick has watched climate scientists call for government action, he began pondering a recent small dip in the public’s belief. And he wondered: Could researchers’ move into the political world be undermining their scientific message?

Jon Krosnick
Stanford’s Jon Krosnick has been studying the public’s belief in climate change for 15 years, but only recently did he decide to probe their reaction to scientists’ advocacy. Photo courtesy of Jon Krosnick.

“What if a message involves two different topics, one trustworthy and one not trustworthy?” said Krosnick, a communication and psychology professor at Stanford University. “Can the general public detect crossing that line?”

His results, not yet published, would seem to say they can.

Using a national survey, Krosnick has found that, among low-income and low-education respondents, climate scientists suffered damage to their trustworthiness and credibility when they veered from describing science into calling viewers to ask the government to halt global warming. And not only did trust in the messenger fall — even the viewers’ belief in the reality of human-caused warming dropped steeply.

It is a warning that, even as the frustration of inaction mounts and the politicization of climate science deepens, researchers must be careful in getting off the political sidelines.

“The advice that comes out of this work is that all of us, when we claim to have expertise and offer opinions on matters [in the world], need to be guarded about how far we’re willing to go,” Krosnick said. Speculation, he added, “could compromise everything.”

Krosnick’s survey is just the latest social science revelation that has reordered how natural scientists understand their role in the world. Many of these lessons have stemmed from the public’s and politicians’ reactions to climate change, which has provided a case study of how science communication works and doesn’t work. Complexity, these researchers have found, does not stop at their discipline’s verge.

For decades, most members of the natural sciences held a simple belief that the public stood lost, holding out empty mental buckets for researchers to fill with knowledge, if they could only get through to them. But, it turns out, not only are those buckets already full with a mix of ideology and cultural belief, but it is incredibly fraught, and perhaps ineffective, for scientists to suggest where those contents should be tossed.

It’s been a difficult lesson for researchers.

“Many of us have been saddened that the world has done so little about it,” said Richard Somerville, a meteorologist at the Scripps Institution of Oceanography and former author of the United Nations’ authoritative report on climate change.

“A lot of physical climate scientists, myself included, have in the past not been knowledgeable about what the social sciences have been saying,” he added. “People who know a lot about the science of communication … [are] on board now. But we just don’t see that reflected in the policy process.”

While not as outspoken as NASA’s James Hansen, who has taken a high-profile moral stand alongside groups like 350.org and Greenpeace, Somerville has been a leader in bringing scientists together to call for greenhouse gas reductions. He helped organize the 2007 Bali declaration, a pointed letter from more than 200 scientists urging negotiators to limit global CO2 levels well below 450 parts per million.

Such declarations, in the end, have done little, Somerville said.

“If you look at the effect this has had on the policy process, it is very, very small,” he said.

This failed influence has spurred scientists like Somerville to partner closely with social scientists, seeking to understand why their message has failed. It is an effort that received a seal of approval this spring, when the National Academy of Sciences, the nation’s premier research body, hosted a two-day meeting on the science of science communication. Many of those sessions pivoted on public views of climate change.

It’s a discussion that’s been long overdue. When it comes to how the public learns about expert opinions, assumptions mostly rule in the sciences, said Dan Kahan, a professor of law and psychology at Yale Law School.

“Scientists are filled with conjectures that are plausible about how people make sense about information,” Kahan said, “only some fraction of which [are] correct.”

Shifting dynamic

Krosnick’s work began with a simple, hypothetical scene: NASA’s Hansen, whose scientific work on climate change is widely respected, walks into the Oval Office.

As he has since the 1980s, Hansen rattles off the incontrovertible, ever-increasing evidence of human-caused climate change. It’s a stunning litany, authoritative in scope, and one the fictional president — be it a Bush or an Obama — must judge against Hansen’s scientific credentials, backed by publications and institutions of the highest order. If Hansen stops there, one might think, the case is made.

But he doesn’t stop. Hansen continues, arguing, as a citizen, for an immediate carbon tax.

“Whoa, there!” Krosnick’s president might think. “He’s crossed into my domain, and he’s out of touch with how policy works.” And if Hansen is willing to offer opinions where he lacks expertise, the president starts to wonder: “Can I trust any of his work?”

Richard Somerville
Part of Scripps’ legendary climate team — Charles David Keeling was an early mentor — Richard Somerville helped organize the 2007 Bali declaration by climate scientists, calling for government action on CO2 emissions. Photo by Sylvia Bal Somerville.

Researchers have studied the process of persuasion for 50 years, Krosnick said. Over that time, a few vital truths have emerged, including that trust in a source matters. But looking back over past work, Krosnick found no answer to this question. The treatment was simplistic. Messengers were either trustworthy or not. No one had considered the case of two messages, one trusted and one shaky, from the same person.

The advocacy of climate scientists provided an excellent path into this shifting dynamic.

Krosnick’s team hunted down video of climate scientists first discussing the science of climate change and then, in the same interview, calling for viewers to pressure the government to act on global warming. (Out of fears of bruised feelings, Krosnick won’t disclose the specific scientists cited.) They cut the video in two edits: one showing only the science, and one showing the science and then the call to arms.

Krosnick then showed a nationally representative sample of 793 Americans one of three videos: the science-only cut, the science-plus-politics cut, and a control video about baking meatloaf (the latter being closer to politics than Krosnick might admit). The viewers were then asked a series of questions both about their opinion of the scientist’s credibility and their overall beliefs on global warming.

For a cohort of 548 respondents who either had a household income under $50,000 or no more than a high school diploma, the results were stunning and statistically significant. Across the board, the move into politics undermined the science.

The viewers’ trust in the scientist dropped 16 percentage points, from 48 to 32 percent. Their belief in the scientist’s accuracy fell from 47 to 36 percent. Their overall trust in all scientists went from 60 to 52 percent. Their belief that government should “do a lot” to stop warming fell from 62 to 49 percent. And their belief that humans have caused climate change fell 14 percentage points, from 81 to 67 percent.
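
The article calls these drops statistically significant without naming the test. One conventional check for a gap of this size is a two-proportion z-test; the sketch below applies it to the reported trust figures (48 versus 32 percent), assuming purely for illustration that the 548 low-income or low-education respondents were split roughly evenly across the three video conditions, a detail the article does not give.

```python
from math import sqrt, erfc

def two_proportion_ztest(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))                      # two-sided tail probability
    return z, p_value

# Reported trust in the scientist: 48% after the science-only cut vs. 32% after the
# science-plus-politics cut. The per-condition sample size (~183, a third of the
# 548-person subgroup) is an assumption for illustration, not a figure from the article.
z, p = two_proportion_ztest(0.48, 183, 0.32, 183)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```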

Krosnick is quick to note the study’s caveats. First, educated or wealthy viewers had no significant reaction to the political call and seemed able to parse the difference between science and a personal political view. The underlying reasons for the drop are far from clear, as well — it could simply be a function of climate change’s politicization. And far more testing needs to be done to see whether this applies in other contexts.

With further evidence, though, the implications could be widespread, Krosnick said.

“Is it the case that the principle might apply broadly?” he asked. “Absolutely.”

‘Fraught with misadventure’

Krosnick’s study is likely rigorous and useful — he is known for his careful methods — but it still carries with it a simple, possibly misleading frame, several scientists said.

Most of all, it remains hooked to a premise that words float straight from the scientist’s lips to the public’s ears. The idea that people learn from scientists at all or that they are simply misunderstanding scientific conclusions is not how reality works, Yale’s Kahan said.

“The thing that goes into the ear is fraught with misadventure,” he said.

Kahan has been at the forefront of charting how the empty-bucket theory of science communication — called the deficit model — fails. People interpret new information within the context of their own cultural beliefs, peers and politics. They use their reasoning to pick the evidence that supports their views, rather than the other way around. Indeed, recent work by Kahan found that higher-educated respondents were more likely to be polarized than their less-educated peers.

Krosnick’s study will surely spur new investigations, Kahan said, though he resisted definite remarks until he could see the final work. Even if the study’s conditions aren’t fully realistic, he said, a simple model can have “plenty of implications for all kinds of ways [in] which people become exposed to science.”

The survey sits well with other research in the field and carries an implication about what role scientists should play in scientific debates, added Matthew Nisbet, a communication professor at American University.

“As soon as you start talking about a policy option, you’re presenting information that is potentially threatening to people’s values or identity,” he said. The public, he added, doesn’t “view scientists and scientific information in a vacuum.”

The deficit model has remained an enduring frame for scientists, many of whom are just becoming aware of social science work on the problem. Kahan compares it to the stages of grief. The first stage was that the truth just needs to be broadcast to change minds. The second, and one still influential in the scientific world, is that if the message is just simplified, and the right images used, then the deficit will be filled.

“That too, I think, is a stage of misperception about how this works,” Kahan said.

Take the hand-wringing about science education that accompanied a recent poll finding that 46 percent of Americans believed in a creationist origin for humans. It’s a result that speaks to belief, not an understanding of evolution. Many surveyed who believed in evolution would still fail to explain natural selection, mutation or genetic variance, Kahan said, just as they don’t have to understand relativity to use their GPS.

Much of science doesn’t run up against the public’s belief systems and is accepted with little fuss. It’s not as if Louis Pasteur had to sell pasteurization by using slick images of children getting sick; for nearly all of society, it was simply a useful tool. People want to defer to the experts, as long as they don’t have to concede their beliefs on the way.

“People know what’s known without having a comprehension of why that’s the truth,” Kahan said.

There remains a danger in the emerging consensus that all scientific knowledge is filtered by the motivated reasoning of political and cultural ideology, Nisbet added. Not all people can be sorted by two, or even four, variables.

“In the new ideological deficit model, we tend to assume that failures in communication are caused by conservative media and conservative psychology,” he said. “The danger in this model is that we define the public in exclusively binary terms, as liberals versus conservatives, deniers versus believers.”

‘Crossing that line’

So why do climate scientists, more than most fields, cross the line into advocacy?

Most of all, it’s because their scientific work tells them the problem is so pressing, and time dependent, given the centuries-long life span of CO2 emissions, Somerville said.

“You get to the point where the emissions are large enough that you’ve run out of options,” he said. “You can no longer limit [it]. … We may be at that point already.”

There may also be less friction for scientists to suggest communal solutions to warming because, as Nisbet’s work has found, scientists tend to skew more liberal than the general population, with more than 50 percent of one U.S. science society self-identifying as “liberal.” Given this outlook, they are more likely to accept efforts like cap and trade, a policy that, in implying a “cap” on activity, rubbed conservatives the wrong way.

Dan Kahan
A prolific law professor and psychologist at Yale, Dan Kahan has been charting how the public comes to, and understands, science. Photo courtesy of Dan Kahan.

“Not a lot of scientists would question if this is an effective policy,” Nisbet said.

It is not that scientists are unaware that they are moving into policy prescription, either. Most would intuitively know the line between their work and its political implications.

“I think many are aware when they’re crossing that line,” said Roger Pielke Jr., an environmental studies professor at the University of Colorado, Boulder, “but they’re not aware of the consequences [of] doing so.”

This willingness to cross into advocacy could also stem from the fact that it is the next logical skirmish. The battle for public opinion on the reality of human-driven climate change is already over, Pielke said, “and it’s been won … by the people calling for action.”

While there are slight fluctuations in public belief, in general a large majority of Americans side with what scientists say about the existence and causes of climate change. It’s not unanimous, he said, but it’s larger than the numbers who supported actions like the Montreal Protocol, the bank bailout or the Iraq War.

What has shifted has been its politicization: As more Republicans have begun to disbelieve global warming, Democrats have rallied to reinforce the science. And none of it is about the actual science, of course. It’s a fact Scripps’ Somerville now understands. It’s a code, speaking for fear of the policies that could happen if the science is accepted.

Doubters of warming don’t just hear the science. A policy is attached to it in their minds.

“Here’s a fact,” Pielke said. “And you have to change your entire lifestyle.”

For all the focus on how scientists talk to the public — whether Hansen has helped or hurt his cause — Yale’s Kahan ultimately thinks the discussion will mean very little. Ask most of the public who Hansen is, and they’ll mention something about the Muppets. It can be hard to accept, for scientists and journalists, but their efforts at communication are often of little consequence, he said.

“They’re not the primary source of information,” Kahan said.

‘A credible voice’

Like many of his peers, Somerville has suffered for his acts of advocacy.

“We all get hate email,” he said. “I’ve given congressional testimony and been denounced as an arrogant elitist hiding behind a discredited organization. Every time I’m on national news, I get a spike in ugly email. … I’ve received death threats.”

There are also pressures within the scientific community. As an elder statesman, Somerville does not have to worry about his career. But he tells young scientists to keep their heads down, working on technical papers. There is peer pressure to stay out of politics, a tension felt even by Somerville’s friend, the late Stephen Schneider, also at Stanford, who was long one of the country’s premier speakers on climate science.

He was publicly lauded, but many in the climate science community grumbled, Somerville said, that Schneider should “stop being a motormouth and start publishing technical papers.”

But there is a reason tradition has sustained the distinction between advising policymakers and picking solutions, one Krosnick’s work seems to ratify, said Michael Mann, a climatologist at Pennsylvania State University and a longtime target of climate contrarians.

“It is thoroughly appropriate, as a scientist, to discuss how our scientific understanding informs matters of policy, but … we should stop short of trying to prescribe policy,” Mann said. “This distinction is, in my view, absolutely critical.”

Somerville still supports the right of scientists to speak out as concerned citizens, as he has done, and as his friend, NASA’s Hansen, has done more stridently, protesting projects like the Keystone XL pipeline. As long as great care is taken to separate the facts from the political opinion, scientists should speak their minds.

“I don’t think being a scientist deprives you of the right to have a viewpoint,” he said.

Somerville often returns to a quote from the late Sherwood Rowland, a Nobel laureate from the University of California, Irvine, who discovered the threat chlorofluorocarbons posed to ozone: “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”

Somerville asked Rowland several times whether the same held for global warming.

“Yes, absolutely,” he replied.

It’s an argument that Krosnick has heard from his own friends in climate science. But often this fine distinction gets lost in translation, as advocacy groups present the scientist’s personal message as the message of “science.” It’s alluring to offer advice — Krosnick feels it himself when reporters call — but restraint may need to rule.

“In order to preserve a credible voice in public dialogue,” Krosnick said, “it might be that scientists such as myself need to restrain ourselves from speaking as public citizens.”

Broader efforts of communication, beyond scientists, could still mobilize the public, Nisbet said. Leave aside the third of the population who are in denial or alarmed about climate change, he said, and figure out how to make it relevant to the ambivalent middle.

“We have yet to really do that on climate change,” he said.

Somerville is continuing his efforts to improve communication from scientists. Another Bali declaration is unlikely, though. What he’d really like to do is get trusted messengers from different moral realms beyond science — leaders like the Dalai Lama — to speak repeatedly on climate change.

It’s all Somerville can do. It would be too painful to accept the other option, that climate change is like racism, war or poverty — problems the world has never abolished.

“[It] may well be that it is a problem that is too difficult for humanity to solve,” he said.

Mapping the Future of Climate Change in Africa (Science Daily)

ScienceDaily (Aug. 2, 2012) — Our planet’s changing climate is devastating communities in Africa through droughts, floods and myriad other disasters.

Children in the foothills of Drakensberg mountains in South Africa who still live in traditional rondavels on family homesteads. (Credit: Todd G. Smith, CCAPS Program)

Using detailed regional climate models and geographic information systems, researchers with the Climate Change and African Political Stability (CCAPS) program developed an online mapping tool that analyzes how climate and other forces interact to threaten the security of African communities.

The program was piloted by the Robert S. Strauss Center for International Security and Law at The University of Texas at Austin in 2009 after receiving a $7.6 million five-year grant from the Minerva Initiative with the Department of Defense, according to Francis J. Gavin, professor of international affairs and director of the Strauss Center.

“The first goal was to look at whether we could more effectively identify what were the causes and locations of vulnerability in Africa, not just climate, but other kinds of vulnerability,” Gavin said.

CCAPS comprises nine research teams focusing on various aspects of climate change, their relationship to different types of conflict, the government structures that exist to mitigate them, and the effectiveness of international aid in intervening. Although most CCAPS researchers are based at The University of Texas at Austin, the Strauss Center also works closely with Trinity College Dublin, the College of William and Mary, and the University of North Texas.

“In the beginning these all began as related, but not intimately connected, topics,” Gavin said, “and one of the really impressive things about the project is how all these different streams have come together.”

Africa is particularly vulnerable to the effects of climate change due to its reliance on rain-fed agriculture and the inability of many of its governments to help communities in times of need.

The region is of increasing importance for U.S. national security, according to Gavin, because of the growth of its population, economic strength and resource importance, and also due to concerns about non-state actors, weakening governments and humanitarian disasters.

Although these issues are too complex to yield a direct causal link between climate change and security concerns, he said, understanding the levels of vulnerability that exist is crucial in comprehending the full effect of this changing paradigm.

The vulnerability mapping program within CCAPS is led by Joshua Busby, assistant professor at the Lyndon B. Johnson School of Public Affairs.

To determine the vulnerability of a given location based on changing climate conditions, Busby and his team looked at four different sources: 1) the degree of physical exposure to climate hazards, 2) population size, 3) household or community resilience, and 4) the quality of governance or presence of political violence.

The first source records the different types of climate hazards which could occur in the area, including droughts, floods, wildfires, storms and coastal inundation. However, their presence alone is not enough to qualify a region as vulnerable.

The second source — population size — determines the number of people who will be impacted by these climate hazards. More people create more demand for resources, potentially making the entire population more vulnerable.

The third source looks at how resilient a community is to adverse effects, analyzing the quality of their education and health, as well as whether they have easy access to food, water and health care.

“If exposure is really bad, it may exceed the capacity of local communities to protect themselves,” Busby said, “and then it comes down to whether or not the governments are going to be willing or able to help them.”

The final source accounts for the effectiveness of a given government, the amount of accountability present, how integrated it is with the international community, how politically stable it is, and whether there is any political violence present.

Busby and his team combined the four sources of vulnerability and gave them each equal weight, adding them together to form a composite map. Their scores were then divided into a ranking of five equal parts, or quintiles, going from the 20 percent of regions with the lowest vulnerability to the 20 percent with the highest.
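
A minimal sketch of that aggregation step, assuming four component scores already normalized to a common 0-1 scale: average them with equal weight, then bin the composite into quintiles. The component names and values below are invented for illustration and are not CCAPS data.

```python
import numpy as np

# Hypothetical component scores for five locations, each already normalized to a
# 0-1 scale where higher means more vulnerable. Invented values, not CCAPS data.
components = {
    "climate_hazard_exposure":  np.array([0.9, 0.2, 0.6, 0.4, 0.8]),
    "population":               np.array([0.7, 0.1, 0.5, 0.9, 0.3]),
    "low_household_resilience": np.array([0.8, 0.3, 0.4, 0.6, 0.7]),
    "weak_governance_violence": np.array([0.6, 0.2, 0.7, 0.5, 0.9]),
}

# Equal weighting of the four sources, as the article describes.
composite = np.mean(np.vstack(list(components.values())), axis=0)

# Bin the composite into quintiles: 1 = least vulnerable 20%, 5 = most vulnerable 20%.
quintile_edges = np.quantile(composite, [0.2, 0.4, 0.6, 0.8])
quintile = np.digitize(composite, quintile_edges) + 1

for score, q in zip(composite, quintile):
    print(f"composite = {score:.2f} -> quintile {q}")
```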

The researchers gathered information for the tool from a variety of sources, including historic models of physical exposure from the United Nations Environment Programme (UNEP), population estimates from LandScan, as well as household surveys and governance assessments from the World Bank’s World Development and Worldwide Governance Indicators.

This data reflects past and present vulnerability, but to understand which places in Africa would be most vulnerable to future climate change, Busby and his team relied on the regional climate model simulations designed by Edward Vizy and Kerry Cook, both members of the CCAPS team from the Jackson School of Geosciences.

Vizy and Cook ran three 20-year nested simulations of the African continent’s climate at the regional scales of 90 and 30 kilometers, using a derivation of the Weather Research and Forecasting Model of the National Center for Atmospheric Research. One was a control simulation representative of the years 1989-2008, and the others represented the climate as it may exist in 2041-2060 and 2081-2100.

“We’re adjusting the control simulation’s CO2 concentration, model boundary conditions, and sea surface temperatures to increased greenhouse gas forcing scenario conditions derived from atmosphere-ocean global climate models. We re-run the simulation to understand how the climate will operate under a different, warmer state at spatial resolutions needed for regional impact analyses,” Vizy said.
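
Conceptually this resembles the “delta” perturbation approach used in regional downscaling, and the toy sketch below illustrates only that idea; the array shapes, variable names, and numbers are assumptions, and the actual CCAPS configuration of the Weather Research and Forecasting Model is far more involved.

```python
import numpy as np

# The control run's sea surface temperatures (and, in practice, its lateral boundary
# fields) are shifted by changes projected in coarse global models, and CO2 is raised
# to the scenario value. Shapes and numbers here are invented for illustration.
def perturb_boundary_conditions(control_sst, gcm_delta_sst, scenario_co2_ppm):
    """Return scenario SSTs and the scenario CO2 concentration."""
    future_sst = control_sst + gcm_delta_sst   # add the GCM-projected warming pattern
    return future_sst, scenario_co2_ppm

control_sst = np.array([[26.0, 27.1, 28.0],    # toy 3x3 SST field, degrees C
                        [25.4, 26.8, 27.5],
                        [24.9, 26.0, 27.2]])
gcm_delta_sst = np.full_like(control_sst, 1.5) # uniform +1.5 C, purely illustrative

future_sst, future_co2 = perturb_boundary_conditions(control_sst, gcm_delta_sst,
                                                     scenario_co2_ppm=550.0)
print(future_sst)
print(f"scenario CO2: {future_co2} ppm")
```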

Each simulation took two months to complete on the Ranger supercomputer at the Texas Advanced Computing Center (TACC).

“We couldn’t run these simulations without the high-performance computing resources at TACC; it would just take too long. If it takes two months running with 200 processors, I can’t fathom doing it with one processor,” Vizy said.

Researchers input data from these vulnerability maps into an online mapping tool developed by the CCAPS program to integrate its various lines of climate, conflict and aid research. CCAPS’s current mapping tool is based on a prototype developed by the team to assess conflict patterns in Africa with the help of researchers at the TACC/ACES Visualization Laboratory (Vislab), according to Ashley Moran, program manager of CCAPS.

“The mapping tool is a key part of our effort to produce new research that could support policy making and the work of practitioners and governments in Africa,” Moran said. “We want to communicate this research in ways that are of maximum use to policymakers and researchers.”

The initial prototype of the mapping tool used the ArcGIS platform to project data onto maps. Working with its partner Development Gateway, CCAPS expanded the system to incorporate conflict, vulnerability, governance and aid research data.

After completing the first version of their model, Busby and his team carried out the process of ground truthing their maps by visiting local officials and experts in several African countries, such as Kenya and South Africa.

“The experience of talking with local experts was tremendously gratifying,” Busby said. “They gave us confidence that the things we’re doing in a computer lab setting in Austin do pick up on some of the ground-level expert opinions.”

Busby and his team complemented their maps with local perspectives on the kind of impact climate was already having, leading to new insights that could help perfect the model. For example, local experts felt the model did not address areas with chronic water scarcity, an issue the researchers then corrected upon returning home.

According to Busby, the vulnerability maps serve as focal points which can give way to further analysis about the issues they illustrate.

Some of the countries most vulnerable to climate change include Somalia, Sierra Leone, Guinea, Sudan and parts of the Democratic Republic of Congo. Knowing this allows local policymakers to develop security strategies for the future, including early warning systems against floods, investments in drought-resistant agriculture, and alternative livelihoods that might facilitate resource sharing and help prevent future conflicts. The next iteration of the online mapping tool to be released later this year will also incorporate the future projections of climate exposure from the models developed by Vizy and Cook.

The CCAPS team publishes their research in journals like Climate Dynamics and The International Studies Review, carries out regular consultations with the U.S. government and governments in Africa, and participates in conferences sponsored by concerned organizations, such as the United Nations and the United States Africa Command.

“What this project has showed us is that many of the real challenges of the 21st century aren’t always in traditional state-to-state interactions, but are transnational in nature and require new ways of dealing with,” Gavin said.

Teen Survival Expectations Predict Later Risk-Taking Behavior (Science Daily)

ScienceDaily (Aug. 1, 2012) — Some young people’s expectations that they will not live long, healthy lives may actually foreshadow such outcomes.

New research published August 1 in the open access journal PLOS ONE reports that, for American teens, the expectation of death before the age of 35 predicted increased risk behaviors including substance abuse and suicide attempts later in life and a doubling to tripling of mortality rates in young adulthood.

The researchers, led by Quynh Nguyen of Northeastern University in Boston, found that one in seven participants in grades 7 to 12 reported perceiving a 50-50 chance or less of surviving to age 35. Upon follow-up interviews over a decade later, the researchers found that low expectations of longevity at young ages predicted increased suicide attempts and suicidal thoughts as well as heavy drinking, smoking, and use of illicit substances later in life relative to their peers who were almost certain they would live to age 35.

“The association between early survival expectations and detrimental outcomes suggests that monitoring survival expectations may be useful for identifying at-risk youth,” the authors state.

The study compared data collected from 19,000 adolescents in 1994-1995 to follow-up data collected from the same respondents 13-14 years later. The cohort was part of the National Longitudinal Study of Adolescent Health (Add Health), conducted by the Carolina Population Center and funded by the National Institutes of Health and 23 other federal agencies and foundations.
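
The basic analytic idea, relating a baseline report of a 50-50 or worse chance of surviving to age 35 to a later binary outcome, can be sketched as a logistic regression; the sketch below uses simulated data and an assumed effect size, not Add Health records, and the real analysis adjusted for many covariates.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-in for the study's design: a baseline indicator of a 50-50 or
# worse perceived chance of surviving to 35, and a later binary risk behavior.
# The 1-in-7 prevalence comes from the article; the effect size is an assumption.
rng = np.random.default_rng(0)
n = 5000
low_expectation = rng.binomial(1, 1 / 7, size=n)
risk_behavior = rng.binomial(1, 0.10 + 0.08 * low_expectation)  # assumed effect

X = sm.add_constant(low_expectation)          # intercept + baseline predictor
fit = sm.Logit(risk_behavior, X).fit(disp=False)
print(f"odds ratio for low survival expectation: {np.exp(fit.params[1]):.2f}")
```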

Journal Reference:

Quynh C. Nguyen, Andres Villaveces, Stephen W. Marshall, Jon M. Hussey, Carolyn T. Halpern, Charles Poole. Adolescent Expectations of Early Death Predict Adult Risk Behaviors. PLoS ONE, 2012; 7 (8): e41905. DOI: 10.1371/journal.pone.0041905

Brain Imaging Can Predict How Intelligent You Are: ‘Global Brain Connectivity’ Explains 10 Percent of Variance in Individual Intelligence (Science Daily)

ScienceDaily (Aug. 1, 2012) — When it comes to intelligence, what factors distinguish the brains of exceptionally smart humans from those of average humans?

New research suggests as much as 10 percent of individual variances in human intelligence can be predicted based on the strength of neural connections between the lateral prefrontal cortex and other regions of the brain. (Credit: WUSTL Image / Michael Cole)

As science has long suspected, overall brain size matters somewhat, accounting for about 6.7 percent of individual variation in intelligence. More recent research has pinpointed the brain’s lateral prefrontal cortex, a region just behind the temple, as a critical hub for high-level mental processing, with activity levels there predicting another 5 percent of variation in individual intelligence.

Now, new research from Washington University in St. Louis suggests that another 10 percent of individual differences in intelligence can be explained by the strength of neural pathways connecting the left lateral prefrontal cortex to the rest of the brain.

Published in the Journal of Neuroscience, the findings establish “global brain connectivity” as a new approach for understanding human intelligence.

“Our research shows that connectivity with a particular part of the prefrontal cortex can predict how intelligent someone is,” suggests lead author Michael W. Cole, PhD, a postdoctoral research fellow in cognitive neuroscience at Washington University.

The study is the first to provide compelling evidence that neural connections between the lateral prefrontal cortex and the rest of the brain make a unique and powerful contribution to the cognitive processing underlying human intelligence, says Cole, whose research focuses on discovering the cognitive and neural mechanisms that make human behavior uniquely flexible and intelligent.

“This study suggests that part of what it means to be intelligent is having a lateral prefrontal cortex that does its job well; and part of what that means is that it can effectively communicate with the rest of the brain,” says study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences and of neuroscience and radiology in the School of Medicine. Braver is a co-director of the Cognitive Control and Psychopathology Lab at Washington University, in which the research was conducted.

One possible explanation of the findings, the research team suggests, is that the lateral prefrontal region is a “flexible hub” that uses its extensive brain-wide connectivity to monitor and influence other brain regions in a goal-directed manner.

“There is evidence that the lateral prefrontal cortex is the brain region that ‘remembers’ (maintains) the goals and instructions that help you keep doing what is needed when you’re working on a task,” Cole says. “So it makes sense that having this region communicating effectively with other regions (the ‘perceivers’ and ‘doers’ of the brain) would help you to accomplish tasks intelligently.”

While other regions of the brain make their own special contribution to cognitive processing, it is the lateral prefrontal cortex that helps coordinate these processes and maintain focus on the task at hand, in much the same way that the conductor of a symphony monitors and tweaks the real-time performance of an orchestra.

“We’re suggesting that the lateral prefrontal cortex functions like a feedback control system that is used often in engineering, that it helps implement cognitive control (which supports fluid intelligence), and that it doesn’t do this alone,” Cole says.

The findings are based on an analysis of functional magnetic resonance brain images captured as study participants rested passively and also when they were engaged in a series of mentally challenging tasks associated with fluid intelligence, such as indicating whether a currently displayed image was the same as one displayed three images ago.

Previous findings relating lateral prefrontal cortex activity to challenging task performance were supported. Connectivity was then assessed while participants rested, and their performance on additional tests of fluid intelligence and cognitive control collected outside the brain scanner was associated with the estimated connectivity.

Results indicate that levels of global brain connectivity with a part of the left lateral prefrontal cortex serve as a strong predictor of both fluid intelligence and cognitive control abilities.
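
A sketch of how a “global brain connectivity” value might be computed and related to intelligence scores follows; the data are simulated and the code illustrates the general idea only, not the study’s actual analysis pipeline.

```python
import numpy as np

# For each subject, correlate a seed region's resting-state time series with every
# other region's series and average those correlations, then relate the per-subject
# GBC values to intelligence scores. All numbers are synthetic.
rng = np.random.default_rng(1)
n_subjects, n_regions, n_timepoints = 50, 100, 200

def global_connectivity(timeseries, seed=0):
    """Mean correlation between the seed region and all other regions."""
    corr = np.corrcoef(timeseries)          # regions x regions correlation matrix
    return np.delete(corr[seed], seed).mean()

gbc = np.array([
    global_connectivity(rng.standard_normal((n_regions, n_timepoints)))
    for _ in range(n_subjects)
])
intelligence = 100 + 15 * rng.standard_normal(n_subjects)   # synthetic scores

r = np.corrcoef(gbc, intelligence)[0, 1]
print(f"simulated GBC-intelligence correlation: r = {r:.2f}, "
      f"variance explained = {r**2:.1%}")
```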

Although much remains to be learned about how these neural connections contribute to fluid intelligence, new models of brain function suggested by this research could have important implications for the future understanding — and perhaps augmentation — of human intelligence.

The findings also may offer new avenues for understanding how breakdowns in global brain connectivity contribute to the profound cognitive control deficits seen in schizophrenia and other mental illnesses, Cole suggests.

Other co-authors include Tal Yarkoni, PhD, a postdoctoral fellow in the Department of Psychology and Neuroscience at the University of Colorado at Boulder; Grega Repovs, PhD, professor of psychology at the University of Ljubljana, Slovenia; and Alan Anticevic, an associate research scientist in psychiatry at Yale University School of Medicine.

Funding from the National Institute of Mental Health supported the study (National Institutes of Health grants MH66088, NR012081, MH66078, MH66078-06A1W1, and 1K99MH096801).

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.
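
The kind of curve fitting Muller describes can be sketched in a few lines: regress a temperature series on candidate predictors and compare the quality of fit. The sketch below uses synthetic stand-in series, not Berkeley Earth data, and the logarithmic CO2 predictor reflects the standard form of greenhouse forcing rather than anything specific to their papers.

```python
import numpy as np

# Synthetic stand-ins for a land-temperature anomaly series and a CO2 record.
rng = np.random.default_rng(2)
years = np.arange(1850, 2011)
co2 = 285.0 + 0.004 * (years - 1850) ** 2                 # ppm, rising to roughly 390
temperature = 2.0 * np.log(co2 / 285.0) + 0.1 * rng.standard_normal(years.size)

def r_squared(y, y_fit):
    return 1 - np.sum((y - y_fit) ** 2) / np.sum((y - y.mean()) ** 2)

# Compare candidate predictors by ordinary least squares fit quality.
for name, x in [("log(CO2)", np.log(co2)), ("linear trend", years.astype(float))]:
    slope, intercept = np.polyfit(x, temperature, 1)
    print(f"{name:>12s}: R^2 = {r_squared(temperature, slope * x + intercept):.3f}")
```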

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.

The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis is now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Climate Change and the Next U.S. Revolution (ZNet)

Thursday, July 26, 2012

The U.S. heat wave is slowly shaking the foundations of American politics. It may take years for the deep rumble to evolve into an above ground, institution-shattering earthquake, but U.S. society has changed for good.

The heat wave has helped convince tens of millions of Americans that climate change is real, overpowering the fake science and right-wing media, funded by corporate cash, that have worked to convince Americans otherwise.

Republicans and Democrats alike also erect roadblocks to understanding climate change. The politicians’ complete lack of action on the issue strengthened the “climate change is fake” movement, since Americans presumed that any sane government would be actively trying to address an issue with the potential to destroy civilization.

But working people have finally made up their mind. A recent poll showed that 70 percent of Americans now believe that climate change is real, up from 52 percent in 2010. And a growing number of people are recognizing that the warming of the planet is caused by human activity.

Business Week explains: “A record heat wave, drought and catastrophic wildfires are accomplishing what climate scientists could not: convincing a wide swath of Americans that global temperatures are rising.”

This means that working class families throughout the Midwest and southern states simply don’t believe what their media and politicians are telling them.

It also implies that these millions of Americans are being further politicized in a deeper sense.

Believing that climate change exists implies that you are somewhat aware about the massive consequences to humanity if the global economy doesn’t drastically change, and fast.

This awareness has revolutionary implications. As millions of Americans watch the environment being destroyed, for their grandchildren or for themselves, while politicians do absolutely nothing in response or make tiny token gestures, a growing number of Americans will demand political alternatives and fight to see them created. The American political system as it exists today cannot cope with this inevitability.

The New York Times explains why: “…the American political system is not ready to agree to a [climate] treaty that would force the United States, over time, to accept profound changes in its energy [coal, oil], transport [trucking and airline industry] and manufacturing [corporate] sectors.”

In short, the U.S. government will not force corporations to make less profit by behaving more eco-friendly. This is the essence of the problem.

In order for humanity to survive climate change, the economy must be radically transformed; massive investments must be made in renewable energy, public transportation, and recycling, while dirty energy sources must be quickly swept into the dustbin of history.

But the economy is currently owned by giant, privately run corporations, that will continue destroying the earth if it earns them huge profits, and they make massive “contributions” to political parties to ensure this remains so. It’s becoming increasingly obvious that government inaction on climate change is directly linked to the “special interests” of corporations that dominate these governments.

This fact of U.S. politics is present in every other capitalist country as well, which means that international agreements on reducing greenhouse gasses will remain impossible: as each country’s corporations vie for market domination, reducing pollution simply puts them at a competitive disadvantage.

This dynamic has already caused massive delays in the UN’s already inadequate efforts at addressing climate change. The Kyoto climate agreement was the by-product of years of cooperation and planning between many nations that included legally binding agreements to reduce greenhouse gasses. The Bush and Obama administrations helped destroy these efforts.

For example, instead of building upon the foundation of the Kyoto Protocol, the Obama administration demanded a whole new structure, something that would take years to achieve. The Kyoto framework (itself insufficient) was abandoned because it included legally binding agreements, and was based on multilateral, agreed-upon reductions of greenhouse gasses.

In an article by the Guardian entitled “US Planning to Weaken Copenhagen Climate Deal,” the Obama administration’s UN position is exposed, as it dismisses the Kyoto Protocol by proposing that “…each country set its own rules and to decide unilaterally how to meet its target.”

Obama’s proposal came straight from the mouth of U.S. corporations, who wanted to ensure that there was zero accountability, zero oversight, zero climate progress, and therefore no dent to their profits. Instead of using its massive international leverage for climate justice, the U.S. has used it to promote divisiveness and inaction, to the potential detriment of billions of people globally.

The stakes are too high to hold out any hope that governments will act boldly. The Business Week article below explains the profound changes happening to the climate:

“The average temperature for the U.S. during June was 71.2 degrees Fahrenheit (21.7 Celsius), which is 2 degrees higher than the average for the 20th century, according to the National Oceanic and Atmospheric Administration. The June temperatures made the preceding 12 months the warmest since record-keeping began in 1895, the government agency said.”

Activists who are radicalized by this global problem face a crisis of what to do about it. It is difficult to put forth a positive climate change demand, since the problem is global.  Demanding that governments “act boldly” to address climate change hasn’t worked, and lesser demands seem inadequate.

The environmental rights movement continues to go through a variety of phases: individual and small group eco-“terrorism,” causing property damage to environmentally damaging companies; corporate campaigns that target especially bad polluters with high-profile direct action; and massive education programs that have been highly successful, but fall short when it comes to winning change.

Ultimately, climate activists must come face to face with political and corporate power. Corporate-owned governments are the ones with the power to adequately address the climate change issue, and they will not be swayed by good science, common sense, basic decency, or even a torched planet.

Those in power respond only to power, and the only force capable of displacing corporate power is people uniting and acting collectively, as they did in Egypt and Tunisia and as is still developing throughout Europe.

Climate groups cannot view their issue as separate from the struggles of other groups organizing against corporate power. The social movements that have emerged to battle austerity measures are natural allies, as are anti-war and labor activists. The climate solution will inevitably require revolutionary measures, which first requires building alliances and putting forward demands that unite Labor, working people in general, community groups, and student groups in collective action.

One possible immediate demand is for environmental activists to unite with Labor groups around a federal jobs program, paid for by taxing the rich, that makes massive investments in climate-related jobs such as solar panel production, transportation, recycling-center construction, and home retrofitting.

Another demand could be to insist that the government convene the most knowledgeable clean energy scientists. These scientists should be given all the resources they need to collectively develop clean energy sources that offer a realistic alternative to the polluting and toxic sources in use today.

However, any immediate demand of this type will meet giant corporate resistance from both political parties; fighting for a unifying demand will thus strengthen the movement. For this reason it is important to link climate solutions to the creation of jobs, the number one concern of most Americans. This unity will in turn lead allies toward a deeper understanding of the problem, and deeper solutions will emerge that challenge an economic structure deaf to the needs of humans and the climate, one that sacrifices everything to the private profit of a few.

Shamus Cooke is a social service worker, trade unionist, and writer for Workers Action (www.workerscompass.org). He can be reached at shamuscooke@gmail.com

http://www.businessweek.com/news/2012-07-18/record-heat-wave-pushes-u-dot-s-dot-belief-in-climate-change-to-70-percent

http://www.nytimes.com/2009/12/13/weekinreview/13broder.html

http://www.guardian.co.uk/environment/2009/sep/15/europe-us-copenhagen

Computers Can Predict Effects of HIV Policies, Study Suggests (Science Daily)

ScienceDaily (July 27, 2012) — Policymakers in the fight against HIV/AIDS may have to wait years, even decades, to know whether strategic choices among possible interventions are effective. How can they make informed choices in an age of limited funding? A reliable, well-calibrated, predictive computer simulation would be a great help.

A visualization generated by an agent-based model of New York City’s HIV epidemic shows the risky interactions of unprotected sex or needle sharing among injection drug users (red), non-injection drug users (blue) and non-users (green). (Credit: Brandon Marshall/Brown University)

Policymakers struggling to stop the spread of HIV grapple with “what if” questions on the scale of millions of people and decades of time. They need a way to predict the impact of many potential interventions, alone or in combination. In two papers to be presented at the 2012 International AIDS Society Conference in Washington, D.C., Brandon Marshall, assistant professor of epidemiology at Brown University, will unveil a computer program calibrated to model accurately the spread of HIV in New York City over a decade and to make specific predictions about the future of the epidemic under various intervention scenarios.

“It reflects what’s seen in the real world,” said Marshall. “What we’re trying to do is identify the ideal combination of interventions to reduce HIV most dramatically in injection drug users.”

In an analysis that he’ll present on July 27, Marshall projects that with no change in New York City’s current programs, the infection rate among injection drug users will be 2.1 per 1,000 in 2040. Expanding HIV testing would drop the rate only 12 percent to 1.9 per 1,000; increasing drug treatment would reduce the rate 26 percent to 1.6 per 1,000; providing earlier delivery of antiretroviral therapy and better adherence would drop the rate 45 percent to 1.2 per 1,000; and expanding needle exchange programs would reduce the rate 34 percent to 1.4 per 1,000. Most importantly, doing all four of those things would cut the rate by more than 60 percent, to 0.8 per 1,000.
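
As a quick sanity check on those figures, the short Python sketch below simply applies each reported percentage reduction to the 2.1-per-1,000 baseline; since the article’s numbers are rounded, the computed rates match the reported ones only approximately.

```python
# A rough arithmetic check of the projections above, applying each reported
# percentage reduction to the 2.1-per-1,000 baseline rate. The article's
# figures are rounded, so these results match them only approximately.
baseline = 2.1  # projected infections per 1,000 injection drug users in 2040

reductions = {
    "expanded HIV testing": 0.12,
    "increased drug treatment": 0.26,
    "earlier ART and better adherence": 0.45,
    "expanded needle exchange": 0.34,
    "all four combined": 0.60,  # "more than 60 percent"
}

for intervention, cut in reductions.items():
    print(f"{intervention}: ~{baseline * (1 - cut):.1f} per 1,000")
```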

Virtual reality, real choices

The model is unique in that it creates a virtual reality of 150,000 “agents,” a programming term for simulated individuals who, in the case of this model, engage in drug use and sexual activity like real people.

Like characters in an all-too-serious video game, the agents behave in a world governed by biological rules, such as how often the virus can be transmitted through encounters such as unprotected gay sex or needle sharing.

With each run of the model, agents accumulate a detailed life history. For example, in one run, agent 89,425, who is male and has sex with men, could end up injecting drugs. He participates in needle exchanges, but according to the built-in probabilities, in year three he shares needles multiple times with another injection drug user with whom he is also having unprotected sex. In the last of those encounters, agent 89,425 becomes infected with HIV. In year four he starts participating in drug treatment and in year five he gets tested for HIV, starts antiretroviral treatment, and reduces the frequency with which he has unprotected sex. Because he always takes his HIV medications, he never transmits the virus further.
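
To make the mechanics concrete, here is a minimal Python sketch of the general kind of agent-based transmission model described above. The agent count, contact rate, initial prevalence, and the transmission and treatment probabilities (N_AGENTS, CONTACTS_PER_YEAR, P_TRANSMIT, P_ON_ART, ART_FACTOR) are illustrative assumptions, not the study’s actual parameters, and the real model tracks far more behaviors and risk groups.

```python
import random

# A minimal, illustrative agent-based transmission model. All names and
# values below are assumptions for this sketch, not the study's inputs.

random.seed(0)

N_AGENTS = 1_000           # the actual model simulates 150,000 agents
YEARS = 10
CONTACTS_PER_YEAR = 20     # risky contacts (unprotected sex or needle sharing)
P_TRANSMIT = 0.01          # assumed per-contact transmission probability
P_ON_ART = 0.3             # assumed chance an infected agent starts effective treatment
ART_FACTOR = 0.1           # assumed reduction in infectiousness while on treatment

class Agent:
    def __init__(self):
        self.infected = random.random() < 0.05            # assumed 5% initial prevalence
        self.on_art = self.infected and random.random() < P_ON_ART

agents = [Agent() for _ in range(N_AGENTS)]

for year in range(1, YEARS + 1):
    for agent in agents:
        if not agent.infected:
            continue
        # Infectiousness is reduced if the agent is on treatment.
        p = P_TRANSMIT * (ART_FACTOR if agent.on_art else 1.0)
        for _ in range(CONTACTS_PER_YEAR):
            partner = random.choice(agents)
            if not partner.infected and random.random() < p:
                partner.infected = True
                partner.on_art = random.random() < P_ON_ART
    prevalence = sum(a.infected for a in agents) / N_AGENTS
    print(f"year {year}: prevalence {prevalence:.3f}")
```

In a model like this, intervention scenarios amount to changing the relevant probabilities (for example, raising P_ON_ART or lowering the number of shared-needle contacts) and rerunning the simulation.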

That level of individual detail allows for a detailed examination of transmission networks and how interventions affect them.

“With this model you can really look at the microconnections between people,” said Marshall, who began working on the model as a postdoctoral fellow at Columbia University and has continued to develop it since coming to Brown in January. “That’s something that we’re really excited about.”

To calibrate the model, Marshall and his colleagues gathered the best New York City data they could find on how many people use drugs, what percentage of people were gay or lesbian, the probabilities of engaging in unprotected sex and needle sharing, viral transmission, access to treatment, treatment effectiveness, participation in drug treatment, progression from HIV infection to AIDS, and many more behavioral, social and medical factors. They then calibrated the model repeatedly until it could faithfully reproduce the infection rates among injection drug users known to have occurred in New York between 1992 and 2002.

And they don’t just run the simulation once. They run it thousands of times on a supercomputer at Brown to be sure the results they see are reliable.
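
That workflow, many stochastic runs per candidate parameter setting, averaged and compared against historical data, can be sketched as follows. Both the simulate function and the target incidence series here are toy stand-ins, not the study’s model or the actual New York City data.

```python
import random

# Calibration sketch: run a stochastic model many times per candidate
# parameter value, average the runs, and keep the value whose average
# best matches a historical incidence series. `simulate` and HISTORICAL
# are toy stand-ins, not the study's model or NYC data.

HISTORICAL = [3.0, 2.8, 2.6, 2.4, 2.2, 2.0, 1.9, 1.8, 1.7, 1.6]  # per 1,000, illustrative

def simulate(decline_rate, seed):
    """Toy stochastic model: incidence decays each year, with noise."""
    rng = random.Random(seed)
    incidence, series = 3.0, []
    for _ in HISTORICAL:
        incidence *= (1 - decline_rate) + rng.gauss(0, 0.01)
        series.append(incidence)
    return series

def average_series(decline_rate, n_runs=500):
    runs = [simulate(decline_rate, seed) for seed in range(n_runs)]
    return [sum(year_values) / n_runs for year_values in zip(*runs)]

def squared_error(series):
    return sum((sim - obs) ** 2 for sim, obs in zip(series, HISTORICAL))

# Grid search over candidate parameter values.
candidates = [0.02, 0.04, 0.06, 0.08, 0.10]
best_error, best_rate = min((squared_error(average_series(r)), r) for r in candidates)
print(f"best-fitting decline rate: {best_rate} (squared error {best_error:.3f})")
```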

Future applications

At Brown, Marshall is continuing to work on other aspects of the model, including an analysis of the cost effectiveness of each intervention and their combinations. Cost is, after all, another fact of life that policymakers and public health officials must weigh.

And then there’s the frustrating finding that even with all four strengthened interventions underway, the projected infection rate didn’t fall by much more than half.

“I actually expected something larger,” Marshall said. “That speaks to how hard we have to work to make sure that drug users can access and benefit from proven interventions to reduce the spread of HIV.”

Marshall’s collaborators on the model include Magdalena Paczkowski, Lars Seemann, Barbara Tempalski, Enrique Pouget, Sandro Galea, and Samuel Friedman.

The National Institutes of Health and the Lifespan/Tufts/Brown Center for AIDS Research provide financial support for the model’s continued development.

Climate Change Could Open Trade Opportunities for Some Vulnerable Nations (Science Daily)

ScienceDaily (July 26, 2012) — Tanzania is one developing country that could actually benefit from climate change by increasing exports of corn to the U.S. and other nations, according to a study by researchers at Stanford University, the World Bank and Purdue University.

The study, published in the Review of Development Economics, shows the African country better known for safaris and Mt. Kilimanjaro has the potential to substantially increase its maize exports and take advantage of higher commodity prices with a variety of trading partners due to predicted dry and hot weather that could affect those countries’ usual sources for the crop. In years that major consumer countries such as the U.S., China and India are forecast to experience severe dry conditions, Tanzania’s weather will likely be comparatively wet. Similarly, in the relatively few years this century that it is expected to have severe dry weather, Tanzania could import corn from trading partners experiencing better growing conditions.

“This study highlights how government policies can influence the impact that we experience from the climate system,” said study co-author Noah Diffenbaugh, an assistant professor of environmental Earth system science at Stanford’s School of Earth Sciences and a center fellow at the Stanford Woods Institute for the Environment. “Tanzania is a particularly interesting case, as it has the potential to benefit from climate change if climate model predictions of decreasing drought in East Africa prove to be correct, and if trade policies are constructed to take advantage of those new opportunities.”

Tightening restrictions on crop exports during times of climate instability may seem like a logical way to ensure domestic food availability and price stability. In fact, the study warns, trade restrictions such as those Tanzania has instituted several times in recent years prevent the country from buffering its poor citizens in bad climate years and from taking advantage of economic opportunities in good ones.

The study, the most long-range and detailed of its kind to date, uses economic, climatic and agricultural data and computational models to forecast the occurrence of severe dry years during the next nine decades in Tanzania and its key trading partners. The authors began by analyzing historical years in which Tanzania experienced grain surpluses or deficits. They found that a closed trade policy worsened poverty in both kinds of years, by limiting the ability to offset shortfalls with imports during deficit years and the ability to profit from exports during surplus years.

The authors then attempted to predict how often Tanzania and key trading partners will experience severely dry years in response to continued global warming. Among the predictions: during an average of 96 percent of the years that the U.S. and China are predicted to have extremely dry conditions, Tanzania will not experience similarly dry weather. For India, that percentage increases to 97 percent. Similarly, the study’s climate models suggest that Tanzania is likely to have adequate growing season moisture in most of the years that its key African trading partners experience severe dry weather.
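
The co-occurrence statistic described above can be computed directly from yearly dry-year indicators, as in the Python sketch below; the dry_year_flags probabilities and the resulting indicators are randomly generated placeholders rather than output from the study’s climate models.

```python
import random

# Co-occurrence sketch: for each trading partner, the share of its severely
# dry years in which Tanzania is not also dry. The dry-year indicators are
# random placeholders, not output from the study's climate models.

random.seed(1)
YEARS = range(2010, 2100)

def dry_year_flags(p_dry):
    """Flag each year as severely dry with probability p_dry (assumed)."""
    return {year: random.random() < p_dry for year in YEARS}

tanzania = dry_year_flags(0.15)
partners = {
    "U.S.": dry_year_flags(0.25),
    "China": dry_year_flags(0.25),
    "India": dry_year_flags(0.20),
}

for name, flags in partners.items():
    partner_dry_years = [y for y in YEARS if flags[y]]
    not_coincident = sum(1 for y in partner_dry_years if not tanzania[y])
    share = 100 * not_coincident / max(len(partner_dry_years), 1)
    print(f"{name}: Tanzania not dry in {share:.0f}% of its severely dry years")
```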

Among Tanzania’s trading partners, the U.S., China, Canada and Russia are most likely to consistently experience adequate growing conditions in years when Tanzania does not. When compared with all of its key trading partners, Tanzania’s dry years during the 21st century will often coincide with non-dry years in the other countries. Having a diverse mix of trading partners could help hedge against a coincidence of severe dry weather within and outside of Africa, the study’s results suggest.

The findings are relevant to grain-growing countries around the world. Those countries stand to profit from exports in years when trading partners are enduring severe dry and/or hot weather. Likewise, they can buffer themselves against bad growing weather at home by importing from grain-rich regions less affected by such weather in that particular year.

“This study highlights the importance of trade in either buffering or exacerbating the effects of climate stresses on the poor,” says Diffenbaugh. “We find that these effects are already taking place in the current climate, and that they could become even more important in the future as the co-occurrence of good and bad years between different regions changes in response to global warming.”