Tag archive: Uncertainty

Climate models that predict more droughts win further scientific support (Washington Post)

The drought of 2012: It has been more than a half-century since a drought this extensive hit the United States, NOAA reported July 16. The effects are growing and may cost the U.S. economy $50 billion.

By Hristio Boytchev, Published: August 13

The United States will suffer a series of severe droughts in the next two decades, according to a new study published in the journal Nature Climate Change. Moreover, global warming will play an increasingly important role in their frequency and severity, claims Aiguo Dai, the study’s author.

His findings bolster conclusions from climate models used by researchers around the globe that have predicted severe and widespread droughts in coming decades over many land areas. Those models had been questioned because they did not fully reflect actual drought patterns when they were applied to conditions in the past. However, using a statistical method with data about sea surface temperatures, Dai, a climate researcher at the federally funded National Center for Atmospheric Research, found that the model accurately portrayed historic climate events.

“We can now be more confident that the models are correct,” Dai said, “but unfortunately, their predictions are dire.”

In the United States, the main culprit currently is a cold cycle in the surface temperature of the eastern Pacific Ocean. It decreases precipitation, especially over the western part of the country. “We had a similar situation in the Dust Bowl era of the 1930s,” said Dai, who works at the research center’s headquarters in Boulder, Colo.

While current models cannot predict the severity of a drought in a given year, they can assess its probability. “Considering the current trend, I was not surprised by the 2012 drought,” Dai said.

The Pacific cycle is expected to last for the next one or two decades, bringing more aridity. On top of that comes climate change. “Global warming has a subtle effect on drought at the moment,” Dai said, “but by the end of the cold cycle, global warming might take over and continue to cause dryness.”

While the variations in sea temperatures primarily influence precipitation, global warming is expected to bring droughts by increasing evaporation over land. Additionally, Dai predicts more dryness in South America, Southern Europe and Africa.

“The similarity between the observed droughts and the projections from climate models here is striking,” said Peter Cox, a professor of climate system dynamics at Britain’s University of Exeter, who was not involved in Dai’s research. He said he also agrees that the increasing drought suggested by the latest models is consistent with man-made climate change.

Computer program mimics human evolution (Fapesp)

Software developed at USP in São Carlos creates and selects programs that generate Decision Trees, tools capable of making predictions. The research won awards in the United States at the largest event in evolutionary computation (Wikimedia)

16/08/2012

By Karina Toledo

Agência FAPESP – Decision Trees are computational tools that give machines the ability to make predictions based on the analysis of historical data. The technique can, for example, support medical diagnosis or the risk analysis of financial investments.

But to obtain the best prediction, you need the best Decision Tree-generating program. To reach that goal, researchers at the Institute of Mathematical and Computer Sciences (ICMC) of the University of São Paulo (USP), in São Carlos, drew inspiration from Charles Darwin’s theory of evolution.

“We developed an evolutionary algorithm, that is, one that mimics the process of human evolution to generate solutions,” said Rodrigo Coelho Barros, a doctoral candidate at the ICMC’s Bio-inspired Computing Laboratory (BioCom) and a FAPESP fellow.

Evolutionary computation, Barros explained, is one of several bio-inspired techniques, which look to nature for solutions to computational problems. “It is remarkable how nature finds solutions to extremely complicated problems. There is no doubt that we need to learn from it,” Barros said.

According to Barros, the software developed during his doctorate can automatically create Decision Tree-generating programs. To do so, it performs random crossovers between the code of existing programs, producing “offspring.”

“These ‘offspring’ may occasionally undergo mutations and evolve. Over time, the evolved Decision Tree-generating programs are expected to get better and better, and our algorithm selects the best of them all,” Barros said.

But while natural selection in the human species takes hundreds or even thousands of years, in computing the process lasts only a few hours, depending on the problem to be solved. “We set one hundred generations as the limit of the evolutionary process,” Barros said.
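The loop described above (random crossover between candidate programs, occasional mutation, and selection of the best, capped at one hundred generations) can be sketched in a few lines of Python. This is a generic illustration with a made-up fitness function, not the BioCom group’s actual software, which evolves full decision-tree induction algorithms:

```python
import random

random.seed(42)

TARGET = [1] * 20     # stand-in for an "ideal" set of design choices
POP_SIZE = 30
GENERATIONS = 100     # the limit quoted in the article
MUTATION_RATE = 0.01

def fitness(genome):
    # Toy fitness: how many design choices match the target. A real
    # hyper-heuristic would build a decision-tree learner from the
    # genome and score its predictive accuracy on data.
    return sum(g == t for g, t in zip(genome, TARGET))

def crossover(a, b):
    # Random one-point crossover between two parent "programs".
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome):
    # Each design choice flips with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(len(TARGET))]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:POP_SIZE // 2]   # selection keeps the best half
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(POP_SIZE - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

After a hundred generations the surviving genome is (almost always) far closer to the target than a random one, which is all the selection pressure is meant to show.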

Artificial intelligence

In computer science, heuristics denotes a system’s ability to innovate and to develop techniques for reaching a given goal.

The software Barros developed belongs to the field of hyper-heuristics, a recent topic in evolutionary computation whose aim is the automatic generation of heuristics tailored to a particular application or set of applications.

“It is a preliminary step toward the great goal of artificial intelligence: creating machines capable of developing solutions to problems without being explicitly programmed to do so,” Barros explained.

The work gave rise to the article “A Hyper-Heuristic Evolutionary Algorithm for Automatically Designing Decision-Tree Algorithms,” which won awards in three categories at the Genetic and Evolutionary Computation Conference (GECCO), the world’s largest event in evolutionary computation, held in July in Philadelphia, United States.

Besides Barros, the article’s authors are professors André Carlos Ponce de Leon Ferreira de Carvalho, the research supervisor at the ICMC; Márcio Porto Basgalupp, of the Federal University of São Paulo (Unifesp); and Alex Freitas, of the University of Kent, in the United Kingdom, who took on co-supervision.

The authors were invited to submit the article to the Evolutionary Computation Journal, published by the Massachusetts Institute of Technology (MIT). “The work will still go through review, but since it was submitted by invitation, it has a good chance of being accepted,” Barros said.

The research, which is expected to be completed only in 2013, also gave rise to an article published by invitation in the Journal of the Brazilian Computer Society, after being chosen as the best paper at the 2011 Encontro Nacional de Inteligência Artificial.

Another article, presented at the 11th International Conference on Intelligent Systems Design and Applications, held in Spain in 2011, earned an invitation for publication in the journal Neurocomputing.

Search Technology That Can Gauge Opinion and Predict the Future (Science Daily)

ScienceDaily (Aug. 16, 2012) — Inspired by a system for categorising books proposed by an Indian librarian more than 50 years ago, a team of EU-funded researchers have developed a new kind of internet search that takes into account factors such as opinion, bias, context, time and location. The new technology, which could soon be in use commercially, can display trends in public opinion about a topic, company or person over time — and it can even be used to predict the future.

‘Do a search for the word “climate” on Google or another search engine and what you will get back is basically a list of results featuring that word: there’s no categorisation, no specific order, no context. Current search engines do not take into account the dimensions of diversity: factors such as when the information was published, if there is a bias toward one opinion or another inherent in the content and structure, who published it and when,’ explains Fausto Giunchiglia, a professor of computer science at the University of Trento in Italy.

But can search technology be made to identify and embrace diversity? Can a search engine tell you, for example, how public opinion about climate change has changed over the last decade? Or how hot the weather will be a century from now, by aggregating current and past estimates from different sources?

It seems that it can, thanks to a pioneering combination of modern science and a decades-old classification method, brought together by European researchers in the LivingKnowledge project. Supported by EUR 4.8 million in funding from the European Commission, the LivingKnowledge team, coordinated by Prof. Giunchiglia, adopted a multidisciplinary approach to developing new search technology, drawing on fields as diverse as computer science, social science, semiotics and library science.

Indeed, the so-called father of library science, Sirkali Ramamrita Ranganathan, an Indian librarian, served as a source of inspiration for the researchers. In the 1920s and 1930s, Ranganathan developed the first major analytico-synthetic, or faceted, classification system. Using this approach, objects — books, in the case of Ranganathan; web and database content, in the case of the LivingKnowledge team — are assigned multiple characteristics and attributes (facets), enabling the classification to be ordered in multiple ways, rather than in a single, predetermined, taxonomic order. Using the system, an article about the effects on agriculture of climate change written in Norway in 1990 might be classified as ‘Geography; Climate; Climate change; Agriculture; Research; Norway; 1990.’
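Faceted classification of this kind is easy to sketch: each document carries several independent attributes, and any combination of them can drive a query or an ordering. A minimal Python illustration, with hypothetical documents and facet names rather than the project’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    facets: dict = field(default_factory=dict)

# The article's example: a 1990 Norwegian piece on climate change
# and agriculture, described by several independent facets.
corpus = [
    Document("Effects of climate change on agriculture",
             facets={"discipline": "Geography", "topic": "Climate change",
                     "application": "Agriculture", "place": "Norway",
                     "year": 1990}),
    Document("Ocean warming trends",
             facets={"discipline": "Oceanography", "topic": "Climate change",
                     "place": "Global", "year": 2005}),
    Document("Norwegian fjord geology",
             facets={"discipline": "Geology", "topic": "Geomorphology",
                     "place": "Norway", "year": 1988}),
]

def search(corpus, **required):
    # Unlike a single taxonomy, any combination of facets can drive
    # the query, and results can be re-ordered along any facet.
    return [d for d in corpus
            if all(d.facets.get(k) == v for k, v in required.items())]

climate_docs = search(corpus, topic="Climate change")
norway_docs = sorted(search(corpus, place="Norway"),
                     key=lambda d: d.facets["year"])
```

The same corpus thus answers a topical query and a place-plus-time ordering without any predetermined shelving order, which is the point of Ranganathan’s scheme.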

In order to understand the classification system better and implement it in search engine technology, the LivingKnowledge researchers turned to the Indian Statistical Institute, a project partner, which uses faceted classification on a daily basis.

‘Using their knowledge we were able to turn Ranganathan’s pseudo-algorithm into a computer algorithm and the computer scientists were able to use it to mine data from the web, extract its meaning and context, assign facets to it, and use these to structure the information based on the dimensions of diversity,’ Prof. Giunchiglia says.

Researchers at the University of Pavia in Italy, another partner, drew on their expertise in extracting meaning from web content — not just from text and multimedia content, but also from the way the information is structured and laid out — in order to infer bias and opinions, adding another facet to the data.

‘We are able to identify the bias of authors on a certain subject and whether their opinions are positive or negative,’ the LivingKnowledge coordinator says. ‘Facts are facts, but any information about an event, or on any subject, is often surrounded by opinions and bias.’

From libraries of the 1930s to space travel in 2034…

The technology was implemented in a testbed, now available as open source software, and used for trials based around two intriguing application scenarios.

Working with Austrian social research institute SORA, the team used the LivingKnowledge system to identify social trends and monitor public opinion in both quantitative and qualitative terms. Used for media content analysis, the system could help a company understand the impact of a new advertising campaign, showing how it has affected brand recognition over time and which social groups have been most receptive. Alternatively, a government might use the system to gauge public opinion about a new policy, or a politician could use it to respond in the most publicly acceptable way to a rival candidate’s claims.

With Barcelona Media, a non-profit research foundation supported by Yahoo!, and with the Netherlands-based Internet Memory Foundation, the LivingKnowledge team looked not only at current and past trends, but extrapolated them and drew on forecasts extracted from existing data to try to predict the future. Their Future Predictor application is able to make searches based on questions such as ‘What will oil prices be in 2050?’ or ‘How much will global temperatures rise over the next 100 years?’ and find relevant information and forecasts from today’s web. For example, a search for the year 2034 turns up ‘space travel’ as the most relevant topic indexed in today’s news.
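The idea behind the Future Predictor (indexing statements by the future dates they mention, rather than by when they were published) can be caricatured in a few lines. The documents and the regular expression here are invented for illustration; the real system combines this kind of temporal extraction with the project’s faceted search:

```python
import re
from collections import defaultdict

# Tiny corpus of hypothetical sentences from "today's web".
documents = [
    "Analysts expect oil prices to double by 2050.",
    "Space travel for tourists could be routine by 2034.",
    "Global temperatures may rise 2 degrees over the next 100 years.",
    "The 1997 El Nino was among the strongest on record.",
]

TODAY = 2012
future_index = defaultdict(list)

for doc in documents:
    # Match four-digit years and keep only forward-looking mentions.
    for match in re.finditer(r"\b(19|20|21)\d{2}\b", doc):
        year = int(match.group())
        if year > TODAY:
            future_index[year].append(doc)

hits_2034 = future_index[2034]
```

A search for the year 2034 then returns the space-travel statement, mirroring the example in the article.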

‘More immediately, this application scenario provides functionality for detecting trends even before these trends become apparent in daily events — based on integrated search and navigation capabilities for finding diverse, multi-dimensional information depending on content, bias and time,’ Prof. Giunchiglia explains.

Several of the project partners have plans to implement the technology commercially, and the project coordinator intends to set up a non-profit foundation to build on the LivingKnowledge results at a time when demand for this sort of technology is only likely to increase.

As Prof. Giunchiglia points out, Google fundamentally changed the world by providing everyone with access to much of the world’s information, but it did it for people: currently only humans can understand the meaning of all that data, so much so that information overload is a common problem. As we move into a ‘big data’ age in which information about everything and anything is available at the touch of a button, the meaning of that information needs to be understandable not just by humans but also by machines, so quantity must come combined with quality. The LivingKnowledge approach addresses that problem.

‘When we started the project, no one was talking about big data. Now everyone is and there is increasing interest in this sort of technology,’ Prof. Giunchiglia says. ‘The future will be all about big data — we can’t say whether it will be good or bad, but it will certainly be different.’

Global Warming Causes More Extreme Shifts of the Southern Hemisphere’s Largest Rain Band, Study Suggests (Science Daily)

ScienceDaily (Aug. 16, 2012) — The changes will result from the South Pacific rain band’s response to greenhouse warming. The South Pacific rain band is the largest and most persistent in the Southern Hemisphere, spanning the Pacific from south of the Equator south-eastward to French Polynesia.

Infrared satellite image obtained with the Geostationary Meteorological Satellite-5. (Credit: NOAA)

Occasionally, the rain band moves northwards towards the Equator by 1000 kilometres, inducing extreme climate events.

The international study, led by CSIRO oceanographer Dr Wenju Cai, focuses on how the frequency of such movement may change in the future. The study finds the frequency will almost double in the next 100 years, with a corresponding intensification of the rain band.

Dr Cai and colleagues turned to the extensive archives of general circulation models submitted for the fourth and fifth IPCC Assessments and found that increases in greenhouse gases are projected to enhance equatorial Pacific warming. In turn, and in spite of disagreement about the future of El Niño events, this warming leads to the increased frequency of extreme excursions of the rain band.

During moderate El Niño events with warming in the equatorial eastern Pacific, the rain band moves north-eastward by 300 kilometres. Countries located within the band’s normal position such as Vanuatu, Samoa, and the southern Cook Islands experience forest fires and droughts as well as increased frequency of tropical cyclones, whereas countries to which the rain band moves experience extreme floods.

“During extreme El Niño events, such as 1982/83 and 1997/98, the band moved northward by up to 1000 kilometres. The shift brings more severe extremes, including cyclones to regions such as French Polynesia that are not accustomed to such events,” said Dr Cai, a scientist at the Wealth from Oceans Flagship.

“Understanding changes in the frequency of these events as the climate changes proceed is therefore of broad scientific and socio-economic interest.”

A central issue for community adaptation in Australia and across the Pacific is understanding how the warming atmosphere and oceans will influence the intensity and frequency of extreme events. The impact associated with the observed extreme excursions includes massive droughts, severe food shortage, and coral reef mortality through thermally-induced coral bleaching across the South Pacific.

The paper, “More extreme swings of the South Pacific Convergence Zone due to greenhouse warming,” was co-authored by Australian scientists Dr Simon Borlace, Mr Tim Cowan from CSIRO and Drs Scott Power and Jo Brown, two Bureau of Meteorology scientists at the Centre for Australian Weather and Climate Research, who were joined by French, US, UK, and Cook Island scientists.

The research effort from Australian scientists was supported by the Australian Climate Change Science Program, the CSIRO Office of Chief Executive Science Leader program, and the Pacific-Australia Climate Change Science and Adaptation Planning Program.

Democracy Works for Endangered Species Act, Study Finds; Citizen Involvement Key in Protecting and Saving Threatened Species (Science Daily)

ScienceDaily (Aug. 16, 2012) — When it comes to protecting endangered species, the power of the people is key, an analysis of listings under the U.S. Endangered Species Act finds.

Desert Tortoise, Gopherus agassizii, in the Mojave Desert of Utah. The FWS turned down a petition to list the Mojave Desert population of the Desert Tortoise, but that decision was reversed. The Desert Tortoise is now in the ESA’s highest threat category, and populations of the entire species are thought to have declined by more than 90 percent during the past 20 years. (Credit: © mattjeppson / Fotolia)

The journal Science is publishing the analysis comparing listings of “endangered” and “threatened” species initiated by the U.S. Fish and Wildlife Service, the agency that administers the Endangered Species Act, to those initiated by citizen petition.

“We found that citizens, on average, do a better job of picking species that are threatened than does the Fish and Wildlife Service. That’s a really interesting and surprising finding,” says co-author Berry Brosi, a biologist and professor of environmental studies at Emory University.

Brosi conducted the analysis with Eric Biber, a University of California, Berkeley School of Law professor who specializes in environmental law.

Controversy has surrounded the Endangered Species Act (ESA) since it became law nearly 40 years ago. A particular flashpoint is the provision that allows citizens to petition the Fish and Wildlife Service (FWS) to list any unprotected species, and use litigation to challenge any FWS listing decision. Critics of this provision say the FWS wastes time and resources processing the stream of citizen requests. Another argument is that many citizen-initiated listings are driven less by concern for a species than by political motives, such as blocking a development project.

The study authors counter that their findings bolster the need to keep the public highly involved.

“There are some 100,000 species of plants and animals in North America, and asking one federal agency to stay on top of that is tough,” Biber says. “If there were restrictions on the number of citizen-initiated petitions being reviewed, the government would lose a whole universe of people providing high-quality information about species at risk, and it is likely that many species would be left unprotected.”

The researchers built a database of the 913 domestic and freshwater species listed as “threatened” or “endangered” under the ESA from 1986 on. They examined whether citizens or the FWS initiated the petition, whether it was litigated, and whether it conflicted with an economic development project. They also looked at the level of biological threat to each of the species, using FWS threat scores in reports the agency regularly makes to Congress.

The results showed that listings resulting from citizen-initiated petitions are more likely to pose conflicts with development, but those species are also significantly more threatened, on average, than the species in FWS-initiated petitions.
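The comparison at the heart of the study (grouping listed species by who initiated the petition, then contrasting average threat levels) amounts to a simple aggregation. The records and the scoring convention below are made up for illustration; in this sketch, a higher score means a more threatened species:

```python
from statistics import mean

# Hypothetical records, not the real FWS database.
species = [
    {"name": "species_a", "initiator": "citizen", "threat": 9},
    {"name": "species_b", "initiator": "citizen", "threat": 7},
    {"name": "species_c", "initiator": "FWS", "threat": 5},
    {"name": "species_d", "initiator": "FWS", "threat": 6},
    {"name": "species_e", "initiator": "citizen", "threat": 8},
]

def mean_threat(records, initiator):
    # Average threat score within one initiator group.
    return mean(r["threat"] for r in records if r["initiator"] == initiator)

citizen_mean = mean_threat(species, "citizen")
fws_mean = mean_threat(species, "FWS")
```

In the study’s actual data the citizen-initiated group came out significantly more threatened on average, as the paragraph above reports.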

“The overriding message is that citizen involvement really does work in combination with the oversight of the FWS,” Brosi says. “It’s a two-step system of checks and balances that is important to maintain.”

The public brings diffuse and specialized expertise to the table, from devoted nature enthusiasts to scientists who have spent their whole careers studying one particular animal, insect or plant. Public involvement can also help counter the political pressure inherent in large development projects. The FWS, however, is unlikely to approve the listing of a species that is not truly threatened or endangered, so some petitions are filtered out.

“You could compare it to the trend of crowdsourcing that the Internet has spawned,” Brosi says. “It’s sort of like crowdsourcing what species need to be protected.”

Many people associate the success of the ESA with iconic species like the bald eagle and the whooping crane.

“To me,” Brosi says, “the greater accomplishment of the act is its protection of organisms that don’t get the same amount of attention as a beautiful bird or mammal.”

For example, the FWS turned down a petition to list the Mojave Desert population of the Desert Tortoise, Gopherus agassizii, but that decision was reversed. The Desert Tortoise is now in the ESA’s highest threat category, and populations of the entire species are thought to have declined by more than 90 percent during the past 20 years.

“One of the biggest threats it faces is urban and suburban expansion, which could have made it politically challenging for the FWS,” Brosi notes. “And yet, the Desert Tortoise is a keystone species that helps support dozens of other species by creating habitats in its burrows and dispersing seeds.”

Organisms Cope With Environmental Uncertainty by Guessing the Future (Science Daily)

ScienceDaily (Aug. 16, 2012) — In uncertain environments, organisms not only react to signals, but also use molecular processes to make guesses about the future, according to a study by Markus Arnoldini et al. from ETH Zurich and Eawag, the Swiss Federal Institute of Aquatic Science and Technology. The authors report in PLoS Computational Biology that if environmental signals are unreliable, organisms are expected to evolve the ability to take random decisions about adapting to cope with adverse situations.

Most organisms live in ever-changing environments, and are at times exposed to adverse conditions that are not preceded by any signal. Examples of such conditions include exposure to chemicals or UV light, sudden weather changes or infections by pathogens. Organisms can adapt to withstand the harmful effects of these stresses. Previous experimental work with microorganisms has reported variability in stress responses between genetically identical individuals. The results of the present study suggest that this variation emerges because individual organisms take random decisions, and such variation is beneficial because it helps organisms to reduce the metabolic costs of protection without compromising the overall benefits.
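The logic of such random decisions can be illustrated with a standard bet-hedging calculation (a generic textbook-style model with invented parameters, not the model from the paper). When stress strikes without warning, a clonal population in which each cell protects itself with some intermediate probability can out-grow, in the long run, populations that always or never protect, because long-run growth is governed by the geometric mean of growth across good and bad generations:

```python
import math

q = 0.3             # probability of a stress event per generation
W_PROTECTED = 0.5   # growth factor of protected cells (costly response)
W_EXPOSED_OK = 1.0  # unprotected cells in a benign environment
W_EXPOSED_BAD = 0.1 # unprotected cells in a stressful environment

def long_run_growth(p):
    # A fraction p of genetically identical cells mounts the costly
    # response each generation. The long-run (log) growth rate is the
    # environment-weighted average of log per-generation growth: the
    # standard measure of fitness under fluctuating selection.
    stress = p * W_PROTECTED + (1 - p) * W_EXPOSED_BAD
    benign = p * W_PROTECTED + (1 - p) * W_EXPOSED_OK
    return q * math.log(stress) + (1 - q) * math.log(benign)

always = long_run_growth(1.0)    # every cell always protects
never = long_run_growth(0.0)     # no cell ever protects
hedged = long_run_growth(0.425)  # random per-cell decisions
```

With these (invented) parameters the hedged strategy’s long-run growth rate exceeds both pure strategies, mirroring the paper’s point that randomized responses cut the metabolic cost of protection without forfeiting its benefit.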

The theoretical results of this study can help to understand why genetically identical organisms often express different traits, an observation that is not explained by the conventional notion of nature and nurture. Future experiments will reveal whether the predictions made by the mathematical model are met in natural systems.

Nelson Rodrigues and the “Sobrenatural de Almeida” (Portal Entretextos)

11.07.2012

Miguel Carqueija

A master of the “mainstream” who also ventured into the territory of the fantastic.

Nelson Rodrigues, whose centenary is being celebrated in 2012, was not only a playwright and short-story writer; he also wrote sports columns. For a long time he kept a column in the Rio newspaper “O Globo” (in those days the daily, now in decline, still had good columnists) whose name kept changing, though its main title was “À sombra das chuteiras imortais” (other titles used were “A batalha” and “Os bandeirinhas também são anjos”).
Nelson had a sui generis style, easily recognizable even when he did not sign his pieces. A die-hard Fluminense fan, he was shamelessly partial in his columns. And I, who rooted for Fluminense, read them with pleasure.
An interesting detail is that Nelson, with utter nerve, liked to “prophesy” Fluminense’s victory in the Rio state championship of the day. Football back then was very regional. And, of course, the prophecy came true whenever the club won the championship.
One year, during what seemed to be a run of bad luck, Nelson wrote that the supernatural was persecuting Fluminense. Days later the columnist published a “letter” he had supposedly received, which read more or less like this: “On such-and-such a day you said that the supernatural is persecuting Fluminense. Well, I am the Supernatural, and I assure you that this is not true, etc.” The “character” closed the missive by guaranteeing that the tricolor would win its next match, signing it: “Sobrenatural de Almeida.”
Sunday came and Fluminense lost. Outraged, Nelson accused Sobrenatural de Almeida of having lied shamelessly. So began the war of the Fluminense fans, led by Nelson Rodrigues, against the sinister Sobrenatural de Almeida.
It may seem strange nowadays, to anyone who never knew the charisma of the columnist and playwright, who died in 1980, but the fact is that for a time Sobrenatural de Almeida was a genuine craze in the city. Sports reporters spoke of him. One match was accompanied by a strong gale, which even deflected a ball headed for the goal. “It’s Sobrenatural de Almeida!” shouted the radio announcer.
A new match came and Fluminense won. Nelson celebrated the victory over the enemy, who had supposedly slunk off from the Maracanã in melancholy. Later, however, for reasons that now escape me, the championship was suspended for a while. Nelson Rodrigues then “received” a phone call from Sobrenatural de Almeida, claiming responsibility for the interruption of the championship.
Over time the columnist supplied more details about the mysterious figure, who appeared in caricatures dressed in black, as “frightening” as Zé do Caixão. According to Nelson, the Supernatural had had his days of glory, but now, poor fellow, he lived in Irajá and rode the trains of the Central line. That is why he would even arrive late at the Maracanã, only then beginning to interfere.
This Sobrenatural de Almeida fever lasted weeks, months, but it eventually wore thin and Nelson finally stopped mentioning him. Still, in a way, it was the journalist’s contribution to our fantastic literature.

Calgary hail storm: Cloud seeding credited for sparing city from worse disaster (The Calgary Herald)

‘The storm was a monster,’ says weather modification company

BY THANDI FLETCHER, CALGARY HERALD AUGUST 14, 2012

Paul Newell captured dramatic images in the Bearspaw area of northwest Calgary just before the start of the hailstorm on Sunday, Aug. 12, 2012. (Photograph by: Paul Newell, reader photo)

A ferocious storm that hammered parts of Calgary with hail stones larger than golf balls late Sunday, causing millions of dollars worth of damage, could have been much worse if cloud-seeding planes hadn’t attempted to calm it down.

“The storm was a monster,” said Terry Krauss, project director of the Alberta Severe Weather Management Society, which contracts American-based company Weather Modification Inc. to seed severe weather clouds in Alberta’s skies. The society is funded by a group of insurance companies with a goal of reducing hail damage claims.

Before the storm hit, Krauss said, the company sent all four of its cloud-seeding aircraft into the thick and swirling black clouds. The planes flew for more than 12 hours, until midnight, seeding the tops and bases of the clouds with silver iodide, a chemical agent that helps limit the size of hail stones.

But despite the heavy seeding, golf-ball-sized hail stones pelted parts of Calgary late Sunday night, causing widespread damage to cars and homes.

“This one was a beast. It took everything we threw at it and still was able to wreak some havoc,” said Krauss. “I believe if we hadn’t seeded, it would have even been worse.”

Northeast Calgary, where the hail was between five and six centimetres, was hit hardest by the storm, said Environment Canada meteorologist John Paul Craig. Other parts of the city saw toonie-sized hail from a second storm system, he said.

Craig said Sunday’s storm was worse than Calgary’s last major hailstorm, which saw four-centimetre hail stones, in July 2010.

“These hail stones were just a little bit bigger,” he said.

At Royal Oak Audi in the city’s northwest, broken glass from smashed windows littered the lot Monday morning. Of the 85 new and used cars on the lot, general manager Murray Dorren said not a single car was spared from the storm.

“It’s devastating — that’s probably the best word I can come up with,” he said. “It’s unbelievable that Mother Nature can do this much damage in a very short time. I think it probably took a matter of 10 minutes and there’s millions of dollars worth of damage.”

Dorren estimated the damage at about $2 million. Across the lot, the dinged-up vehicles looked like dimpled golf balls from the repetitive pounding of the sizable stones. Some windows and sunroofs were shattered, while others were pierced by the heavy hail.

“They look like bullet holes right through the windscreen,” salesman Nick Berkland said of the damage.

Insurance companies and brokers were inundated with calls all day as customers tried to file claims on their wrecked cars and homes.

Ron Biggs, claims director for Intact Insurance, said it’s too early to tell how many claims the hail event will spawn, although he said they received about two to three times their normal call volume on Monday.

Biggs said the level of damage so far appears to be similar to the July 2010 hailstorm, when Intact received about 12,000 hail damage claims.

Chief operating officer Bruce Rabik of Rogers Insurance, which insures several car dealerships in Calgary, said the damage is extensive.

“It’s certainly a bad one,” he said. “We’ve had one dealership, which they estimate 600 damaged cars. A couple other dealerships with 200 damaged cars each.”

Rabik said claims adjusters are overwhelmed with the volume of claims. He urged customers to be patient as it may take a day or two as insurance workers make their way to each home.

Shredded leaves, twigs and broken branches blanketed pathways along the Bow and Elbow rivers as city crews worked to clear them, said Calgary parks pathway lead Duane Sutherland.

“This was the worst that I’ve seen,” said Sutherland.

Once daylight broke Monday, Royal Oak resident Satya Mudlair inspected the exterior of his home, which was riddled with damage. “Lots of holes in the siding, window damage to the two bedroom windows, and the roof a little bit,” he said.

The apple tree in his backyard has also lost about half its apples, he said. Fortunately, his car was parked inside the garage and was spared any dents.

Mudlair said his insurance company told him it would take two or three weeks before the damage would be repaired. “There’s a big pile of names ahead of me,” he said.

Mudlair’s wife, Nirmalla, had just fallen asleep when she was awoken by the sound of hail stones hitting the roof.

“It was very bad. It was like, thump, thump,” she said, describing the pelting sound. “We got scared and I kept running from room to room.”

Cloud-seeding expert Krauss said Calgary has experienced more severe weather than usual this year, although Sunday’s storm was by far the worst.

“It has been a very stormy year,” he said.

© The Calgary Herald

In the Name of the Future, Rio Is Destroying Its Past (N.Y.Times)

OP-ED CONTRIBUTORS

By THERESA WILLIAMSON and MAURÍCIO HORA

Published: August 12, 2012

THE London Olympics concluded Sunday, but the battle over the next games has just begun in Rio, where protests against illegal evictions of some of the city’s poorest residents are spreading. Indeed, the Rio Olympics are poised to increase inequality in a city already famous for it.

Last month, Unesco awarded World Heritage Site status to a substantial portion of the city, an area that includes some of its hillside favelas, where more than 1.4 million of the city’s 6 million residents live. No favela can claim greater historical importance than Rio’s first — Morro da Providência — yet Olympic construction projects are threatening its future.

Providência was formed in 1897 when veterans of the bloody Canudos war in Brazil’s northeast were promised land in Rio de Janeiro, which was then the federal capital. Upon arriving, they found no such land available. After squatting in front of the Ministry of War, the soldiers were moved to a nearby hill belonging to a colonel, though they were given no title to the land. Originally named “Morro da Favela” after the spiny favela plant typical of the Canudos hills where soldiers had spent many nights, Providência grew during the early 20th century as freed slaves joined the soldiers. New European migrants came as well, as it was the only affordable way to live near work in the city’s center and port.

Overlooking the site where hundreds of thousands of African slaves first entered Brazil, Providência is part of one of the most important cultural sites in Afro-Brazilian history, where the first commercial sambas were composed, traditions like capoeira and candomblé flourished and Rio’s Quilombo Pedra do Sal was founded. Today 60 percent of its residents are Afro-Brazilian.

Over a century after its creation, Providência still bears the cultural and physical imprint of its initial residents. But now it is threatened with destruction in the name of Olympic improvements: almost a third of the community is to be razed, a move that will inevitably destabilize what’s left of it.

By mid-2013 Providência will have received 131 million reais ($65 million) in investments under a private-sector-led plan to redevelop Rio’s port area, including a cable car, funicular tram and wider roads. Previous municipal interventions to upgrade the community recognized its historical importance, but today’s projects have no such intent.

Although the city claims that investments will benefit residents, 30 percent of the community’s population has already been marked for removal and the only “public meetings” held were to warn residents of their fate. Homes are spray-painted during the day with the initials for the municipal housing secretary and an identifying number. Residents return from work to learn that their homes will be demolished, with no warning of what’s to come, or when.

A quick walk through the community reveals the appalling state of uncertainty residents are living in: at the very top of the hill, some 70 percent of homes are marked for eviction — an area supposedly set to benefit from the transportation investments being made. But the luxury cable car will transport 1,000 to 3,000 people per hour during the Olympics. It’s not residents who will benefit, but investors.

Residents of Providência are fearful. Only 36 percent of them hold documentation of their land rights, compared with 70 percent to 95 percent in other favelas. More than in other poor neighborhoods, residents are particularly unaware of their rights and terrified of losing their homes. Combine this with the city’s “divide and conquer” approach — in which residents are confronted individually to sign up for relocation, and no communitywide negotiations are permitted — and resistance is effectively squelched.

Pressure from human rights groups and the international news media has helped. But brutal evictions continue, as do new, subtler forms of removal. As part of the city’s port revitalization plan, authorities declared the “relocations” to be in the interest of residents because they live in “risky areas” where landslides might occur and because “de-densification” is required to improve quality of life.

But there is little evidence of landslide risk or dangerous overcrowding; 98 percent of Providência’s homes are made of sturdy brick and concrete and 90 percent have more than three rooms. Moreover, an important report by local engineers showed that the risk factors announced by the city were inadequately studied and inaccurate.

If Rio succeeds in disfiguring and dismantling its most historic favela, the path will be open to further destruction throughout the city’s hundreds of others. The economic, social and psychological impacts of evictions are dire: families moved into isolated units where they lose access to the enormous economic and social benefits of community cooperation, proximity to work and existing social networks — not to mention generations’ worth of investments made in their homes.

Rio is becoming a playground for the rich, and inequality breeds instability. It would be much more cost-effective to invest in urban improvements that communities help shape through a participatory democratic process. This would ultimately strengthen Rio’s economy and improve its infrastructure while also reducing inequality and empowering the city’s still marginalized Afro-Brazilian population.

Theresa Williamson, the publisher of RioOnWatch.org, founded Catalytic Communities, an advocacy group for favelas. Maurício Hora, a photographer, runs the Favelarte program in the Providência favela.

*   *   *

APRIL 2, 2012

Are the Olympics More Trouble Than They’re Worth?


Winning a bid to host the Olympics is just the beginning. As London prepares for the 2012 Games this summer, residents have plenty of doubts: Will it be too expensive? Will it disrupt life too much? In the end, will they be better off because of the Games, or just saddled with public debt and a velodrome no one knows what to do with?

What about Rio de Janeiro: Will it come out ahead, after having hosted the Pan American Games in 2007, the World Cup in 2014 and the Olympics in 2016?


DEBATERS

Neil Jameson

The Games Help Londoners

NEIL JAMESON, LEAD ORGANIZER, LONDON CITIZENS

This is the world’s first “Living Wage Olympics,” and East London residents will reap the rewards.

Julian Cheyne

The Games Hurt Londoners

JULIAN CHEYNE, EVICTED RESIDENT, EAST LONDON

The Olympics are an expensive distraction that sets dangerous precedents, coddling the elite and trampling the poor.

Theresa Williamson

A Missed Opportunity in Rio

THERESA WILLIAMSON, FOUNDER, CATALYTIC COMMUNITIES

In preparing for the World Cup and the Olympics, Rio could make long-term investments and integrate the favelas. Instead it is aggravating its problems.

Bruno Reis

Brazil Can Come Out Ahead

BRUNO REIS, RISK ANALYST IN BRAZIL

These Games represent a golden opportunity, but will Rio de Janeiro repeat the success of Barcelona or the failure of Athens?

Andrew Zimbalist

Venues as an Asset or an Albatross

ANDREW ZIMBALIST, ECONOMIST, SMITH COLLEGE

Olympics planning takes place in a frenzied atmosphere — not optimal conditions for contemplating the future shape of an urban landscape.

Mitchell L. Moss

New York Is Lucky Not to Have the Games

MITCHELL L. MOSS, NEW YORK UNIVERSITY

London will be a morass this summer. Meanwhile, there has never been a better time to visit New York City.

Occupy, Anthropology, and the 2011 Global Uprisings (Cultural Anthropology)

Submitted by Cultural Anthropology on Fri, 2012-07-27 10:36

Introduction: Occupy, Anthropology, and the 2011 Global Uprisings

Guest Edited by Jeffrey S. Juris (Northeastern University) and Maple Razsa (Colby College)

Occupy Wall Street burst spectacularly onto the scene last fall with the take-over of New York City’s Zuccotti Park on September 17, 2011, followed by the rapid spread of occupations to cities throughout the US and the world. The movement combined mass occupations of urban public spaces with horizontal forms of organization and large-scale, directly democratic assemblies. Making effective use of the viral flows of images and information generated by the intersections of social and mass media, the occupations mobilized tens of thousands around the globe, including many new activists who had never taken part in a mass movement before, and inspired many more beyond the physical encampments themselves. Before the wave of violent police evictions in November and December of 2011 drove activists into submerged forms of organizing through the winter, the Occupy movements had already captured the public imagination. Bequeathing to us potent new memes such as the 1% (those at the top of the wealth and income scale) and the 99% (the rest of us), Occupy provided a framework for talking about issues that have been long obscured in public life such as class and socio-economic inequality and helped to shift the dominant political-economic discourse from an obsession with budget deficits and austerity to a countervailing concern for jobs, equality, and economic fairness.

In other words, prior to Occupy, much of the populist anger stemming from the 2008 financial crisis in North America and Europe had been effectively channeled by the Right into both an attack on marginalized groups—e.g. immigrants, people of color, Gays and Lesbians—and a particularly pernicious version of the already familiar critique of unbridled spending. This was especially so in the US where the Tea Party tapped into the widespread public ire over the Wall Street bailouts to bolster a far-reaching attack on “big government” through a radical program of fiscal austerity. Of course, the debt problem was a consequence rather than a cause of the crisis, the result of deregulation, predatory lending, and the spread of highly complex financial instruments facilitated by the neoliberal agenda of the very people who were now seeking to impose budgetary discipline (see Financial Crisis Hot Spot).

However, the contributions of Occupy are not exclusively, or even primarily, to be assessed in terms of their intervention in public discourse. The Occupy movements are also a response to a fundamental crisis of representative politics embodied in an embrace of more radical, directly democratic practices and forms. In their commitment to direct democracy and action the politics put into practice in the various encampments are also innovative prefigurative attempts to model alternative forms of political organization, decision making, and sociability. This turn is crucial: while neoliberalism has been endlessly critiqued it seems to live on as the only policy response—in the form of austerity—to the crisis neoliberalism itself has produced. The need for ethnographic accounts of this prefigurative politics, and its attendant challenges and contradictions, is especially urgent given that Occupy has refused official representatives and because occupiers have extended democracy beyond formal institutions into new spheres of life through a range of practices, including the collective seizure of public space, the people’s mic, horizontal organization, hand signals, and general assemblies.

It is also important to remember that Occupy was a relative latecomer—if a symbolically important one—to the social unrest the global crisis and policies of austerity have provoked. Cracks in the veneer of conformity emerged during the 2008 rebellion in Greece, where students, union members, and other social actors, galvanized by the murder of a fifteen-year-old student, took to the streets to challenge the worsening economic conditions (see Greece Hot Spot). Students were also among the first wave of resistance elsewhere, with protests against budget cuts and increased fees in California, Croatia, the UK, and Chile. In the US, signs of wider social discontent finally surfaced during the Wisconsin uprising in February 2011, which included the occupation of the Wisconsin State House in opposition to Governor Scott Walker’s attack on collective bargaining for public sector unions under the guise of budgetary discipline (cf. Collins 2012). As in Wisconsin, the widespread circulation of images from the Arab Spring continued to spark the intense feelings of solidarity, political possibility, and agency that ultimately led to the occupation of Wall Street. From the pro-democracy marches in Tunisia in response to the self-immolation of Mohammed Bouazizi to the mass occupations of Cairo’s Tahrir Square in opposition to the Egyptian dictator Hosni Mubarak, the Middle East uprisings imbued protesters with the sense that dramatic political transformation was possible, even as subsequent events have indicated that actual political outcomes are always ambivalent and uncertain (see Arab Spring Hot Spot).

Inspired by the uprisings in Tunisia and Egypt and responding to the working- and middle-class casualties of Spain’s and Europe’s debt crisis, hundreds of thousands of protesters took to the streets of Madrid on May 15, 2011, and occupied the Puerta del Sol square, sparking a wave of similar mobilizations and encampments around Spain that would become known as 15M, or the movement of the Indignados. Indeed, the combination of mass public occupations with large-scale participatory assemblies provided a template that would be enacted in Zuccotti Park, in part via the influence of Spanish activists residing in New York. That summer a similar movement of Israeli youths sprang up in Tel Aviv, using tent cities and popular assemblies to shine a light on the rising cost of housing and other living expenses.

Finally, in response to an August 2011 call by the Canadian magazine Adbusters to occupy Wall Street in the spirit of these 2011 global uprisings, activists occupied Zuccotti Park after being rebuffed by the police in an attempt to take Wall Street itself. The occupation initially garnered little media attention until its second week, when images of police repression started going viral, leading to a surge in public sympathy and support, and ever-growing numbers streaming to the encampments themselves each time another protester was maced or a group of seemingly innocent protesters was rounded up, beaten, and/or arrested. Occupations quickly spread around the US and other parts of the world, generating, for a moment, a proliferating series of encampments physically rooted in local territories, yet linked up with other occupations through interpersonal and online trans-local networks. Following the evictions in the US last fall, local assemblies and working groups have continued to meet—hosting discussions, planning actions and campaigns, producing media, and building and modifying organizational forms—even as the Occupy movements prepared for their public reemergence in the spring through mobilizations such as the May Day protests and mass direct actions against NATO in Chicago and the European Central Bank in Frankfurt.

Additionally, each of these uprisings has diffused through the widespread use of social media, reflecting the mutually constitutive nature of embodied and online protest. The use of social media, in particular, has allowed the Occupy movements, as in other recent mobilizations, to penetrate deeply into the social fabric and mobilize many newcomers who have never been active before in social movements. At the same time, these emerging “logics of aggregation” within the Occupy movements have resulted in a more individualized mode of participation and a form of movement that is more singularizing (e.g. the way the 99% frame can obscure internal differences) and more dependent on the long-term occupation of public space than other recent movements (Juris 2012). A particular set of tensions and strategic dilemmas has thus plagued the Occupy movements, including a divide between newer and more seasoned activists, the difficulty of recognizing and negotiating internal differences, a lack of common political and organizational principles beyond the General Assembly model, and the difficulty of transitioning to new tactics, strategies, visions, and structures in a post-eviction era. In short, activists are now faced with fundamental questions about how to build a movement capable of actually transforming the deep inequalities they have attempted to address.

In assembling this Hot Spot on Occupy we have invited contributions from anthropologists, ethnographers, and activists writing on the above themes: the mass occupation of public spaces, directly democratic practices and forms, the use of social media, the emotions and emerging subjectivities of protest, as well as the underlying political critiques and contradictions that have arisen in the movement. Similarly, in light of the global history we outline above, the range of other social movement responses to the current global economic crisis, as well as the ongoing links between struggles in the US, Europe, Latin America, and North Africa, we have been careful to include contributors conducting research beyond the US in countries such as Greece, Slovenia, Spain, Israel, Argentina, Egypt, and Canada. In so doing, we insist that Occupy must be understood in a global rather than a populist US-centric framework.

Our collaboration on this Hot Spot—which emerged from conversations around our articles on Occupy in the May 2012 edition of American Ethnologist (Juris 2012; Razsa and Kurnik 2012)—also reflects our scholarly and political commitments, as well as those of our contributors. First, it was our priority to invite scholars and activists who are directly involved with these movements rather than adding to the abundant armchair punditry on Occupy. These contributions also reflect recent trends in anthropology with respect to the growing practice of activist research, militant ethnography, public anthropology, and other forms of politically committed ethnographic research, which are taking increasingly institutionalized forms with Cultural Anthropology “Hot Spots” like this one, “Public Anthropology Reviews” in American Anthropologist, recent interventions in American Ethnologist on Egypt, Wisconsin, and Occupy, as well as Current Anthropology “Current Applications.”

In addition to providing an ethnographically and analytically informed view of and from various occupations and kindred mobilizations, this Hot Spot thus provides another example of how anthropologists are making themselves politically relevant and are engaging issues of broad public concern. Given these shifts, together with the progressive inclinations of many anthropologists and the ubiquity and inherent interest of Occupy, it should come as no surprise that so many anthropologists and ethnographers from related fields, including those within and outside the academy, have played key roles in the Occupy movements and their precursors in countries such as Greece and Spain. Indeed, in their post Carles Feixa and his collaborators refer to anthropologists as the “organic intellectuals” of the 15M movement. As many of the contributions to this Hot Spot attest, a similar case might be made for the role of activist anthropologists within Occupy more generally.

As the contributions below make clear, our emphasis on participatory and politically committed research does not imply a romanticization of resistance or a refusal to confront the contradictions, limits, and exclusions of social movements, especially along axes of class, race, gender, sexuality, and citizenship. Given the disproportionate, though by no means exclusively, White, middle-class participation in the US Occupy movements, such critical perspectives are essential. Each of the following entries thus combines thick ethnographic description on the part of anthropologists, ethnographers, and activists who have been directly involved in the Occupy movements or other instances of mobilization during the 2011 global uprisings—either through engagement with one or more encampments and/or the themes addressed by Occupy—with critical analysis of one or more of the issues outlined above.

NOTES

[1] Occupy has thus addressed many of the same themes and drawn on many of the organizational practices associated with the global justice movements of a previous era, even as it has resonated more strongly with domestic national contexts of the Global north.

[2] The people’s mic is a form of voice amplification whereby everyone in listening distance repeats a speaker’s words so that others situated further away can also hear (See Garces, this Hot Spot).

[3] For example, in the U.S. local encampments created “Inter-Occupy” groups to maintain ties with other occupations, while Twitter feeds, listservs, websites, and other digital tools were used to communicate and coordinate more broadly. See our digital resources page for additional links.

REFERENCES

Collins, Jane. 2012. “Theorizing Wisconsin’s 2011 Protests: Community-Based Unionism Confronts Accumulation by Dispossession.” American Ethnologist 39 (1):6–20.

Juris, Jeffrey. 2012. “Reflections on #Occupy Everywhere: Social Media, Public Space, and Emerging Logics of Aggregation.” American Ethnologist 39 (2):259–279.

Razsa, Maple and Andrej Kurnik. 2012. “The Occupy Movement in Žižek’s Hometown: Direct Democracy and a Politics of Becoming.” American Ethnologist 39 (2):238–258.

***ESSAYS***

Prefigurative Politics

Marianne Maeckelbergh, Horizontal Decision-Making across Time and Place

Chris Garces, People’s Mic and ‘Leaderful’ Charisma

Philip Cartelli, Trying to Occupy Harvard

Public Space

Zoltán Glück, Between Wall Street and Zuccotti: Occupy and the Scale of Politics

Carles Feixa, et al., The #spanishrevolution and Beyond

Dimitris Dalakoglou,  The Movement and the “Movement” of Syntagma Square

Experience and Subjectivity

Jeffrey S. Juris, The 99% and the Production of Insurgent Subjectivity

Diane Nelson, et al., Her earliest leaf’s a flower…

Maple Razsa, The Subjective Turn: The Radicalization of Personal Experience within Occupy Slovenia

Marina Sitrin, Occupy Trust: The Role of Emotion in the New Movements

Strategy and Tactics

David Graeber, Occupy Wall Street rediscovers the radical imagination

Kate Griffiths-Dingani, May Day, Precarity, Affective Labor, and the General Strike

Angelique Haugerud, Humor and Occupy Wall Street

Karen Ho, Occupy Finance and the Paradox/Possibilities of Productivity

Social Media

Alice Mattoni, Beyond Celebration: Toward a More Nuanced Assessment of Facebook’s Role in Occupy Wall Street

John Postill, Participatory Media Research and Spain’s 15M Movement

Critical Perspectives

Yvonne Yen Liu, Decolonizing the Occupy Movement

Manissa McCleave Maharawal, Fieldnotes on Union Square, Anti-Oppression, and Occupy

Uri Gordon, Israel’s “Tent Protests:” A Domesticated Mobilization

Alex Khasnabish, Occupy Nova Scotia: The Symbolism and Politics of Space

Post Normal Science: Deadlines (Climate Etc.)

Posted on August 3, 2012

by Steven Mosher

Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.

Ravetz describes a post normal situation by the following criteria:

  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required

The difference between Kuhnian normal science, or the behavior of those doing science under normal conditions, and post normal science is best illustrated by example. We can use the recent discovery of the Higgs boson as an example. Facts were uncertain—they always are to a degree; no values were in conflict; the stakes were not high; and immediate action was not required. What we see in that situation is those doing science acting as we expect them to, according to our vague ideal of science. Because facts are uncertain, they listen to various conflicting theories. They try to put those theories to a test. They face a shared uncertainty and in good faith accept the questions and doubts of others interested in the same field. Their participation in politics is limited to asking for money. Because values are not in conflict, no theorist takes the time to investigate his opponent’s views on evolution or smoking or taxation. Because the field of personal values is never in play, personal attacks are minimized. Personal pride may be at stake, but values rarely are. The stakes for humanity in the discovery of the Higgs are low: at least no one argues that our future depends upon the outcome. No scientist straps himself to the collider and demands that it be shut down. And finally, immediate action is not required; under no theory is the settling of the uncertainty so important as to rush the result. In normal science, according to Kuhn, we can view the behavior of those doing science as puzzle solving. The details of a paradigm are filled out slowly and deliberately.

The situation in climate science is close to the polar opposite of this. That is not meant as, and should not be construed as, a criticism of climate science or its claims. The simple point is this: in a PNS situation, the behavior of those doing science changes. To be sure, much of their behavior remains the same. They formulate theories; they collect data; and they test their theories against the data. They don’t stop doing what we notionally describe as science. But, as foreshadowed above in the description of how high energy particle physicists behave, one can see how that behavior changes in a PNS situation. There is uncertainty, but the good faith that exists in normal science, the faith that other people are asking questions because they actually want the answer, is gone. Asking questions, raising doubts, asking to see proof becomes suspect in and of itself. And those doing science are faced with a question that science cannot answer: Does this person really want the answer, or are they a merchant of doubt? Such a question never gets asked in normal science. Normal science doesn’t ask this question because science cannot answer it.

Because values are in conflict, the behavior of those doing science changes. In normal science no one would care if Higgs was a Christian or an atheist. No one would care if he voted liberal or conservative; but because two different value systems are in conflict in climate science, the behavior of those doing science changes. They investigate each other. They question motives. They form tribes. And because the stakes are high, the behavior of those doing science changes as well. They protest; they take money from lobby groups on both sides; and, worst of all, they perform horrendous raps on YouTube. In short, they become human, while those around them canonize them or demonize them and their findings become iconized or branded as hoaxes.

This brings us to the last aspect of a PNS situation: immediate action is required. This is perhaps the most contentious aspect of PNS; in fact, I would argue it is the defining characteristic. In all PNS situations it is almost always the case that one side sees the need for action, given the truth of their theory, while the doubters must of necessity see no need for immediate action. They must see no need for immediate action because their values are at risk and because the stakes are high. Another way to put this is as follows: when you are in a PNS situation, all sides must deny it. Those demanding immediate action deny it by claiming more certainty* than is present; those refusing immediate action do so by increasing demands for certainty. This leads to a centralization and valorization of the topic of uncertainty, and epistemology becomes a topic of discussion for those doing science. That is decidedly not normal science.

The demand for immediate action, however, is broader than simply a demand that society changes. In a PNS situation the behavior of those doing science changes. One of the clearest signs that you are in PNS is the change in behavior around deadlines. Normal science has no deadline. In normal science, the puzzle is solved when it is solved. In normal science there may be a deadline to shut down the collider for maintenance. Nobody rushes the report to keep the collider running longer than it should. And if a good result is found, the schedules can be changed to accommodate the science. Broadly speaking, science drives the schedule; the schedule doesn’t drive the science.

The climategate mails are instructive here. As one reads through the mails, it’s clear that the behavior of those doing science is not what one would call disinterested, patient puzzle solving. Human beings acting in a situation where values are in conflict and stakes are high will engage in behavior that they might not otherwise. Those changes are most evident in situations surrounding deadlines. The point here is not to rehash The Crutape Letters but rather to relook at one incident (there are others, notably around congressional hearings) where deadlines came into play. The deadline in question was the deadline for submitting papers for consideration. As covered in The Crutape Letters and in The Hockey Stick Illusion, the actions taken by those doing science around the “Jesus Paper” are instructive. In fact, were I to rewrite The Crutape Letters, I would do it from the perspective of PNS, focusing on how the behavior of those doing science deviated from the ideals of openness, transparency, and letting truth come in its own good time.

Climategate is about FOIA. There were two critical paths for FOIA: one sought data, the other sought the emails of scientists. Not quite normal. Not normal in that data is usually shared; not normal in that we normally respect the privacy of those doing science. But this is PNS, and all bets are off. Values and practices from other fields, such as business and government, are imported into the culture of science: data hoarding is defended using IP and confidentiality agreements; demanding private mail is defended using values imported from performing business for the public. In short, one sign that a science is post normal is the attempt to import values and procedures from related disciplines. Put another way, PNS poses the question of governance: who runs science, and how should they run it?

The “Jesus paper” in a nutshell can be explained as follows. McIntyre and McKitrick had a paper published in the beginning of 2005. That paper needed to be rebutted in order to make Briffa’s job of writing chapter 6 easier. However, there was a deadline in play. Papers had to be accepted by a date certain. At one point Stephen Schneider suggested the creation of a new category, a novelty — provisionally accepted — so that the “Jesus paper” could make the deadline. McIntyre covers the issue here. One need not re-adjudicate whether or not the IPCC rules were broken. And further, these rules have nothing whatsoever to do with the truth of the claims in that paper. This is not about the truth of the science. What is important is the importation of the concept of a deadline into the search for truth. What is important is that the behavior of those doing science changes. Truth suddenly cares about a date. Immediate action is required. In this case immediate action is taken to see to it that the paper makes it into the chapter. Normal science takes no notice of deadlines. In PNS, deadlines matter.

Last week we saw another example of deadlines and high stakes changing the behavior of those doing science. The backstory here explains. It appears to me that the behavior of those involved changed from what I have known it to be. It changed because they perceived that immediate action was required. A deadline had to be met. Again, as with the Jesus paper, the facts surrounding the release do not go to the truth of the claims. In normal science, a rushed claim might very well get the same treatment as an unrushed claim: it will be evaluated on its merits. In PNS, either the rush to meet an IPCC deadline, as in the case of the Jesus paper, or the rush to be ready for Congress, as in the Watts case, is enough for some to doubt the science. What has been testified to in Congress by Christy, a co-author, may very well be true. But in this high stakes arena, where facts are uncertain and values are in conflict, the behavior of those doing science can and does change. Not all their behavior changes. They still observe and test and report. But the manner in which they do that changes. Results are rushed and data is held in secret. Deadlines change everything. Normal science doesn’t operate this way; if it does, quality can suffer. And yet the demand for more certainty than is needed, the bad faith game of delaying action by asking questions, precludes a naïve return to science without deadlines.

The solution that Ravetz suggests is extended peer review and a recognition of the importance of quality. In truth, the way out of a PNS situation is not that simple. The first step out of a PNS situation is recognizing that one is in the situation to begin with. Today, few people embroiled in this debate would admit that the situation has changed how they would normally behave. An admission that this isn’t working is a cultural crisis for science. No one has the standing to describe how one should conduct science in a PNS situation. No one has the standing to chart the path out of a PNS situation. The best we can do is describe what we see. Today, I observe that deadlines change the behavior of those doing science. We saw that in Climategate; we see it in the events of the past week. That doesn’t entail anything about the truth of science performed under pressure. But it should make us pause and consider whether truth will be found any faster by rushing the results and hiding the data.

*I circulated a copy of this to Michael Tobis to get his reaction. MT took issue with this characterization. MT, I believe, originated the argument that our uncertainty is a reason for action. It is true that while certainty about the science has been the dominant piece of the rhetoric, there has been a second thread that bases action on the uncertainty about sensitivity. I would call this certainty shifting. While the uncertainty about the facts of sensitivity is accepted in this line of argument, the certainty is shifted to certainty about values and certainty about impacts. In short, the argument becomes that while we are uncertain about sensitivity, the certainty we have about large impacts and trans-generational obligations necessitates action.

Scientists struggle with limits – and risks – of advocacy (eenews.net)

Monday, July 9, 2012

Paul Voosen, E&E reporter

Jon Krosnick has seen the frustration etched into the faces of climate scientists.

For 15 years, Krosnick has charted the rising public belief in global warming. Yet even as the field’s implications became clearer, action remained elusive. Science seemed to hit the limits of its influence. It is a result that has prompted some researchers to cross their world’s no man’s land — from advice to activism.

As Krosnick has watched climate scientists call for government action, he began pondering a recent small dip in the public’s belief. And he wondered: Could researchers’ move into the political world be undermining their scientific message?

Stanford’s Jon Krosnick has been studying the public’s belief in climate change for 15 years, but only recently did he decide to probe their reaction to scientists’ advocacy. Photo courtesy of Jon Krosnick.

“What if a message involves two different topics, one trustworthy and one not trustworthy?” said Krosnick, a communication and psychology professor at Stanford University. “Can the general public detect crossing that line?”

His results, not yet published, would seem to say they can.

Using a national survey, Krosnick has found that, among low-income and low-education respondents, climate scientists suffered damage to their trustworthiness and credibility when they veered from describing science into calling viewers to ask the government to halt global warming. And not only did trust in the messenger fall — even the viewers’ belief in the reality of human-caused warming dropped steeply.

It is a warning that, even as the frustration of inaction mounts and the politicization of climate science deepens, researchers must be careful in getting off the political sidelines.

“The advice that comes out of this work is that all of us, when we claim to have expertise and offer opinions on matters [in the world], need to be guarded about how far we’re willing to go,” Krosnick said. Speculation, he added, “could compromise everything.”

Krosnick’s survey is just the latest social science revelation that has reordered how natural scientists understand their role in the world. Many of these lessons have stemmed from the public’s and politicians’ reactions to climate change, which has provided a case study of how science communication works and doesn’t work. Complexity, these researchers have found, does not stop at their discipline’s verge.

For decades, most members of the natural sciences held a simple belief that the public stood lost, holding out empty mental buckets for researchers to fill with knowledge, if they could only get through to them. But, it turns out, not only are those buckets already filled with a mix of ideology and cultural belief, but it is incredibly fraught, and perhaps ineffective, for scientists to suggest where those contents should be tossed.

It’s been a difficult lesson for researchers.

“Many of us have been saddened that the world has done so little about it,” said Richard Somerville, a meteorologist at the Scripps Institution of Oceanography and former author of the United Nations’ authoritative report on climate change.

“A lot of physical climate scientists, myself included, have in the past not been knowledgeable about what the social sciences have been saying,” he added. “People who know a lot about the science of communication … [are] on board now. But we just don’t see that reflected in the policy process.”

While not as outspoken as NASA’s James Hansen, who has taken a high-profile moral stand alongside groups like 350.org and Greenpeace, Somerville has been a leader in bringing scientists together to call for greenhouse gas reductions. He helped organize the 2007 Bali declaration, a pointed letter from more than 200 scientists urging negotiators to limit global CO2 levels well below 450 parts per million.

Such declarations, in the end, have done little, Somerville said.

“If you look at the effect this has had on the policy process, it is very, very small,” he said.

This failed influence has spurred scientists like Somerville to partner closely with social scientists, seeking to understand why their message has failed. It is an effort that received a seal of approval this spring, when the National Academy of Sciences, the nation’s premier research body, hosted a two-day meeting on the science of science communication. Many of those sessions pivoted on public views of climate change.

It’s a discussion that’s been long overdue. When it comes to how the public learns about expert opinions, assumptions mostly rule in the sciences, said Dan Kahan, a professor of law and psychology at Yale Law School.

“Scientists are filled with conjectures that are plausible about how people make sense about information,” Kahan said, “only some fraction of which [are] correct.”

Shifting dynamic

Krosnick’s work began with a simple, hypothetical scene: NASA’s Hansen, whose scientific work on climate change is widely respected, walks into the Oval Office.

As he has since the 1980s, Hansen rattles off the incontrovertible, ever-increasing evidence of human-caused climate change. It’s a stunning litany, authoritative in scope, and one the fictional president — be it a Bush or an Obama — must judge against Hansen’s scientific credentials, backed by publications and institutions of the highest order. If Hansen stops there, one might think, the case is made.

But he doesn’t stop. Hansen continues, arguing, as a citizen, for an immediate carbon tax.

“Whoa, there!” Krosnick’s president might think. “He’s crossed into my domain, and he’s out of touch with how policy works.” And if Hansen is willing to offer opinions where he lacks expertise, the president starts to wonder: “Can I trust any of his work?”

Part of Scripps’ legendary climate team — Charles David Keeling was an early mentor — Richard Somerville helped organize the 2007 Bali declaration by climate scientists, calling for government action on CO2 emissions. Photo by Sylvia Bal Somerville.

Researchers have studied the process of persuasion for 50 years, Krosnick said. Over that time, a few vital truths have emerged, including that trust in a source matters. But looking back over past work, Krosnick found no answer to this question. The treatment was simplistic. Messengers were either trustworthy or not. No one had considered the case of two messages, one trusted and one shaky, from the same person.

The advocacy of climate scientists provided an excellent path into this shifting dynamic.

Krosnick’s team hunted down video of climate scientists first discussing the science of climate change and then, in the same interview, calling for viewers to pressure the government to act on global warming. (Out of fears of bruised feelings, Krosnick won’t disclose the specific scientists cited.) They cut the video in two edits: one showing only the science, and one showing the science and then the call to arms.

Krosnick then showed a nationally representative sample of 793 Americans one of three videos: the science-only cut, the science-plus-politics cut, or a control video about baking meatloaf (the latter being closer to politics than Krosnick might admit). The viewers were then asked a series of questions both about their opinion of the scientist’s credibility and their overall beliefs on global warming.

For a cohort of 548 respondents who either had a household income under $50,000 or no more than a high school diploma, the results were stunning and statistically significant. Across the board, the move into politics undermined the science.

The viewers’ trust in the scientist dropped 16 percentage points, from 48 to 32 percent. Their belief in the scientist’s accuracy fell from 47 to 36 percent. Their overall trust in all scientists went from 60 to 52 percent. Their belief that government should “do a lot” to stop warming fell from 62 to 49 percent. And their belief that humans have caused climate change fell 14 percentage points, from 81 to 67 percent.

Krosnick is quick to note the study’s caveats. First, educated or wealthy viewers had no significant reaction to the political call and seemed able to parse the difference between science and a personal political view. The underlying reasons for the drop are far from clear, as well — it could simply be a function of climate change’s politicization. And far more testing needs to be done to see whether this applies in other contexts.

With further evidence, though, the implications could be widespread, Krosnick said.

“Is it the case that the principle might apply broadly?” he asked. “Absolutely.”

‘Fraught with misadventure’

Krosnick’s study is likely rigorous and useful — he is known for his careful methods — but it still carries with it a simple, possibly misleading frame, several scientists said.

Most of all, it remains hooked to a premise that words float straight from the scientist’s lips to the public’s ears. The idea that people learn from scientists at all or that they are simply misunderstanding scientific conclusions is not how reality works, Yale’s Kahan said.

“The thing that goes into the ear is fraught with misadventure,” he said.

Kahan has been at the forefront of charting how the empty-bucket theory of science communication — called the deficit model — fails. People interpret new information within the context of their own cultural beliefs, peers and politics. They use their reasoning to pick the evidence that supports their views, rather than the other way around. Indeed, recent work by Kahan found that higher-educated respondents were more likely to be polarized than their less-educated peers.

Krosnick’s study will surely spur new investigations, Kahan said, though he resisted definite remarks until he could see the final work. If the study’s conditions aren’t realistic, even a simple model can have “plenty of implications for all kinds of ways of which people become exposed to science,” he said.

The survey sits well with other research in the field and carries an implication about what role scientists should play in scientific debates, added Matthew Nisbet, a communication professor at American University.

“As soon as you start talking about a policy option, you’re presenting information that is potentially threatening to people’s values or identity,” he said. The public, he added, doesn’t “view scientists and scientific information in a vacuum.”

The deficit model has remained an enduring frame for scientists, many of whom are just becoming aware of social science work on the problem. Kahan compares it to the stages of grief. The first stage was that the truth just needs to be broadcast to change minds. The second, and one still influential in the scientific world, is that if the message is just simplified, the right images used, then the deficit will be filled.

“That too, I think, is a stage of misperception about how this works,” Kahan said.

Take the hand-wringing about science education that accompanied a recent poll finding that 46 percent of Americans believed in a creationist origin for humans. It’s a result that speaks to belief, not an understanding of evolution. Many surveyed who believed in evolution would still fail to explain natural selection, mutation or genetic variance, Kahan said, just as people don’t have to understand relativity to use their GPS.

Much of science doesn’t run up against the public’s belief systems and is accepted with little fuss. It’s not as if Louis Pasteur had to sell pasteurization by using slick images of children getting sick; for nearly all of society, it was simply a useful tool. People want to defer to the experts, as long as they don’t have to concede their beliefs on the way.

“People know what’s known without having a comprehension of why that’s the truth,” Kahan said.

There remains a danger in the emerging consensus that all scientific knowledge is filtered by the motivated reasoning of political and cultural ideology, Nisbet added. Not all people can be sorted by two, or even four, variables.

“In the new ideological deficit model, we tend to assume that failures in communication are caused by conservative media and conservative psychology,” he said. “The danger in this model is that we define the public in exclusively binary terms, as liberals versus conservatives, deniers versus believers.”

‘Crossing that line’

So why do climate scientists, more than most fields, cross the line into advocacy?

Most of all, it’s because their scientific work tells them the problem is so pressing and time-dependent, given the centuries-long life span of CO2 emissions, Somerville said.

“You get to the point where the emissions are large enough that you’ve run out of options,” he said. “You can no longer limit [it]. … We may be at that point already.”

There may also be less friction for scientists in suggesting communal solutions to warming because, as Nisbet’s work has found, scientists tend to skew more liberal than the general population, with more than 50 percent of one U.S. science society self-identifying as “liberal.” Given this outlook, they are more likely to accept efforts like cap and trade, a bill that, in implying a “cap” on activity, rubbed conservatives the wrong way.

A prolific law professor and psychologist at Yale, Dan Kahan has been charting how the public comes to, and understands, science. Photo courtesy of Dan Kahan.

“Not a lot of scientists would question if this is an effective policy,” Nisbet said.

It is not that scientists are unaware that they are moving into policy prescription, either. Most would intuitively know the line between their work and its political implications.

“I think many are aware when they’re crossing that line,” said Roger Pielke Jr., an environmental studies professor at the University of Colorado, Boulder, “but they’re not aware of the consequences [of] doing so.”

This willingness to cross into advocacy could also stem from the fact that it is the next logical skirmish. The battle for public opinion on the reality of human-driven climate change is already over, Pielke said, “and it’s been won … by the people calling for action.”

While there are slight fluctuations in public belief, in general a large majority of Americans side with what scientists say about the existence and causes of climate change. It’s not unanimous, he said, but it’s larger than the numbers who supported actions like the Montreal Protocol, the bank bailout or the Iraq War.

What has shifted is its politicization: As more Republicans have begun to disbelieve global warming, Democrats have rallied to reinforce the science. And none of it is about the actual science, of course, a fact Scripps’ Somerville now understands. The debate is a code, expressing fear of the policies that could follow if the science is accepted.

Doubters of warming don’t just hear the science. A policy is attached to it in their minds.

“Here’s a fact,” Pielke said. “And you have to change your entire lifestyle.”

For all the focus on how scientists talk to the public — whether Hansen has helped or hurt his cause — Yale’s Kahan ultimately thinks the discussion will mean very little. Ask most of the public who Hansen is, and they’ll mention something about the Muppets. It can be hard to accept, for scientists and journalists, but their efforts at communication are often of little consequence, he said.

“They’re not the primary source of information,” Kahan said.

‘A credible voice’

Like many of his peers, Somerville has suffered for his acts of advocacy.

“We all get hate email,” he said. “I’ve given congressional testimony and been denounced as an arrogant elitist hiding behind a discredited organization. Every time I’m on national news, I get a spike in ugly email. … I’ve received death threats.”

There are also pressures within the scientific community. As an elder statesman, Somerville does not have to worry about his career. But he tells young scientists to keep their heads down, working on technical papers. There is peer pressure to stay out of politics, a tension felt even by Somerville’s friend, the late Stephen Schneider, also at Stanford, who was long one of the country’s premier speakers on climate science.

He was publicly lauded, but many in the climate science community grumbled, Somerville said, that Schneider should “stop being a motormouth and start publishing technical papers.”

But there is a reason tradition has sustained the distinction between advising policymakers and picking solutions, one Krosnick’s work seems to ratify, said Michael Mann, a climatologist at Pennsylvania State University and a longtime target of climate contrarians.

“It is thoroughly appropriate, as a scientist, to discuss how our scientific understanding informs matters of policy, but … we should stop short of trying to prescribe policy,” Mann said. “This distinction is, in my view, absolutely critical.”

Somerville still supports the right of scientists to speak out as concerned citizens, as he has done, and as his friend, NASA’s Hansen, has done more stridently, protesting projects like the Keystone XL pipeline. As long as great care is taken to separate the facts from the political opinion, scientists should speak their minds.

“I don’t think being a scientist deprives you of the right to have a viewpoint,” he said.

Somerville often returns to a quote from the late Sherwood Rowland, a Nobel laureate from the University of California, Irvine, who discovered the threat chlorofluorocarbons posed to ozone: “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”

Somerville asked Rowland several times whether the same held for global warming.

“Yes, absolutely,” he replied.

It’s an argument that Krosnick has heard from his own friends in climate science. But often this fine distinction gets lost in translation, as advocacy groups present the scientist’s personal message as the message of “science.” It’s tempting to offer advice — Krosnick feels it himself when reporters call — but restraint may need to rule.

“In order to preserve a credible voice in public dialogue,” Krosnick said, “it might be that scientists such as myself need to restrain ourselves as speaking as public citizens.”

Broader efforts of communication, beyond scientists, could still mobilize the public, Nisbet said. Leave aside the third of the population who are in denial or alarmed about climate change, he said, and figure out how to make it relevant to the ambivalent middle.

“We have yet to really do that on climate change,” he said.

Somerville is continuing his efforts to improve communication from scientists. Another Bali declaration is unlikely, though. What he’d really like to do is get trusted messengers from different moral realms beyond science — leaders like the Dalai Lama — to speak repeatedly on climate change.

It’s all Somerville can do. It would be too painful to accept the other option, that climate change is like racism, war or poverty — problems the world has never abolished.

“[It] may well be that it is a problem that is too difficult for humanity to solve,” he said.

Mapping the Future of Climate Change in Africa (Science Daily)

ScienceDaily (Aug. 2, 2012) — Our planet’s changing climate is devastating communities in Africa through droughts, floods and myriad other disasters.

Children in the foothills of the Drakensberg mountains in South Africa who still live in traditional rondavels on family homesteads. (Credit: Todd G. Smith, CCAPS Program)

Using detailed regional climate models and geographic information systems, researchers with the Climate Change and African Political Stability (CCAPS) program developed an online mapping tool that analyzes how climate and other forces interact to threaten the security of African communities.

The program was piloted by the Robert S. Strauss Center for International Security and Law at The University of Texas at Austin in 2009 after receiving a $7.6 million five-year grant from the Minerva Initiative with the Department of Defense, according to Francis J. Gavin, professor of international affairs and director of the Strauss Center.

“The first goal was to look at whether we could more effectively identify what were the causes and locations of vulnerability in Africa, not just climate, but other kinds of vulnerability,” Gavin said.

CCAPS comprises nine research teams focusing on various aspects of climate change, their relationship to different types of conflict, the government structures that exist to mitigate them, and the effectiveness of international aid in intervening. Although most CCAPS researchers are based at The University of Texas at Austin, the Strauss Center also works closely with Trinity College Dublin, the College of William and Mary, and the University of North Texas.

“In the beginning these all began as related, but not intimately connected, topics” Gavin said, “and one of the really impressive things about the project is how all these different streams have come together.”

Africa is particularly vulnerable to the effects of climate change due to its reliance on rain-fed agriculture and the inability of many of its governments to help communities in times of need.

The region is of increasing importance for U.S. national security, according to Gavin, because of the growth of its population, economic strength and resource importance, and also due to concerns about non-state actors, weakening governments and humanitarian disasters.

Although these issues are too complex to yield a direct causal link between climate change and security concerns, he said, understanding the levels of vulnerability that exist is crucial in comprehending the full effect of this changing paradigm.

The vulnerability mapping program within CCAPS is led by Joshua Busby, assistant professor at the Lyndon B. Johnson School of Public Affairs.

To determine the vulnerability of a given location based on changing climate conditions, Busby and his team looked at four different sources: 1) the degree of physical exposure to climate hazards, 2) population size, 3) household or community resilience, and 4) the quality of governance or presence of political violence.

The first source records the different types of climate hazards which could occur in the area, including droughts, floods, wildfires, storms and coastal inundation. However, their presence alone is not enough to qualify a region as vulnerable.

The second source — population size — determines the number of people who will be impacted by these climate hazards. More people create more demand for resources, potentially making the entire population more vulnerable.

The third source looks at how resilient a community is to adverse effects, analyzing the quality of their education and health, as well as whether they have easy access to food, water and health care.

“If exposure is really bad, it may exceed the capacity of local communities to protect themselves,” Busby said, “and then it comes down to whether or not the governments are going to be willing or able to help them.”

The final source accounts for the effectiveness of a given government, the amount of accountability present, how integrated it is with the international community, how politically stable it is, and whether there is any political violence present.

Busby and his team combined the four sources of vulnerability and gave them each equal weight, adding them together to form a composite map. Their scores were then divided into a ranking of five equal parts, or quintiles, going from the 20 percent of regions with the lowest vulnerability to the 20 percent with the highest.
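The equal-weight composite and quintile ranking described above can be sketched in a few lines. This is a minimal illustration, not CCAPS code: the region names, scores, and the assumption that each source is already normalized to 0–1 (with higher meaning more vulnerable) are all hypothetical.

```python
# Hypothetical sketch of a CCAPS-style composite vulnerability index:
# four equally weighted sources summed, then regions ranked into quintiles.
# All names, scores, and the 0-1 normalization are illustrative assumptions.

def composite_score(exposure, population, resilience, governance):
    """Equal-weight sum of four normalized (0-1) vulnerability sources."""
    return exposure + population + resilience + governance  # range 0-4

def quintile_rank(scores):
    """Assign each region a quintile: 1 (lowest 20%) through 5 (highest 20%)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    n = len(scores)
    ranks = [0] * n
    for pos, i in enumerate(order):
        ranks[i] = min(pos * 5 // n + 1, 5)
    return ranks

# Five illustrative regions with made-up source scores.
regions = {
    "A": composite_score(0.9, 0.8, 0.7, 0.9),
    "B": composite_score(0.2, 0.3, 0.1, 0.2),
    "C": composite_score(0.5, 0.5, 0.5, 0.5),
    "D": composite_score(0.7, 0.6, 0.8, 0.4),
    "E": composite_score(0.1, 0.2, 0.3, 0.1),
}
ranks = dict(zip(regions, quintile_rank(list(regions.values()))))
# Region A, vulnerable on every source, lands in the top quintile;
# region E, low on every source, lands in the bottom one.
```

The real tool weights sources equally by choice, not necessity; reweighting is a one-line change, which is one reason the team could revise the model after ground-truthing visits flagged gaps such as chronic water scarcity.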

The researchers gathered information for the tool from a variety of sources, including historic models of physical exposure from the United Nations Environment Programme (UNEP), population estimates from LandScan, as well as household surveys and governance assessments from the World Bank’s World Development and Worldwide Governance Indicators.

This data reflects past and present vulnerability, but to understand which places in Africa would be most vulnerable to future climate change, Busby and his team relied on the regional climate model simulations designed by Edward Vizy and Kerry Cook, both members of the CCAPS team from the Jackson School of Geosciences.

Vizy and Cook ran three 20-year nested simulations of the African continent’s climate at regional scales of 90 and 30 kilometers, using a derivation of the Weather Research and Forecasting Model of the National Center for Atmospheric Research. One was a control simulation representative of the years 1989-2008, and the others represented the climate as it may exist in 2041-2060 and 2081-2100.

“We’re adjusting the control simulation’s CO2 concentration, model boundary conditions, and sea surface temperatures to increased greenhouse gas forcing scenario conditions derived from atmosphere-ocean global climate models. We re-run the simulation to understand how the climate will operate under a different, warmer state at spatial resolutions needed for regional impact analyses,” Vizy said.

Each simulation took two months to complete on the Ranger supercomputer at the Texas Advanced Computing Center (TACC).

“We couldn’t run these simulations without the high-performance computing resources at TACC, it would just take too long. If it takes two months running with 200 processors, I can’t fathom doing it with one processor,” Vizy said.

Researchers input data from these vulnerability maps into an online mapping tool developed by the CCAPS program to integrate its various lines of climate, conflict and aid research. CCAPS’s current mapping tool is based on a prototype developed by the team to assess conflict patterns in Africa with the help of researchers at the TACC/ACES Visualization Laboratory (Vislab), according to Ashley Moran, program manager of CCAPS.

“The mapping tool is a key part of our effort to produce new research that could support policy making and the work of practitioners and governments in Africa,” Moran said. “We want to communicate this research in ways that are of maximum use to policymakers and researchers.”

The initial prototype of the mapping tool used the ArcGIS platform to project data onto maps. Working with its partner Development Gateway, CCAPS expanded the system to incorporate conflict, vulnerability, governance and aid research data.

After completing the first version of their model, Busby and his team carried out the process of ground truthing their maps by visiting local officials and experts in several African countries, such as Kenya and South Africa.

“The experience of talking with local experts was tremendously gratifying,” Busby said. “They gave us confidence that the things we’re doing in a computer lab setting in Austin do pick up on some of the ground-level expert opinions.”

Busby and his team complemented their maps with local perspectives on the kind of impact climate was already having, leading to new insights that could help perfect the model. For example, local experts felt the model did not address areas with chronic water scarcity, an issue the researchers then corrected upon returning home.

According to Busby, the vulnerability maps serve as focal points which can give way to further analysis about the issues they illustrate.

Some of the countries most vulnerable to climate change include Somalia, Sierra Leone, Guinea, Sudan and parts of the Democratic Republic of Congo. Knowing this allows local policymakers to develop security strategies for the future, including early warning systems against floods, investments in drought-resistant agriculture, and alternative livelihoods that might facilitate resource sharing and help prevent future conflicts. The next iteration of the online mapping tool to be released later this year will also incorporate the future projections of climate exposure from the models developed by Vizy and Cook.

The CCAPS team publishes their research in journals like Climate Dynamics and The International Studies Review, carries out regular consultations with the U.S. government and governments in Africa, and participates in conferences sponsored by concerned organizations, such as the United Nations and the United States Africa Command.

“What this project has showed us is that many of the real challenges of the 21st century aren’t always in traditional state-to-state interactions, but are transnational in nature and require new ways of dealing with,” Gavin said.

Teen Survival Expectations Predict Later Risk-Taking Behavior (Science Daily)

ScienceDaily (Aug. 1, 2012) — Some young people’s expectations that they will not live long, healthy lives may actually foreshadow such outcomes.

New research published August 1 in the open access journal PLOS ONE reports that, for American teens, the expectation of death before the age of 35 predicted increased risk behaviors including substance abuse and suicide attempts later in life and a doubling to tripling of mortality rates in young adulthood.

The researchers, led by Quynh Nguyen of Northeastern University in Boston, found that one in seven participants in grades 7 to 12 reported perceiving a 50-50 chance or less of surviving to age 35. Upon follow-up interviews over a decade later, the researchers found that low expectations of longevity at young ages predicted increased suicide attempts and suicidal thoughts as well as heavy drinking, smoking, and use of illicit substances later in life relative to their peers who were almost certain they would live to age 35.

“The association between early survival expectations and detrimental outcomes suggests that monitoring survival expectations may be useful for identifying at-risk youth,” the authors state.

The study compared data collected from 19,000 adolescents in 1994-1995 to follow-up data collected from the same respondents 13-14 years later. The cohort was part of the National Longitudinal Study of Adolescent Health (Add Health), conducted by the Carolina Population Center and funded by the National Institutes of Health and 23 other federal agencies and foundations.

Journal Reference:

Quynh C. Nguyen, Andres Villaveces, Stephen W. Marshall, Jon M. Hussey, Carolyn T. Halpern, Charles Poole. Adolescent Expectations of Early Death Predict Adult Risk Behaviors. PLoS ONE, 2012; 7(8): e41905. DOI: 10.1371/journal.pone.0041905

Brain Imaging Can Predict How Intelligent You Are: ‘Global Brain Connectivity’ Explains 10 Percent of Variance in Individual Intelligence (Science Daily)

ScienceDaily (Aug. 1, 2012) — When it comes to intelligence, what factors distinguish the brains of exceptionally smart humans from those of average humans?

New research suggests as much as 10 percent of individual variances in human intelligence can be predicted based on the strength of neural connections between the lateral prefrontal cortex and other regions of the brain. (Credit: WUSTL Image / Michael Cole)

As science has long suspected, overall brain size matters somewhat, accounting for about 6.7 percent of individual variation in intelligence. More recent research has pinpointed the brain’s lateral prefrontal cortex, a region just behind the temple, as a critical hub for high-level mental processing, with activity levels there predicting another 5 percent of variation in individual intelligence.

Now, new research from Washington University in St. Louis suggests that another 10 percent of individual differences in intelligence can be explained by the strength of neural pathways connecting the left lateral prefrontal cortex to the rest of the brain.
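As a rough aid to interpreting the percentages above: for a single predictor, "variance explained" is the squared correlation (R²), so the implied correlation is its square root. A quick sketch using the article's figures (the dictionary labels are paraphrases, not the study's own variable names):

```python
import math

# "Variance explained" figures for individual intelligence, as
# percentages of total variance (taken from the article).
variance_explained = {
    "overall brain size": 6.7,
    "lateral prefrontal activity": 5.0,
    "global connectivity (left LPFC)": 10.0,
}

# For a single predictor, variance explained equals the squared
# correlation (R^2), so the implied correlation is its square root.
for predictor, pct in variance_explained.items():
    r = math.sqrt(pct / 100.0)
    print(f"{predictor}: R^2 = {pct}% -> correlation r ~ {r:.2f}")
```

Even the largest figure, 10 percent, corresponds to a correlation of only about 0.32, which is why the authors frame connectivity as one contributing predictor rather than a determinant of intelligence.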

Published in the Journal of Neuroscience, the findings establish “global brain connectivity” as a new approach for understanding human intelligence.

“Our research shows that connectivity with a particular part of the prefrontal cortex can predict how intelligent someone is,” suggests lead author Michael W. Cole, PhD, a postdoctoral research fellow in cognitive neuroscience at Washington University.

The study is the first to provide compelling evidence that neural connections between the lateral prefrontal cortex and the rest of the brain make a unique and powerful contribution to the cognitive processing underlying human intelligence, says Cole, whose research focuses on discovering the cognitive and neural mechanisms that make human behavior uniquely flexible and intelligent.

“This study suggests that part of what it means to be intelligent is having a lateral prefrontal cortex that does its job well; and part of what that means is that it can effectively communicate with the rest of the brain,” says study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences and of neuroscience and radiology in the School of Medicine. Braver is a co-director of the Cognitive Control and Psychopathology Lab at Washington University, in which the research was conducted.

One possible explanation of the findings, the research team suggests, is that the lateral prefrontal region is a “flexible hub” that uses its extensive brain-wide connectivity to monitor and influence other brain regions in a goal-directed manner.

“There is evidence that the lateral prefrontal cortex is the brain region that ‘remembers’ (maintains) the goals and instructions that help you keep doing what is needed when you’re working on a task,” Cole says. “So it makes sense that having this region communicating effectively with other regions (the ‘perceivers’ and ‘doers’ of the brain) would help you to accomplish tasks intelligently.”

While other regions of the brain make their own special contribution to cognitive processing, it is the lateral prefrontal cortex that helps coordinate these processes and maintain focus on the task at hand, in much the same way that the conductor of a symphony monitors and tweaks the real-time performance of an orchestra.

“We’re suggesting that the lateral prefrontal cortex functions like a feedback control system that is used often in engineering, that it helps implement cognitive control (which supports fluid intelligence), and that it doesn’t do this alone,” Cole says.

The findings are based on an analysis of functional magnetic resonance brain images captured as study participants rested passively and also when they were engaged in a series of mentally challenging tasks associated with fluid intelligence, such as indicating whether a currently displayed image was the same as one displayed three images ago.

Previous findings relating lateral prefrontal cortex activity to challenging task performance were supported. Connectivity was then assessed while participants rested, and their performance on additional tests of fluid intelligence and cognitive control collected outside the brain scanner was associated with the estimated connectivity.

Results indicate that levels of global brain connectivity with a part of the left lateral prefrontal cortex serve as a strong predictor of both fluid intelligence and cognitive control abilities.

Although much remains to be learned about how these neural connections contribute to fluid intelligence, new models of brain function suggested by this research could have important implications for the future understanding — and perhaps augmentation — of human intelligence.

The findings also may offer new avenues for understanding how breakdowns in global brain connectivity contribute to the profound cognitive control deficits seen in schizophrenia and other mental illnesses, Cole suggests.

Other co-authors include Tal Yarkoni, PhD, a postdoctoral fellow in the Department of Psychology and Neuroscience at the University of Colorado at Boulder; Grega Repovs, PhD, professor of psychology at the University of Ljubljana, Slovenia; and Alan Anticevic, an associate research scientist in psychiatry at Yale University School of Medicine.

Funding from the National Institute of Mental Health supported the study (National Institutes of Health grants MH66088, NR012081, MH66078, MH66078-06A1W1, and 1K99MH096801).

Modern culture emerged in Africa 20,000 years earlier than thought (L.A.Times)

By Thomas H. Maugh II

July 30, 2012, 1:54 p.m.

Border Cave artifacts: Objects found in the archaeological site called Border Cave include a) a wooden digging stick; b) a wooden poison applicator; c) a bone arrow point decorated with a spiral incision filled with red pigment; d) a bone object with four sets of notches; e) a lump of beeswax; and f) ostrich eggshell beads and marine shell beads used as personal ornaments. (Francesco d’Errico and Lucinda Backwell / July 30, 2012)
Modern culture emerged in southern Africa at least 44,000 years ago, more than 20,000 years earlier than anthropologists had previously believed, researchers reported Monday.

That blossoming of technology and art occurred at roughly the same time that modern humans were migrating from Africa to Europe, where they soon displaced Neanderthals. Many of the characteristics of the ancient culture identified by anthropologists are still present in hunter-gatherer cultures of Africa today, such as the San culture of southern Africa, the researchers said.

The new evidence was provided by an international team of researchers excavating at an archaeological site called Border Cave in the foothills of the Lebombo Mountains on the border of KwaZulu-Natal in South Africa and Swaziland. The cave shows evidence of occupation by human ancestors going back more than 200,000 years, but the team reported in two papers in the Proceedings of the National Academy of Sciences that they were able to accurately date their discoveries to 42,000 to 44,000 years ago, a period known as the Later Stone Age or the Upper Paleolithic Period in Europe.

Among the organic — and thus datable — artifacts the team found in the cave were ostrich eggshell beads, thin bone arrowhead points, wooden digging sticks, a gummy substance called pitch that was used to attach bone and stone blades to wooden shafts, a lump of beeswax likely used for the same purpose, worked pig tusks that were probably used for planing wood, and notched bones used for counting.

“They adorned themselves with ostrich egg and marine shell beads, and notched bones for notational purposes,” said paleoanthropologist Lucinda Backwell of the University of Witwatersrand in South Africa, a member of the team. “They fashioned fine bone points for use as awls and poisoned arrowheads. One point is decorated with a spiral groove filled with red ochre, which closely parallels similar marks that San make to identify their arrowheads when hunting.”

The very thin bone points are “very good evidence” for the use of bows and arrows, said co-author Paola Villa, a curator at the University of Colorado Museum of Natural History. Some of the bone points were apparently coated with ricinoleic acid, a poison made from the castor bean. “Such bone points could have penetrated thick hides, but the lack of ‘knock-down’ power means the use of poison probably was a requirement for successful kills,” she said.

The discovery also represents the first time pitch-making has been documented in South Africa, Villa said. The process requires burning peeled bark in the absence of air. The Stone Age residents probably dug holes in the ground, inserted the bark, lit it on fire, and covered the holes with stones, she said.

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.
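Berkeley Earth's actual statistical methods are far more sophisticated, but the underlying idea Muller describes, trying several candidate explanations and keeping the one that matches the record best, can be illustrated with a toy ordinary-least-squares comparison. Everything below is synthetic and purely illustrative; it is not the project's data or code:

```python
# Toy illustration of scoring candidate explanatory series against a
# temperature record: fit each by ordinary least squares and compare
# residual sums of squares (smaller = better fit).

def ols_rss(x, y):
    """Fit y = a + b*x by least squares; return the residual sum of squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

years = list(range(1900, 2020, 10))  # one synthetic reading per decade
# Synthetic CO2 series: a slow exponential rise, in ppm-like units.
co2 = [296 * 1.005 ** (y - 1900) for y in years]
# Synthetic temperature anomalies that track CO2, plus small noise.
noise = [0.03, -0.02, 0.01, -0.03, 0.02, 0.0,
         -0.01, 0.02, -0.02, 0.01, 0.0, -0.01]
temps = [0.01 * (c - 296) + e for c, e in zip(co2, noise)]

rss_co2 = ols_rss(co2, temps)     # CO2 concentration as the predictor
rss_time = ols_rss(years, temps)  # a plain linear time trend as the predictor

print(f"RSS vs CO2:  {rss_co2:.4f}")
print(f"RSS vs time: {rss_time:.4f}")
```

Because the synthetic anomalies were built to follow the convex CO2 curve, the CO2 fit leaves smaller residuals than a straight-line trend, which is the shape of the argument Muller makes: CO2 "raises the bar" that any alternative explanation must clear.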

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.
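The Guardian's Celsius standfirst and Muller's Fahrenheit figures describe the same result, and the conversion is worth making explicit: temperature *differences* convert with the 5/9 factor alone, with no 32-degree offset. A quick sketch:

```python
# Converting Muller's Fahrenheit warming figures to Celsius.
# Differences (not absolute temperatures) use only the 5/9 factor.
def delta_f_to_c(delta_f):
    return delta_f * 5.0 / 9.0

rise_250yr_f = 2.5  # land warming over the past 250 years, in deg F
rise_50yr_f = 1.5   # portion occurring in the most recent 50 years, in deg F

print(f"250-year rise: {delta_f_to_c(rise_250yr_f):.2f} C")
print(f" 50-year rise: {delta_f_to_c(rise_50yr_f):.2f} C")
```

The 250-year rise of 2.5°F works out to roughly 1.4°C, which the Guardian's headline rounds to 1.5C.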

Can scientists in California end the war on climate change?
Study finds no grounds for climate sceptics’ concerns
Video: Berkeley Earth tracks climate change
Are climate sceptics more likely to be conspiracy theorists?

The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis is now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Climate Change and the Next U.S. Revolution (ZNet)

Thursday, July 26, 2012

The U.S. heat wave is slowly shaking the foundations of American politics. It may take years for the deep rumble to evolve into an above ground, institution-shattering earthquake, but U.S. society has changed for good.

The heat wave has helped convince tens of millions of Americans that climate change is real, overpowering the corporate-funded pseudoscience and right-wing media campaign that had worked to convince them otherwise.

Republicans and Democrats alike have also erected roadblocks to understanding climate change. The politicians’ complete lack of action on the issue strengthened the “climate change is fake” movement, since Americans presumed that any sane government would be actively trying to address a problem with the potential to destroy civilization.

But working people have finally made up their minds. A recent poll showed that 70 percent of Americans now believe that climate change is real, up from 52 percent in 2010. And a growing number of people recognize that the warming of the planet is caused by human activity.

Business Week explains: “A record heat wave, drought and catastrophic wildfires are accomplishing what climate scientists could not: convincing a wide swath of Americans that global temperatures are rising.”

This means that working class families throughout the Midwest and southern states simply don’t believe what their media and politicians are telling them.

It also implies that these millions of Americans are being further politicized in a deeper sense.

Believing that climate change exists implies that you are somewhat aware of the massive consequences to humanity if the global economy doesn’t change drastically, and fast.

This awareness has revolutionary implications. As millions of Americans watch the environment being destroyed, for their grandchildren or for themselves, while politicians do nothing or make tiny token gestures, a growing number will demand political alternatives and fight to see them created. The American political system as it exists today cannot cope with that inevitability.

The New York Times explains why: “…the American political system is not ready to agree to a [climate] treaty that would force the United States, over time, to accept profound changes in its energy [coal, oil], transport [trucking and airline industry] and manufacturing [corporate] sectors.”

In short, the U.S. government will not force corporations to make less profit by behaving more eco-friendly. This is the essence of the problem.

In order for humanity to survive climate change, the economy must be radically transformed; massive investments must be made in renewable energy, public transportation, and recycling, while dirty energy sources must be quickly swept into the dustbin of history.

But the economy is currently owned by giant, privately run corporations that will continue destroying the earth as long as doing so earns them huge profits, and they make massive “contributions” to political parties to ensure this remains so. It’s becoming increasingly obvious that government inaction on climate change is directly linked to the “special interests” of the corporations that dominate these governments.

This fact of U.S. politics is present in every other capitalist country as well, which means that international agreements on reducing greenhouse gases will remain impossible: as each country’s corporations vie for market domination, reducing pollution simply puts them at a competitive disadvantage.

This dynamic has already caused massive delays in the UN’s inadequate efforts to address climate change. The Kyoto climate agreement was the product of years of cooperation and planning among many nations and included legally binding commitments to reduce greenhouse gases. The Bush and Obama administrations helped destroy these efforts.

For example, instead of building upon the foundation of the Kyoto Protocol, the Obama administration demanded a whole new structure, something that would take years to achieve. The Kyoto framework (itself insufficient) was abandoned because it included legally binding agreements and was based on multilateral, agreed-upon reductions of greenhouse gases.

In a Guardian article entitled “US Planning to Weaken Copenhagen Climate Deal,” the Obama administration’s UN position is exposed: it dismissed the Kyoto Protocol by proposing that “…each country set its own rules and to decide unilaterally how to meet its target.”
Obama’s proposal came straight from the mouth of U.S. corporations, who wanted to ensure that there was zero accountability, zero oversight, zero climate progress, and therefore no dent to their profits. Instead of using its massive international leverage for climate justice, the U.S. has used it to promote divisiveness and inaction, to the potential detriment of billions of people globally.

The stakes are too high to hold out any hope that governments will act boldly. The Business Week article below explains the profound changes happening to the climate:

“The average temperature for the U.S. during June was 71.2 degrees Fahrenheit (21.7 Celsius), which is 2 degrees higher than the average for the 20th century, according to the National Oceanic and Atmospheric Administration. The June temperatures made the preceding 12 months the warmest since record-keeping began in 1895, the government agency said.”

Activists who are radicalized by this global problem face a crisis of what to do about it. It is difficult to put forth a positive climate change demand, since the problem is global. Demanding that governments “act boldly” to address climate change hasn’t worked, and lesser demands seem inadequate.

The environmental rights movement continues to go through a variety of phases: individual and small group eco-“terrorism,” causing property damage to environmentally damaging companies; corporate campaigns that target especially bad polluters with high-profile direct action; and massive education programs that have been highly successful, but fall short when it comes to winning change.

Ultimately, climate activists must come face to face with political and corporate power. Corporate-owned governments are the ones with the power to adequately address the climate change issue, and they will not be swayed by good science, common sense, basic decency, or even a torched planet.

Those in power only respond to power, and the only force capable of displacing corporate power is people uniting and acting collectively, as was done in Egypt and Tunisia and as is still developing throughout Europe.

Climate groups cannot view their issue as separate from other groups that are organizing against corporate power. The social movements that have emerged to battle austerity measures are natural allies, as are anti-war and labor activists. The climate solution will inevitably require revolutionary measures, which first requires putting forward alliances and demands that unite labor, working people in general, community groups, and students in collective action.

One possible immediate demand is for environmental activists to unite with labor groups around a federal jobs program, paid for by taxing the rich, that makes massive investments in climate-related jobs: solar panel production, public transportation, recycling centers, home retrofitting, and so on.

Another demand could be to insist that the government convene the most knowledgeable scientists in the area of clean energy. These scientists should be given all the resources they need in order to collectively create alternative sources of clean energy that would allow for a realistic alternative to the current polluting and toxic sources of energy.

However, any type of immediate demand will meet giant corporate resistance from both political parties. Fighting for a unifying demand will thus strengthen the movement, and for this reason it is important to link climate solutions to the creation of jobs, the number one concern of most Americans. This unity will in turn lead allies toward a deeper understanding of the problem, and deeper solutions will then emerge that challenge an economic structure deaf to the needs of humans and the climate, one that sacrifices everything for the private profit of a few.

Shamus Cooke is a social service worker, trade unionist, and writer for Workers Action (www.workerscompass.org). He can be reached at shamuscooke@gmail.com

http://www.businessweek.com/news/2012-07-18/record-heat-wave-pushes-u-dot-s-dot-belief-in-climate-change-to-70-percent

http://www.nytimes.com/2009/12/13/weekinreview/13broder.html

http://www.guardian.co.uk/environment/2009/sep/15/europe-us-copenhagen

Computers Can Predict Effects of HIV Policies, Study Suggests (Science Daily)

ScienceDaily (July 27, 2012) — Policymakers in the fight against HIV/AIDS may have to wait years, even decades, to know whether strategic choices among possible interventions are effective. How can they make informed choices in an age of limited funding? A reliable, well-calibrated, predictive computer simulation would be a great help.

A visualization generated by an agent-based model of New York City’s HIV epidemic shows the risky interactions of unprotected sex or needle sharing among injection drug users (red), non-injection drug users (blue) and non-users (green). (Credit: Brandon Marshall/Brown University)

Policymakers struggling to stop the spread of HIV grapple with “what if” questions on the scale of millions of people and decades of time. They need a way to predict the impact of many potential interventions, alone or in combination. In two papers to be presented at the 2012 International AIDS Society Conference in Washington, D.C., Brandon Marshall, assistant professor of epidemiology at Brown University, will unveil a computer program calibrated to model accurately the spread of HIV in New York City over a decade and to make specific predictions about the future of the epidemic under various intervention scenarios.

“It reflects what’s seen in the real world,” said Marshall. “What we’re trying to do is identify the ideal combination of interventions to reduce HIV most dramatically in injection drug users.”

In an analysis that he’ll present on July 27, Marshall projects that with no change in New York City’s current programs, the infection rate among injection drug users will be 2.1 per 1,000 in 2040. Expanding HIV testing would drop the rate only 12 percent to 1.9 per 1,000; increasing drug treatment would reduce the rate 26 percent to 1.6 per 1,000; providing earlier delivery of antiretroviral therapy and better adherence would drop the rate 45 percent to 1.2 per 1,000; and expanding needle exchange programs would reduce the rate 34 percent to 1.4 per 1,000. Most importantly, doing all four of those things would cut the rate by more than 60 percent, to 0.8 per 1,000.

Virtual reality, real choices

The model is unique in that it creates a virtual reality of 150,000 “agents,” a programming term for simulated individuals who, in the case of this model, engage in drug use and sexual activity like real people.

Like characters in an all-too-serious video game, the agents behave in a world governed by biological rules, such as how often the virus can be transmitted through encounters such as unprotected gay sex or needle sharing.

With each run of the model, agents accumulate a detailed life history. For example, in one run, agent 89,425, who is male and has sex with men, could end up injecting drugs. He participates in needle exchanges, but according to the built-in probabilities, in year three he shares needles multiple times with another injection drug user with whom he is also having unprotected sex. In the last of those encounters, agent 89,425 becomes infected with HIV. In year four he starts participating in drug treatment and in year five he gets tested for HIV, starts antiretroviral treatment, and reduces the frequency with which he has unprotected sex. Because he always takes his HIV medications, he never transmits the virus further.
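The agent mechanics described above can be sketched in a few lines. This is a toy illustration, not Marshall's model: the population size, sharing probability, per-act transmission risks, and treatment effect below are all invented for demonstration, whereas the real model uses 150,000 agents calibrated to New York City data.

```python
import random

# Toy agent-based sketch of HIV transmission through needle sharing.
# All parameters are invented for illustration only.
random.seed(42)

class Agent:
    def __init__(self, agent_id, injects_drugs):
        self.agent_id = agent_id
        self.injects_drugs = injects_drugs
        self.infected = False
        self.on_treatment = False

    def transmission_risk(self):
        # Effective antiretroviral treatment sharply reduces onward transmission.
        return 0.004 if self.on_treatment else 0.02

def simulate_year(agents, share_prob=0.1):
    """One simulated year: injection drug users may share needles."""
    idus = [a for a in agents if a.injects_drugs]
    new_infections = 0
    for a in idus:
        partner = random.choice(idus)
        if partner is a or random.random() > share_prob:
            continue
        # Transmission can occur only if exactly one partner is infected.
        if a.infected != partner.infected:
            source = a if a.infected else partner
            target = partner if a.infected else a
            if random.random() < source.transmission_risk():
                target.infected = True
                new_infections += 1
    return new_infections

# Seed a small population: 1,000 agents, 20% injection drug users, 5% infected.
agents = [Agent(i, injects_drugs=(i % 5 == 0)) for i in range(1000)]
for a in agents[:50]:
    a.infected = True

total = sum(simulate_year(agents) for _ in range(10))
print(f"new infections over 10 simulated years: {total}")
```

Because each run draws fresh random encounters, each run produces a different life history for each agent, which is exactly why the full model accumulates the detailed individual trajectories described above.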

That level of individual detail allows for a detailed examination of transmission networks and how interventions affect them.

“With this model you can really look at the microconnections between people,” said Marshall, who began working on the model as a postdoctoral fellow at Columbia University and has continued to develop it since coming to Brown in January. “That’s something that we’re really excited about.”

To calibrate the model, Marshall and his colleagues found the best New York City data they could about how many people use drugs, what percentage of people were gay or lesbian, the probabilities of engaging in unprotected sex and needle sharing, viral transmission, access to treatment, treatment effectiveness, participation in drug treatment, progression from HIV infection to AIDS, and many more behavioral, social and medical factors. They also continuously calibrated it until the model could faithfully reproduce the infection rates among injection drug users that were known to occur in New York between 1992 and 2002.

And they don’t just run the simulation once. They run it thousands of times on a supercomputer at Brown to be sure the results they see are reliable.
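A single run of a stochastic model yields only one possible trajectory; reliability comes from aggregating many replications and summarizing their distribution. The sketch below illustrates that replication step generically; the `one_run` function is a stand-in for a full model run, and its Gaussian outcome is invented.

```python
import random
import statistics

# Generic sketch of the replication step: run a stochastic simulation many
# times and summarize the distribution of outcomes, rather than trusting a
# single run. The "simulation" here is a placeholder, not the real model.

def one_run(rng):
    # Stand-in for a full model run; returns a simulated infection rate.
    return max(0.0, rng.gauss(2.1, 0.3))

rng = random.Random(0)
outcomes = [one_run(rng) for _ in range(5000)]

mean = statistics.mean(outcomes)
stdev = statistics.stdev(outcomes)
print(f"mean rate: {mean:.2f} per 1,000 (sd {stdev:.2f}) over {len(outcomes)} runs")
```

The spread across replications is what tells the researchers whether an apparent difference between intervention scenarios is a real effect or simulation noise.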

Future applications

At Brown, Marshall is continuing to work on other aspects of the model, including an analysis of the cost effectiveness of each intervention and their combinations. Cost is, after all, another fact of life that policymakers and public health officials must weigh.

And then there’s the frustrating insight that the infection rate, even with four strengthened interventions underway, didn’t reduce the projected epidemic by much more than half.

“I actually expected something larger,” Marshall said. “That speaks to how hard we have to work to make sure that drug users can access and benefit from proven interventions to reduce the spread of HIV.”

Marshall’s collaborators on the model include Magdalena Paczkowski, Lars Seemann, Barbara Tempalski, Enrique Pouget, Sandro Galea, and Samuel Friedman.

The National Institutes of Health and the Lifespan/Tufts/Brown Center for AIDS Research provide financial support for the model’s continued development.

Climate Change Could Open Trade Opportunities for Some Vulnerable Nations (Science Daily)

ScienceDaily (July 26, 2012) — Tanzania is one developing country that could actually benefit from climate change by increasing exports of corn to the U.S. and other nations, according to a study by researchers at Stanford University, the World Bank and Purdue University.

The study, published in the Review of Development Economics, shows that the African country better known for safaris and Mt. Kilimanjaro has the potential to substantially increase its maize exports and take advantage of higher commodity prices with a variety of trading partners, due to predicted dry and hot weather that could affect those countries’ usual sources for the crop. In years that major consumer countries such as the U.S., China and India are forecast to experience severe dry conditions, Tanzania’s weather will likely be comparatively wet. Similarly, in the relatively few years this century that it is expected to have severe dry weather, Tanzania could import corn from trading partners experiencing better growing conditions.

“This study highlights how government policies can influence the impact that we experience from the climate system,” said study co-author Noah Diffenbaugh, an assistant professor of environmental Earth system science at Stanford’s School of Earth Sciences and a center fellow at the Stanford Woods Institute for the Environment. “Tanzania is a particularly interesting case, as it has the potential to benefit from climate change if climate model predictions of decreasing drought in East Africa prove to be correct, and if trade policies are constructed to take advantage of those new opportunities.”

Tightening restrictions on crop exports during times of climate instability may seem like a logical way to ensure domestic food availability and price stability. In fact, the study warns, trade restrictions such as those Tanzania has instituted several times in recent years prevent a country like Tanzania from buffering its poor citizens in bad climate years and from taking advantage of economic opportunities in good climate years.

The study, the most long-range and detailed of its kind to date, uses economic, climatic and agricultural data and computational models to forecast the occurrence of severe dry years during the next nine decades in Tanzania and its key trading partners. The authors began by analyzing historical years in which Tanzania experienced grain surpluses or deficits. They found that a closed trade policy deepened poverty in both kinds of years, by limiting the ability to offset shortfalls with imports during deficit years and the ability to profit from exports during surplus years.

The authors then attempted to predict how often Tanzania and key trading partners will experience severely dry years in response to continued global warming. Among the predictions: during an average of 96 percent of the years that the U.S. and China are predicted to have extremely dry conditions, Tanzania will not experience similarly dry weather. For India, that percentage increases to 97 percent. Similarly, the study’s climate models suggest that Tanzania is likely to have adequate growing season moisture in most of the years that its key African trading partners experience severe dry weather.
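The co-occurrence percentages above boil down to a simple statistic: for each trading partner, the share of its severely dry years in which Tanzania is not also dry. The sketch below computes that statistic; the dry/wet year series and drought probabilities are randomly generated stand-ins, not the study's climate-model output.

```python
import random

# Sketch of the co-occurrence statistic behind the study's percentages: for
# each trading partner, the share of its severely dry years in which Tanzania
# is NOT also dry. Series and probabilities here are made up for illustration.
random.seed(1)
YEARS = 90  # roughly the "next nine decades" horizon

def dry_series(p_dry):
    """Boolean series: True in years with severe dry conditions."""
    return [random.random() < p_dry for _ in range(YEARS)]

tanzania = dry_series(0.10)
partners = {
    "US": dry_series(0.15),
    "China": dry_series(0.15),
    "India": dry_series(0.12),
}

results = {}
for name, series in partners.items():
    partner_dry = [y for y in range(YEARS) if series[y]]
    if not partner_dry:
        continue  # no dry years simulated for this partner
    non_coinciding = sum(1 for y in partner_dry if not tanzania[y])
    results[name] = 100 * non_coinciding / len(partner_dry)
    print(f"{name}: Tanzania not dry in {results[name]:.0f}% of its dry years")
```

A high value of this statistic for a partner means Tanzania could usually export to that country in its drought years, which is the hedging logic the study describes.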

Among Tanzania’s trading partners, the U.S., China, Canada and Russia are most likely to consistently experience adequate growing conditions in years when Tanzania does not. When compared with all of its key trading partners, Tanzania’s dry years during the 21st century will often coincide with non-dry years in the other countries. Having a diverse mix of trading partners could help hedge against a coincidence of severe dry weather within and outside of Africa, the study’s results suggest.

The findings are relevant to grain-growing countries around the world. Those countries stand to profit from exports in years when trading partners are enduring severe dry and/or hot weather. Likewise, they can buffer themselves against bad growing weather at home by importing from grain-rich regions less affected by such weather in that particular year.

“This study highlights the importance of trade in either buffering or exacerbating the effects of climate stresses on the poor,” says Diffenbaugh. “We find that these effects are already taking place in the current climate, and that they could become even more important in the future as the co-occurrence of good and bad years between different regions changes in response to global warming.”

Science and culture: what do they have in common? (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The question was the theme of the round table “Divulgação da Ciência e da Cultura” (Popularization of Science and Culture), held at the 64th Annual Meeting of the Brazilian Society for the Progress of Science (SBPC), which ends today (July 27) in São Luís.

For Ildeu de Castro Moreira, director of Popularization and Diffusion of Science and Technology at the Ministry of Science, Technology and Innovation (MCTI) and an SBPC council member, the debate on the relationship between science and art is very important because they are two fundamental facets of human culture. “Science, art and culture have in common the creativity inherent to human beings,” he said, explaining that art and science are human and social activities grounded in creativity and curiosity.

A physicist and science popularizer, Ildeu spoke about the “scientific imagination present in the minds of artists,” and explained that science also has aesthetic concerns and bears similarities to art. For him, there is beauty in scientific theories. “Mathematical equations and physical formulas are beautiful. They may seem boring in the classroom, but with the help of an artist’s eye it is possible to reveal that beauty. We need to learn to see the beauty of science, just as we have to learn to see many things in contemporary art,” he said.

For Ildeu, the connections between science and art are important for making science communication reach the public more easily. In his presentation, he showed artistic expressions that speak of science, giving examples from poetry, music, samba-school themes, popular sayings and cordel literature.

Children – In her presentation at the round table, Luisa Medeiros Massarani, a journalist and head of Fiocruz’s Museu da Vida in Rio de Janeiro, spoke about science-communication initiatives aimed at children. “Experience has shown great receptivity among children, greater than among adults and adolescents, mainly because of children’s curiosity; they are regarded as ‘natural scientists’,” she explained.

Luisa spoke about the growth of science museums in Brazil, of which there are currently about 200, although they are still concentrated in a few regions. “Museums have an incredible appeal for children and are also important for the communicator, who sees the child’s reaction on the spot,” she said. Although much of the museums’ audience is made up of children, Luisa said it is necessary to design spaces specifically for them, from smaller furniture to suitable interactive activities.

She argues that children should be treated as important social actors in the science-communication process. “Talking about science communication for children is not talking about science unilaterally; the child must be an important actor and protagonist in the process,” she explained, adding that the experience of a science fair or a museum visit stays in a child’s memory and can influence his or her education, as well as provoking and awakening an interest in science.

The head of the Museu da Vida cited exhibitions, books and publications aimed at children, and stressed the importance of evaluating these experiences with the children afterward, to know which way to go.

Ildeu took the opportunity to suggest that artists participate more actively in SBPC meetings, not only in parallel events such as the SBPC Cultural, but as members of panels and debates with scientists. The idea is to take advantage of the meeting’s audience, which reaches 15,000 to 20,000 people, to discuss this relationship.

(Jornal da Ciência)

An anthropological and sociological reading of the future of the Amazon (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The weakening of multilateral international cooperation agencies is beginning to threaten conservation policies for the Legal Amazon. The statement comes from the president of the Nova Cartografia Social program, Alfredo Wagner de Almeida, who gave a lecture yesterday (July 26) at the 64th Annual Meeting of the Brazilian Society for the Progress of Science (SBPC), held at the Federal University of Maranhão (UFMA) in São Luís.

Under the theme “Traditional peoples and communities affected by military projects,” the anthropologist warned of moves by seven states seeking to shrink the Legal Amazon, with bills now before the legislature. Among them are Mato Grosso, which intends to withdraw its area from the Legal Amazon, and likewise Rondônia, which wants to remove that designation from its lands in the region. Other states, such as Maranhão and Tocantins, want to strip the designation from all of their areas considered Legal Amazon.

The region covers an area of approximately 5,217,423 km², equivalent to about 61 percent of Brazilian territory. It was established to define the geographic boundaries of the political region eligible for tax incentives to promote regional development.

“This is a first attempt to reduce the Legal Amazon, since these states no longer enjoy the benefits granted by the multilateral international agencies,” said Almeida, who is also an SBPC council member and a professor at the Amazonas State University (UEA).

According to the researcher, international organizations had until then been sources of funding for Amazon protection programs, such as the Integrated Project for the Protection of Indigenous Populations and Lands of the Legal Amazon (PPTAL), intended for the demarcation of indigenous lands and funded mainly by the German government, and the PPG7 (Pilot Program for the Protection of Brazil’s Tropical Forests). It was these policies that strengthened the creation of the Ministry of the Environment. “Without the support of the multilateral agencies, policies for the Amazon have shrunk,” he said, without citing figures.

According to the anthropologist, the decision of the states that want to leave the Legal Amazon means, for them, freeing up more land they consider productive, to the detriment of forest conservation.

The anthropologist’s statements are based on the dossier “Amazônia: sociedade, fronteiras e políticas” (Amazon: society, frontiers and policies), produced by Edna Maria Ramos de Castro, a sociologist at the Center for Advanced Amazonian Studies of the Federal University of Pará (UFPA) and an SBPC director, who moderated the lecture. The full document was recently published in Bahia’s Caderno CRH.

Indigenous lands – In the assessment of the dossier’s author, these states’ legal measures threaten indigenous lands, whose peoples are protagonists in biodiversity conservation and depend on nature to survive. “They are legal instruments, clearly set out in the Constitution, but this practice can lead society to an impasse,” she said. Edna cited the controversial Belo Monte hydroelectric project, which has become an icon of a process of resistance by Brazilian society.

Paradigm shift – The anthropologist offered a reading of Brazil’s current political-administrative model. He sees a shift from a policy “of protection” to an “idea of protectionism.” “The distinction between protection and protectionism reveals, first of all, the weakening of the multilateral international agencies,” he said. According to him, protectionism arises outside the scope of protection.

In Alfredo Wagner’s view, the signs of change mainly reflect the disagreements at the World Trade Organization (WTO) meeting in Geneva in December 2011. On that occasion there were signs of a rupture in international agreements, until then framed as a common market. One example is the shelving of the so-called Doha Round, owing to disagreement among the parties over agricultural subsidies granted by developed countries.

Expansion of the military sphere and infrastructure – The anthropologist recalled that at the height of the multilateral organizations, the security sphere, that is, the military, was not funded because it was not part of a single-market policy. He observes, however, a change starting in 2009, when the model shifted and problems with the military began to appear, as projects for militarized borders were revived. “From then on, a chapter of conflicts began.”

Withdrawal from international funds and regulatory bodies – According to him, what stands out most in the “idea of protectionism” is the identification of strategic natural resources, such as agricultural commodities and minerals, which, under the argument of sustainable development, can be used to drive large infrastructure projects.

“Everything comes to be interpreted as national interests. The idea of a bloc loses force, which may explain the tensions within Mercosur itself, when Venezuela is brought into the bloc at moments of crisis. These national interests come to be articulated in a disciplined way without going through the multilateral entities,” the anthropologist argues.

According to him, the Brazilian state’s current actions bypass multilateral entities. One reflection of this is the distancing from the International Monetary Fund (IMF) and from two foreign legal frameworks. One is the international human rights system of the OAS (Organization of American States). He recalled that Brazil stopped investing in that court from the moment the Belo Monte dam was condemned by the body. “Brazil takes a unilateral position, similar to that of the Americans in the Gulf War,” the anthropologist observed. “The idea of protectionism comes through quite strongly.”

Alfredo Wagner also sees signs of withdrawal from Convention 169, which requires prior consultation with communities harmed by large infrastructure projects, for example. According to him, Brazil has been condemned for six violations involving military projects. One concerns the construction of the Alcântara Launch Center (CLA) on quilombola community lands in Maranhão, without environmental licensing and without consulting the “affected” communities.

He also warned of four worrying measures under way that provide for the emergency construction of hydroelectric dams. One example is Provisional Measure 558 of January 18, 2012, which provides for the reduction of protected areas and forest conservation units under the argument of development. According to him, Ibama approved in just five days a draft of Eletronorte’s terms of reference for the construction of a hydroelectric dam at São Luiz do Tapajós. In practice, the work plan submitted to assess the project was approved. “With the emergency pace of these projects, it seems that rights are put on hold.”

Unconstitutionality challenges – That provisional measure was challenged by the Attorney General’s Office through an ADIN (direct action of unconstitutionality). The Federal Public Prosecutor’s Office held that conservation units in hydroelectric areas are essential to minimize the projects’ environmental impacts, and argued that any discussion on reducing these forest areas must take place in the National Congress rather than by provisional measure. “Brazil today lives under the empire of provisional measures, which prevent broad discussion by society. This gives an idea of authoritarian capitalism,” the anthropologist said.

Privatization of land in the Amazon – He also warned of the privatization of public lands in the Amazon under the “euphemism” of land-tenure regularization, via the Terra Legal program, under Law 11,952 of July 2009. Submitted by the Presidency of the Republic, the measure provides for privatizing 70 million hectares of public land, a considerable share of the 850 million hectares that make up Brazil, according to the anthropologist. Alfredo Wagner warned that the measure speeds up land titling for large estates, to the detriment of smallholders.

Initially, the measure was challenged by the Public Prosecutor’s Office through an ADIN, on the grounds that it establishes “unjustifiable privileges” in favor of land-grabbers who benefited from public lands in the past, concentrating land ownership. “This measure is as cruel as the Sarney Land Law of 1969,” the anthropologist said.

Judicialization of the state – Seeking to calm the packed audience of students, researchers, scientists and others, estimated at about 140 people, who feared a return of the military dictatorship, the anthropologist said of the current model: “It is not the same as the military dictatorship,” attributing it instead to a “judicialization of the state” and to “something strange.”

On that occasion, the anthropologist used a phrase from sociologists to explain a crisis: “The old has not yet died and the new has not yet been born. But a transformation is under way.”

(Viviane Monteiro – Jornal da Ciência)