Tag archive: science communication

The Human Cause of Climate Change: Where Does the Burden of Proof Lie? (Science Daily)

ScienceDaily (Nov. 3, 2011) — The debate may largely be drawn along political lines, but the human role in climate change remains one of the most controversial questions in 21st century science. Writing in WIREs Climate Change, Dr Kevin Trenberth of the National Center for Atmospheric Research argues that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research that seeks to disprove the human role.

Polar bear on melting ice. (Credit: iStockphoto/Kristian Septimius Krogh)

In response to Trenberth’s argument, a second review, by Dr Judith Curry, focuses on the concept of a ‘null hypothesis’ – the default position taken when research is carried out. Currently the null hypothesis for climate change attribution research is that humans have no influence.

“Humans are changing our climate. There is no doubt whatsoever,” said Trenberth. “Questions remain as to the extent of our collective contribution, but it is clear that the effects are not small and have emerged from the noise of natural variability. So why does the science community continue to do attribution studies and assume that humans have no influence as a null hypothesis?”
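The ‘null hypothesis’ framing Trenberth objects to can be made concrete with a small sketch. All numbers below are purely illustrative (the article gives none): H0 is taken to be “humans have no influence”, i.e. the observed warming trend is consistent with natural variability alone, and the test asks how often that noise by itself would produce a trend as large as the one observed.

```python
# Illustrative sketch of a null-hypothesis test in climate attribution.
# All numbers are hypothetical, chosen only to show the mechanics.
import random

random.seed(42)

observed_trend = 0.17  # hypothetical observed warming trend (deg C / decade)
natural_sd = 0.05      # hypothetical spread of trends from natural variability alone

# Distribution of decadal trends under H0 ("humans have no influence"),
# e.g. as might be estimated from unforced climate-model control runs.
null_trends = [random.gauss(0.0, natural_sd) for _ in range(100_000)]

# Two-sided p-value: how often natural variability alone produces a trend
# at least as large in magnitude as the one observed.
p_value = sum(abs(t) >= abs(observed_trend) for t in null_trends) / len(null_trends)

print(f"p-value under 'no human influence': {p_value:.5f}")
```

A p-value this small is what Trenberth means by a signal that has “emerged from the noise of natural variability”: under the conventional framing H0 is rejected, and his argument is that the default hypothesis itself should be reversed.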

To show precedent for his position Trenberth cites the 2007 report by the Intergovernmental Panel on Climate Change which states that global warming is “unequivocal,” and is “very likely” due to human activities.

Trenberth also focused on climate attribution studies that claim to find no human component, suggesting that their assumptions distort results in the direction of finding no human influence, producing misleading statements about the causes of climate change that can grossly underestimate the role of humans in climate events.

“Scientists must challenge misconceptions in the difference between weather and climate while attribution studies must include a human component,” concluded Trenberth. “The question should no longer be is there a human component, but what is it?”

In a second paper Dr Judith Curry, from the Georgia Institute of Technology, questions this position, but argues that the discussion on the null hypothesis serves to highlight fuzziness surrounding the many hypotheses related to dangerous climate change.

“Regarding attribution studies, rather than trying to reject either hypothesis regardless of which is the null, there should be a debate over the significance of anthropogenic warming relative to forced and unforced natural climate variability,” said Curry.

Curry also suggested that the desire to reverse the null hypothesis may be aimed at marginalising the climate sceptic movement, a vocal group that has challenged the scientific orthodoxy on climate change.

“The proponents of reversing the null hypothesis should be careful of what they wish for,” concluded Curry. “One consequence may be that the scientific focus, and therefore funding, would also reverse to attempting to disprove dangerous anthropogenic climate change, which has been a position of many sceptics.”

“I doubt Trenberth’s suggestion will find much support in the scientific community,” said Professor Myles Allen from Oxford University, “but Curry’s counter proposal to abandon hypothesis tests is worse. We still have plenty of interesting hypotheses to test: did human influence on climate increase the risk of this event at all? Did it increase it by more than a factor of two?”
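Allen’s two questions are the standard framing of probabilistic event attribution. As a hedged sketch with hypothetical probabilities (the article supplies no numbers), the “factor of two” test compares the event’s probability with and without human influence:

```python
# Hypothetical event probabilities, for illustration only.
p_natural = 0.01   # chance of the extreme event in a climate without human influence
p_actual = 0.025   # chance of the same event in the observed climate

# How much more likely did the event become? Allen's "factor of two"
# question is whether this ratio exceeds 2.
risk_ratio = p_actual / p_natural

# Fraction of attributable risk: the share of the event's probability
# attributable to human influence (a standard attribution quantity,
# not named in the article).
far = 1.0 - p_natural / p_actual

print(f"risk ratio = {risk_ratio:.2f}")              # 2.50
print(f"fraction of attributable risk = {far:.2f}")  # 0.60
```

With these invented numbers, human influence increased the risk at all (risk ratio above 1) and by more than a factor of two (above 2); real attribution studies estimate the two probabilities from climate-model ensembles.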

The world at seven billion (BBC)

27 October 2011 Last updated at 23:08 GMT

File photograph of newborn babies in Lucknow, India, in July 2009

As the world population reaches seven billion people, the BBC’s Mike Gallagher asks whether efforts to control population have been, as some critics claim, a form of authoritarian control over the world’s poorest citizens.

The temperature is some 30C, the humidity stifling, the noise unbearable. In a yard between two enormous tea-drying sheds, a number of dark-skinned women sit patiently, each accompanied by an unwieldy-looking cloth sack. They are clad in colourful saris, but look tired and shabby. This is hardly surprising – they have spent most of the day in nearby plantation fields, picking tea that will net them around two cents a kilo – barely enough to feed their large families.

Vivek Baid thinks he knows how to help them. He runs the Mission for Population Control, a project in eastern India which aims to bring down high birth rates by encouraging local women to get sterilised after their second child.

As the world reaches an estimated seven billion people, people like Vivek say efforts to bring down the world’s population must continue if life on Earth is to be sustainable, and if poverty and even mass starvation are to be avoided.

There is no doubting their good intentions. Vivek, for instance, has spent his own money on the project, and is passionate about creating a brighter future for India.

But critics allege that campaigners like Vivek – a successful and wealthy male businessman – have tended to live very different lives from those they seek to help, who are mainly poor women.

These critics argue that rich people have imposed population control on the poor for decades. And, they say, such coercive attempts to control the world’s population often backfired and were sometimes harmful.

Population scare

Most historians of modern population control trace its roots back to the Reverend Thomas Malthus, an English clergyman born in the 18th Century who believed that humans would always reproduce faster than Earth’s capacity to feed them.

Giving succour to the resulting desperate masses would only imperil everyone else, he said. So the brutal reality was that it was better to let them starve.

‘Plenty is changed into scarcity’

Thomas Malthus

From Thomas Malthus’ Essay on Population, 1803 edition:

A man who is born into a world already possessed – if he cannot get subsistence from his parents on whom he has a just demand, and if the society do not want his labour, has no claim of right to the smallest portion of food.

At nature’s mighty feast there is no vacant cover for him. She tells him to be gone, and will quickly execute her own orders, if he does not work upon the compassion of some of her guests. If these guests get up and make room for him, other intruders immediately appear demanding the same favour. The plenty that before reigned is changed into scarcity; and the happiness of the guests is destroyed by the spectacle of misery and dependence in every part of the hall.

Rapid agricultural advances in the 19th Century proved his main premise wrong, because food production generally more than kept pace with the growing population.

But the idea that the rich are threatened by the desperately poor has cast a long shadow into the 20th Century.

From the 1960s, the World Bank, the UN and a host of independent American philanthropic foundations, such as the Ford and Rockefeller foundations, began to focus on what they saw as the problem of burgeoning Third World numbers.

They believed that overpopulation was the primary cause of environmental degradation, economic underdevelopment and political instability.

Massive populations in the Third World were seen as presenting a threat to Western capitalism and access to resources, says Professor Betsy Hartmann of Hampshire College, Massachusetts, in the US.

“The view of the south is very much put in this Malthusian framework. It becomes just this powerful ideology,” she says.

In 1966, President Lyndon Johnson warned that the US might be overwhelmed by desperate masses, and he made US foreign aid dependent on countries adopting family planning programmes.

Other wealthy countries such as Japan, Sweden and the UK also began to devote large amounts of money to reducing Third World birth rates.

‘Unmet need’

What virtually everyone agreed was that there was a massive demand for birth control among the world’s poorest people, and that if they could get their hands on reliable contraceptives, runaway population growth might be stopped.

But with the benefit of hindsight, some argue that this so-called unmet need theory put disproportionate emphasis on birth control and ignored other serious needs.

Graph of world population figures

“It was a top-down solution,” says Mohan Rao, a doctor and public health expert at Delhi’s Jawaharlal Nehru University.

“There was an unmet need for contraceptive services, of course. But there was also an unmet need for health services and all kinds of other services which did not get attention. The focus became contraception.”

Had the demographic experts worked at the grass-roots instead of imposing solutions from above, suggests Adrienne Germain, formerly of the Ford Foundation and then the International Women’s Health Coalition, they might have achieved a better picture of the dilemmas facing women in poor, rural communities.

“Not to have a full set of health services meant women were either unable to use family planning, or unwilling to – because they could still expect half their kids to die by the age of five,” she says.

India’s sterilisation ‘madness’

File photograph of Sanjay and Indira Gandhi in 1980

Indira Gandhi and her son Sanjay (above) presided over a mass sterilisation campaign. From the mid-1970s, Indian officials were set sterilisation quotas, and sought to ingratiate themselves with superiors by exceeding them. Stories abounded of men being accosted in the street and taken away for the operation. The head of the World Bank, Robert McNamara, congratulated the Indian government on “moving effectively” to deal with high birth rates. Funding was increased, and the sterilising went on.

In Delhi, some 700,000 slum dwellers were forcibly evicted, and given replacement housing plots far from the city centre, frequently on condition that they were either sterilised or produced someone else for the operation. In poorer agricultural areas, whole villages were rounded up for sterilisation. When residents of one village protested, an official is said to have threatened air strikes in retaliation.

“There was a certain madness,” recalls Nina Puri of the Family Planning Association of India. “All rationality was lost.”

Us and them

In 1968, the American biologist Paul Ehrlich caused a stir with his bestselling book, The Population Bomb, which suggested that it was already too late to save some countries from the dire effects of overpopulation, which would result in ecological disaster and the deaths of hundreds of millions of people in the 1970s.

Instead, governments should concentrate on drastically reducing population growth. He said financial assistance should be given only to those nations with a realistic chance of bringing birth rates down. Compulsory measures were not to be ruled out.

Western experts and local elites in the developing world soon imposed targets for reductions in family size, and used military analogies to drive home the urgency, says Matthew Connelly, a historian of population control at Columbia University in New York.

“They spoke of a war on population growth, fought with contraceptive weapons,” he says. “The war would entail sacrifices, and collateral damage.”

Such language betrayed a lack of empathy with their subjects, says Ms Germain: “People didn’t talk about people. They talked of acceptors and users of family planning.”

Emergency measures

Critics of population control had their say at the first ever UN population conference in 1974.

Karan Singh, India’s health minister at the time, declared that “development is the best contraceptive”.

But just a year later, Mr Singh’s government presided over one of the most notorious episodes in the history of population control.

In June 1975, the Indian premier, Indira Gandhi, declared a state of emergency after accusations of corruption threatened her government. Her son Sanjay used the measure to introduce radical population control measures targeted at the poor.

The Indian emergency lasted less than two years, but in 1975 alone, some eight million Indians – mainly poor men – were sterilised.

Yet, for all the official programmes and coercion, many poor women kept on having babies.

And where they did not, it arguably had less to do with coercive population control than with development, just as Karan Singh had argued in 1974, says historian Matt Connelly.

For example, in India, a disparity in birth rates could already be observed between the impoverished northern states and more developed southern regions like Kerala, where women were more likely to be literate and educated, and their offspring more likely to be healthy.

Women there realised that they could have fewer births and still expect to see their children survive into adulthood.

China: ‘We will not allow your baby to live’

Steven Mosher was a Stanford University anthropologist working in rural China who witnessed some of the early, disturbing moments of Beijing’s One Child Policy.

“I remember very well the evening of 8 March, 1980. The local Communist Party official in charge of my village came over waving a government document. He said: ‘The Party has decided to impose a cap of 1% on population growth this year.’ He said: ‘We’re going to decide who’s going to be allowed to continue their pregnancy and who’s going to be forced to terminate their pregnancy.’ And that’s exactly what they did.”

“These were women in the late second and third trimester of pregnancy. There were several women just days away from giving birth. And in my hearing, a party official said: ‘Do not think that you can simply wait until you go into labour and give birth, because we will not allow your baby to live. You will go home alone’.”

Total control

By now, this phenomenon could be observed in another country too – one that would nevertheless go on to impose the most draconian population control of all.

The One Child Policy is credited with preventing some 400 million births in China, and remains in place to this day. In 1983 alone, more than 16 million women and four million men were sterilised, and 14 million women received abortions.

Assessed by numbers alone, it is said to be by far the most successful population control initiative. Yet it remains deeply controversial, not only because of the human suffering it has caused.

A few years after its inception, the policy was relaxed slightly to allow rural couples two children if their first was not a boy. Boy children are prized, especially in the countryside where they provide labour and care for parents in old age.

But modern technology allows parents to discover the sex of the foetus, and many choose to abort if they are carrying a girl. In some regions, there is now a serious imbalance between men and women.

Moreover, since Chinese fertility was already in decline at the time the policy was implemented, some argue that it bears less responsibility for China’s falling birth rate than its supporters claim.

“I don’t think they needed to bring it down further,” says Indian demographer AR Nanda. “It would have happened at its own slow pace in another 10 years.”

Backlash

In the early 1980s, objections to the population control movement began to grow, especially in the United States.

In Washington, the new Reagan administration removed financial support for any programmes that involved abortion or sterilisation.


The broad alliance to stem birth rates was beginning to dissolve, and the debate became more polarised along political lines.

While some on the political right had moral objections to population control, some on the left saw it as neo-colonialism.

Faith groups condemned it as a Western attack on religious values, but women’s groups feared changes would mean poor women would be even less well-served.

By the time of a major UN conference on population and development in Cairo in 1994, women’s groups were ready to strike a blow for women’s rights, and they won.

The conference adopted a 20-year plan of action, known as the Cairo consensus, which called on countries to recognise that ordinary women’s needs – rather than demographers’ plans – should be at the heart of population strategies.

After Cairo

Today’s record-breaking global population hides a marked long-term trend towards lower birth rates, as urbanisation, better health care, education and access to family planning all affect women’s choices.

With the exception of sub-Saharan Africa and some of the poorest parts of India, we are now having fewer children than we once did – in some cases, failing even to replace ourselves in the next generation. And although total numbers are set to rise still further, the peak is now in sight.

Chinese poster from the 1960s of a mother and baby, captioned: “Practicing birth control is beneficial for the protection of the health of mother and child”. China promoted birth control before implementing its one-child policy.

Assuming that this trend continues, total numbers will one day level off, and even fall. As a result, some believe the sense of urgency that once surrounded population control has subsided.

The term population control itself has fallen out of fashion, as it was deemed to have authoritarian connotations. Post-Cairo, the talk is of women’s rights and reproductive rights, meaning the right to a free choice over whether or not to have children.

According to Adrienne Germain, that is the main lesson we should learn from the past 50 years.

“I have a profound conviction that if you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society,” she says.

“If you don’t, then you’ll just be in an endless cycle of trying to exert control over fertility – to bring it up, to bring it down, to keep it stable. And it never comes out well. Never.”

Nevertheless, there remain to this day schemes to sterilise the less well-off, often in return for financial incentives. In effect, say critics, this amounts to coercion, since the very poor find it hard to reject cash.

“The people proposing this argue ‘Don’t worry, everything’s fine now we have voluntary programmes on the Cairo model’,” says Betsy Hartmann.

“But what they don’t understand is the profound difference in power between rich and poor. The people who provide many services in poor areas are already prejudiced against the people they serve.”

Work in progress

For Mohan Rao, it is an example of how even the Cairo consensus fails to take account of the developing world.

“Cairo had some good things,” he says. “However Cairo was driven largely by First World feminist agendas. Reproductive rights are all very well, but [there needs to be] a whole lot of other kinds of enabling rights before women can access reproductive rights. You need rights to food, employment, water, justice and fair wages. Without all these you cannot have reproductive rights.”

Perhaps, then, the humanitarian ideals of Cairo are still a work in progress.

Meanwhile, Paul Ehrlich has also amended his view of the issue.

If he were to write his book today, “I wouldn’t focus on the poverty-stricken masses”, he told the BBC.

“I would focus on there being too many rich people. It’s crystal clear that we can’t support seven billion people in the style of the wealthier Americans.”

Mike Gallagher is the producer of the radio programme Controlling People on BBC World Service


Brazil is already researching the effects of climate change (Valor Econômico)

JC e-mail 4373, 27 October 2011.

Climate change research in Brazil is beginning to change direction. Where a few years ago the focus was on efforts to reduce greenhouse gas emissions, it is now turning to adaptation to the phenomenon.

“We know that in the next five or ten years there is no prospect of a large-scale international agreement on reducing greenhouse gas emissions, with cuts of 70% to 80%,” says physicist Paulo Artaxo of USP, a student of the Amazon. “That scenario is ever more remote, so it is essential to study adaptation strategies.”

In other words, research should turn to the effects of climate change on ecosystems, urban environments and social contexts. “It is not a question of money, but of the direction of the studies,” says Artaxo, a member of the governing board of the Brazilian Panel on Climate Change (Painel Brasileiro de Mudança Climática), a scientific body linked to the ministries of Science and Technology and of the Environment. “The country needs to prepare more adequately for climate change.”

“We need more research, for example, into changes in the hydrological cycle,” says Reynaldo Victoria, coordinator of the Fapesp Research Program on Global Climate Change. “Knowing where it will rain more and where it will rain less,” he explains. This is one strand of Artaxo’s research in the Amazon. “Because you do not want to build a hydroelectric dam where it will rain much less in the coming decades,” the physicist illustrates.

Fapesp’s climate change program already has US$ 30 million invested in projects in the area. It is one of the foundation’s newest arms, but it is already gaining muscle: 21 projects under way, 14 new contracts, and two more in partnership with foreign institutions such as the British Natural Environment Research Council (Nerc) and the French Agence Nationale de la Recherche (ANR). Over ten years, investments of more than R$ 100 million are forecast.

Research is beginning to turn to little-studied fields. “We are going to analyse questions that are critical for Brazil,” says Artaxo. He cites, for example, the carbon cycle in the Amazon – something far more complex than studying plant photosynthesis and respiration.

Victoria, who is also a professor at the Center for Nuclear Energy in Agriculture (Cena-USP), says the program intends to target new fields, such as understanding the role of the South Atlantic in the climate of southern Brazil and northern Argentina. Another example is obtaining historical records in the field of paleoclimate.

Health impacts will also be studied more closely. It is already known that climate change causes diseases absent from a given place to start occurring there. Dengue, for example, finds a favourable environment in warmer regions. Among the new research on emerging diseases is a study of a type of leishmaniasis, common in Bolivia and Peru, which did not exist in Brazil and now threatens to appear in Acre. Transmitted by a mosquito, the disease causes a skin infection and can be fatal.

The researchers spoke about their projects during Fapesp Week, an event marking the 50th anniversary of the Fundação de Amparo à Pesquisa do Estado de São Paulo, which ended yesterday in Washington.

Attracting the public’s attention is the great challenge for satisfied science journalists (Fapesp)

Pesquisa FAPESP
Issue 188 – October 2011
S&T Policy > Scientific culture
Elusive readers

Mariluce Moura

Two Brazilian studies on science communication, first cited at the 2011 World Conference of Science Journalists in Doha, Qatar, at the end of June, paint, when superimposed, a curiously disjointed picture of the field in Brazil: on the one hand, science journalists report a high degree of satisfaction with their professional work; on the other, a high proportion of a representative sample of São Paulo’s population (76%) says it never reads science news in newspapers, magazines or on the internet. More surprising still: among the respondents interviewed in the state of São Paulo in this second survey, 52.5% declared “great admiration” for journalists and 49.2% for scientists, even though few read the news the former produce about the work of the latter. These and other findings raise many questions for students of Brazil’s scientific culture. One, just to start: is science journalists’ professional satisfaction independent of whether their output reaches its targets – readers, viewers, listeners or, more generally, the public?

The World Conference, moved at the last minute from Cairo to Doha because of the political unrest in Egypt that began in January, brought together 726 journalists from 81 countries who, over four days, debated everything from the central concept of science journalism, through the many ways of practising it and its difficulties, to the varied problems of organising these professionals in Asia, Africa, Europe, North America and Latin America, in the most democratic countries and the most authoritarian ones. A question that ran through all these debates was the developing notion that doing science journalism is not translating scientific information for the public – it is rather finding effective ways to narrate, in journalistic language, what within scientific output can be identified as news of interest to society. The next World Conference will be held in Finland in 2013.

Presented by one of FAPESP’s representatives at the conference, the study that brought to light the worrying measure of disinterest in science news is called “Public perception of science and technology in the state of São Paulo” (see the pdf) and forms the 12th chapter of the Indicators of science, technology and innovation in São Paulo – 2010, published by FAPESP last August. Prepared by the team at the Laboratory of Advanced Studies in Journalism of the State University of Campinas (Labjor-Unicamp) under the coordination of its director, the linguist Carlos Vogt, the research was empirically based on a 44-question survey administered in 2007 to 1,076 people in the city of São Paulo and a further 749 in the interior and on the coast of the state – 1,825 respondents in all, across 35 municipalities distributed among the 15 administrative regions (RAs).

It is worth noting that this was the second direct survey of a population sample about its perception of science carried out by Labjor, and that both were part of an Ibero-American effort to build indicators capable of reflecting scientific culture in the region. The first poll, conducted in 2002-2003, included samples from the cities of Campinas, Buenos Aires and Montevideo, as well as Salamanca and Valladolid in Spain, and its results were presented in the S,T&I Indicators in São Paulo – 2004, also published by FAPESP. In 2007, with a more refined methodology and an expanded sample, the survey reached seven countries: besides Brazil, Colombia, Argentina, Chile, Venezuela, Panama and Spain. The common core of the questionnaire consisted of 39 questions, and each region could add further questions of its own choosing.

The other Brazilian study presented in Doha is called “Science journalism in Latin America: getting to know the region’s science journalists better” and is, strictly speaking, still under way. The preliminary results presented were based on responses, received by 21 June, to a 44-question survey developed by the London School of Economics and Political Science (LSE). By that point, more than 250 journalists had answered the questionnaire, among them roughly 80 Brazilians, according to its coordinator, the journalist Luisa Massarani, director of the Ibero-American Network for Monitoring and Training in Science Journalism, the institution responsible for the study in partnership with the LSE. The survey also has the support of science journalism associations and other science communication institutions in Argentina, Bolivia, Brazil, Chile, Colombia, Costa Rica, Ecuador, Mexico, Panama and Venezuela.

At the heart of this study, as its title indicates, is a concern to find out how many journalists are systematically covering science in Latin America, who they are, and what view of science they hold. “We have no idea; we do not even know how many science journalists there are in Brazil, or whether they are representative within the profession,” says Luisa Massarani, who is also director of the Museu da Vida at the Oswaldo Cruz Foundation (Fiocruz) and Latin America coordinator of the Science and Development Network (SciDev.Net). Until recently, she recalls, “the Brazilian Association of Science Journalism (ABJC), based on its membership records, put the number at around 500, but that actually included scientists and other professionals interested in science communication.” Incidentally, next month the ABJC will begin re-registering its members, along with a call for new ones, which could contribute to this census of science journalists in Brazil.

Belief in science – With 46 graphs and 55 appended tables that can be cross-referenced according to each researcher’s specific interest, the perception-of-science study funded by FAPESP and coordinated by Vogt supports countless conclusions and new hypotheses about how society absorbs science through the media, and about how the various social and economic classes in the state of São Paulo react to exposure to science news. For the coordinator himself, one of the most striking findings is the inverse relationship it reveals between belief in science and information about science. “The axiom would be: the more information, the less belief in science,” he says. Consulting the graph of self-declared consumption of scientific information versus attitudes to the risks and benefits of science (graph 12.11), one finds that 57% of respondents who declared high consumption believe science and technology can offer many risks and many benefits simultaneously, and 6.3% believe they can bring many risks and few benefits. Among those who declared zero consumption of scientific information, 42.9% see many risks and many benefits at the same time, and 25.5% see many risks and few benefits. “In other words, among the best informed, the proportion who see risks and benefits in science at the same time is quite high,” notes Vogt, FAPESP’s president from 2002 to 2007 and now coordinator of the Virtual University of the State of São Paulo (Univesp), suggesting that this is a realistic view.

It should be noted that pessimism is much greater among those who declared zero consumption of scientific information: 8.1% of them said science brings no risk and no benefit, compared with 5.8% among those declaring low consumption, 2.3% in the medium-low band, 0.7% in the medium-high band, and zero among high consumers of scientific information.


In the part of the study on general interest in S&T, it is notable that respondents rank the topic in the middle, fifth among 10 subjects usually covered by the media, after sport and before cinema, art and culture (graph 12.1). But while 30.5% declare themselves very interested in sport and 34.9% interested, for science and technology 16.3% are very interested and 47.1% interested – that is, the intensity of interest is lower. It is also worth observing how the different degrees of interest in S&T bring the city of São Paulo close to Madrid and set it immensely apart from Bogotá (graph 12.2). Respectively, 15.4% of respondents in São Paulo and 16.7% in Madrid declared themselves very interested in S&T; for the interested category the figures were 49.6% and 52.7%; for not very interested, 25.5% and 24.8%; and for not at all interested, 9.4% and 5.9%. In Bogotá, by contrast, no fewer than 47.5% declared themselves very interested – why, nobody knows. The interested total 33.2%, the not very interested 15.3%, and the not at all interested 4%.

There is not much difference in the level of interest by age: young and older people are distributed evenly across the various degrees considered (graph 12.6a). With educational attainment, exactly the opposite happens: among those very interested in science and technology, 21.9% hold undergraduate or graduate degrees, 53.9% have a secondary education, 21.5% a primary education, 1.7% early-childhood education, and 1% had no schooling at all. In the not-at-all-interested category, by contrast, one finds 1.2% with undergraduate or graduate degrees, 26.3% with secondary education, 47.4% with primary education, 8.8% with early-childhood education, and 16.4% who had no schooling of any kind (graph 12.5).

Alongside all the inferences that the tabulated and interpreted questionnaire results allow, Vogt stresses that even if the majority of the population does not read science news, it is nevertheless exposed, more or less passively, to the information about science in circulation. "Every time Jornal Nacional or Globo Repórter talks about, say, a functional food, practically society as a whole starts discussing it in the following days," he says. He believes that media studies, and studies of how often science appears in press coverage, could provide benchmarks for research to complement what has been built so far on public perception of science.

Satisfied professionals – Luisa Massarani observes that while audience studies have advanced in many fields, especially for telenovelas in Brazil, in science journalism there are still no studies capable of indicating what happens, in terms of perception, when a person hears and sees a science story on Jornal Nacional. "Do people understand it well? Does the information arouse distrust? We don't know." In any case, what it means to do science journalism, in terms of both production and reception, remains in her view a major open question.

For now, the study she coordinates has found that women are the majority among science journalists in Latin America, 61% versus 39% men, and that this is a specialty of the young: almost 30% of the sample is between 31 and 40 years old, and 23% between 21 and 30. Consistent with that, 39% of respondents have worked in science journalism for less than 5 years and 23% for between 6 and 10 years. And, strikingly, 62% are satisfied with their work in science journalism and another 9% very satisfied. This may be related to the fact that 60% hold formal, full-time jobs in the field.

On the other hand, although Latin America's science journalists have few official sources giving them feedback on their work, 40% are sure their role is to inform the public, 26% think their function is to translate complex material, 13% to educate, and 9% to mobilize the public. Assessing the results of that work, 50% believe the science journalism produced in Brazil is average, 21% good, and only 2% rate it very good.

The best indication of how much science journalists like what they do lies in their answer to the question of whether they would recommend the career to others. No fewer than half answered yes, definitely, while 40% answered probably yes. In any case, there is still ground to cover in defining the role that falls to journalists among the actors who say what science is and does. "Who are these actors?" Vogt asks. "Scientists thought it was them. Governments believed it was them. Today we say it is society. But in what way?"

Global Warming May Worsen Effects of El Niño, La Niña Events (Climate Central)

Published: October 12th, 2011

By Michael D. Lemonick

Does this mean Texas is toast?

As just about everyone knows, El Niño is a periodic unusual warming of the surface water in the eastern and central tropical Pacific Ocean. Actually, that’s pretty much a lie. Most people don’t know the definition of El Niño or its mirror image, La Niña, and truthfully, most people don’t much care.

What you do care about if you’re a Texan suffering through the worst one-year drought on record, or a New Yorker who had to dig out from massive snowstorms last winter (tied in part to La Niña), or a Californian who has ever had to deal with the torrential rains that trigger catastrophic mudslides (linked to El Niño), is that these natural climate cycles can elevate the odds of natural disasters where you live.

At the moment, we’re now entering the second year of the La Niña part of the cycle. La Niña is one key reason why the Southwest was so dry last winter and through the spring and summer, and since La Niña is projected to continue through the coming winter, Texas and nearby states aren’t likely to get much relief.

Precipitation outlook for winter 2011-12, showing the likelihood of below average precipitation in Texas and other drought-stricken states.

But Niñas and Niños (the broader cycle, for you weather/climate geeks, is known as the “El Niño-Southern Oscillation,” or “ENSO”) don’t just operate in isolation. They’re part of the broader climate system, which means that climate change could theoretically change how they operate — make them develop more frequently, for example, or less frequently, or be more or less pronounced. Climate change could also intensify the effects of El Niño and La Niña events.

Climate scientists have been wrestling with the first question for a while now, and they still don’t really have a definitive answer. Some climate models have suggested that global warming has already begun to cause subtle changes in ENSO cycles, and that the changes will become more pronounced later this century. But a new study, published in the Journal of Climate, doesn’t find much evidence for that.

But on the second question, the new study is a lot more definitive. “Due to a warmer and moister atmosphere,” said co-author Baylor Fox-Kemper, of the University of Colorado in a press release, “the impacts of El Niño are changing even though El Niño itself doesn’t change.”

That’s because global warming has begun to change the playing field on which El Niño and La Niña operate, just as it’s changing the background conditions that give rise to our everyday weather. The Texas drought is a prime example. Its most likely cause is reduced rainfall from La Niña-related weather patterns. But however dry Texas and Oklahoma might have been otherwise, the killer heat wave that plagued the region this past summer — the sort of heat wave global warming is already making more commonplace — baked much of the remaining moisture out of both the soil and vegetation. No wonder large parts of the Lone Star State have gone up in smoke.

A map of sea surface temperature anomalies, showing a swath of cooler-than-average waters in the central and eastern tropical Pacific Ocean – a telltale sign of La Niña conditions.

When the next El Niño occurs in a year or two, it will probably bring heavy rains to places like Southern California, whose unstable hillsides tend to slide when soggy. Except now, thanks to global warming, the typical El Niño-related storms that roll in off the Pacific may well be turbocharged, since a warmer atmosphere can hold more water. This is the reason, say many climate scientists, that downpours have become heavier in recent decades across broad geographical areas.

La Niña, plus the added moisture in the air from global warming, have also been partially implicated in the massive snowstorms that struck the Northeast and Mid-Atlantic states during the last two winters. Those could get worse as well, suggests the new analysis. “What we see,” says Fox-Kemper, “is that certain atmospheric patterns, such as the blocking high pressure south of Alaska typical of La Niña winters, strengthen…so, the cooling of North America expected in a La Niña winter would be stronger in future climates.” So to pre-answer the question that will inevitably be asked next winter: no, more snow does NOT contradict the idea that the planet is warming. Quite the contrary.

Finally, for those who really do want to know what El Niño and La Niña actually are, as opposed to what they do, you can go to NOAA’s El Niño page. But be warned: there will be a quiz, and the word “thermocline” will appear.


The Post-Normal Seduction of Climate Science (Forbes)

William Pentland, 10/14/2011

In early 2002, during a televised press conference, former U.S. Defense Secretary Donald Rumsfeld explained why the lack of evidence linking Saddam Hussein with terrorist groups did not mean there was no connection.

“[T]here are known ‘knowns’ – there are things we know we know,” said Rumsfeld. “We also know there are known ‘unknowns’ – that is to say we know there are some things we do not know. But there are also unknown ‘unknowns’ – the ones we don’t know we don’t know . . . it is the latter category that tend to be the difficult ones.”

Rumsfeld turned out to be wrong about Hussein, but what if he had been talking about global warming?  Well, he probably would have been on to something there.  Unknowns of any ilk are a real pickle in climate science.

Indeed, uncertainty in climate science has induced a state of severe political paralysis. The trouble is that nobody really knows why. A rash of recent surveys and studies has exonerated most of the usual suspects – scientific illiteracy, industry distortions, skewed media coverage.

Now, the climate-science community is scrambling to crack the code on the “uncertainty” conundrum. Exhibit A: the October 2011 issue of the journal Climatic Change, the closest thing in climate science to gospel truth, which is devoted entirely to the subject of uncertainty.

While I have yet to digest all of the dozen or so essays, I suspect they are only the opening salvo in what will soon become a robust debate about the significance of uncertainty in climate-change science. The first item up on the chopping block is called post-normal science (PNS).

PNS is a model of the scientific process pioneered by Jerome Ravetz and Silvio Funtowicz, which describes the peculiar challenges science encounters where “facts are uncertain, values in dispute, stakes high and decisions urgent.” Unlike “normal” science in the sense described by the philosopher of science Thomas Kuhn, post-normal science commonly crosses disciplinary lines and involves new methods, instruments and experimental systems.

Judith Curry, a professor at Georgia Tech, weighs the wisdom of taking the plunge on PNS in an excellent piece called "Reasoning about climate uncertainty." Drawing on the work of the Dutch wunderkind Jeroen van der Sluijs, Curry calls on the Intergovernmental Panel on Climate Change to stop marginalizing uncertainty and get real about bias in the consensus-building process. Curry writes:

The consensus approach being used by the IPCC has failed to produce a thorough portrayal of the complexities of the problem and the associated uncertainties in our understanding . . . Better characterization of uncertainty and ignorance and a more realistic portrayal of confidence levels could go a long way towards reducing the “noise” and animosity portrayed in the media that fuels the public distrust of climate science and acts to stymie the policy process.

PNS is especially seductive in the context of uncertainty. Not surprisingly, Curry suggests that instituting PNS-like strategies at the IPCC “could go a long way towards reducing the ‘noise’ and animosity” surrounding climate-change science.

While I personally find PNS persuasive, the model provokes something closer to revulsion in many people. Last year, members of the U.S. House of Representatives who filed a petition challenging the U.S. Environmental Protection Agency's greenhouse gas endangerment finding seemed less sanguine about post-normal science:

. . . the conclusions of organizing bodies, especially the IPCC, cannot be said to reflect scientific “consensus” in any meaningful sense of that word. Instead, they reflect a political movement that has commandeered science to the service of its agenda. This is “post-normal science”: the long-dreaded arrival of deconstructionism to the natural sciences, according to which scientific quality is determined not by its fidelity to truth, but by its fidelity to the political agenda.

It seems unlikely that taking the PNS plunge would appreciably improve the U.S. public’s perception of the credibility, legitimacy and salience of climate-change assessments. This probably says more about Americans than it does about the analytic force of the PNS model.

Let’s face it. Americans do not agree on a whole hell of a lot. And they never have. Many U.S. institutions were deliberately designed to tolerate the coexistence of free states and slave-owning states. Ironically, Americans appear to agree more on climate-change science than other high-profile scientific controversies like the safety of genetically-modified organisms.


While it pains me to admit this, I am increasingly convinced that the IPCC's role in assessing the science of climate change needs to be scaled back. The IPCC was an overly optimistic experiment in international governance designed for a world that never materialized. The U.N. General Assembly established the IPCC in the months immediately preceding the fall of the Berlin Wall. Only a few years later, the IPCC's first assessment report and the creation of the U.N. Framework Convention on Climate Change coincided with the collapse of the Soviet Union and the end of the Cold War.

A new world order seemed to be dawning in those days, which is probably why it seemed like a good idea to ask scientists to tell us what constitutes “dangerous climate change.”   Two decades and two world trade towers later, the world is a decidedly less hospitable place for institutions like the IPCC.

The proof is in the pudding – or, in this case, the atmosphere.

The time of meteorology (Tome Ciência)

Meteorology is much more than a glance at the forecast when planning a weekend trip. At a moment when global warming is a threat and major climate catastrophes are becoming ever more frequent, the importance and responsibility of meteorologists stand out. Growing knowledge and technological innovation in the field now make it possible to predict weather phenomena with some lead time and precision. And quickly removing people from areas at risk can save many lives. The topic of this debate was suggested by the Sociedade Brasileira de Meteorologia, an institution linked to the Sociedade Brasileira para o Progresso da Ciência (SBPC).

Participants:

Carlos Afonso Nobre, secretary of Research and Development Policies and Programs at the Ministry of Science and Technology (MCT), directed the Centro de Previsão de Tempo e Estudos Climáticos of the Instituto Nacional de Pesquisas Espaciais (INPE) for more than 10 years and is taking part in the creation, in 2011, of the Centro Nacional de Monitoramento e Alerta de Desastres Naturais.

Maria Gertrudes Justi da Silva, coordinator of the meteorology program at the Universidade Federal do Rio de Janeiro (UFRJ). A former president of the Sociedade Brasileira de Meteorologia, she sits on the federal government's Coordinating Council for Meteorology, Climatology and Hydrology Activities.

José Marques is president of the Deliberative Council of the Sociedade Brasileira de Meteorologia. He was in the first class of meteorologists trained at a Brazilian university, graduating in 1967 from UFRJ. Until then such programs existed only abroad, where he later completed a postdoctorate in France.

Ednaldo Oliveira dos Santos, adjunct professor in the Department of Environmental Sciences of the Instituto de Florestas at the Universidade Federal Rural do Rio de Janeiro (UFRRJ), is president of the União Nacional dos Estudiosos em Meteorologia and South America's representative on the international committee that studies distance education in meteorology. He is also an associate researcher at the Instituto Virtual Internacional de Mudanças Globais at COPPE/UFRJ.

Unshakeable stereotypes of science (New Scientist)

13 September 2011 by Roger Highfield
Magazine issue 2829.

Science has transformed our world, so why does the public have such an old-fashioned view of scientists, asks Quentin Cooper

What is the problem with the public’s image of scientists?
If you ask anyone, they will tell you that science has transformed their world with amazing discoveries. But then if you invite them to draw a scientist, what they depict is precisely what people would have described 50 years ago, back when the anthropologist Margaret Mead came up with what we now call the “draw a scientist” test.

How do people generally depict scientists?
It is uncanny: they draw someone with a hangdog look, frizzy hair and test tube in hand, all in a scene where things are going wrong. There are national variations. In Italy, scientists tend to be scarred and have bolts in their necks, like Frankenstein’s monster. In general, though, they are mostly white, male, bald and wearing a white coat. No wonder we have a problem recruiting scientists.

What do you think of attempts to make scientists cool, like the Studmuffins of Science calendar and GQ’s Rock Stars of Science?
They are doomed because for geek calendars and suchlike to work, they have to bounce off the stereotype. As a result, they reinforce it.

On TV there are plenty of science presenters who defy the stereotype, such as the physicist Brian Cox. Surely that helps?
It is true. They are not all white, male and old. Some have hair. Some, like Brian, arguably have too much! But while people know them and are familiar with their TV programmes, it is surprising what happens when you ask the public about their favourite science presenters. In the UK they usually nominate veterans, such as David Attenborough. In fact, in the last poll I saw, half the people could not name a TV science presenter. They don’t seem to recognise them as scientists because they don’t conform to the stereotype.

And this stereotype also applies to the best known scientist of all time, Einstein?
The image of the old Einstein with tongue out is the one everyone knows – the one taken on his 72nd birthday. But he was a dapper 26-year-old when he had his “annus mirabilis” and wrote the four papers that changed physics.

What do you think about the depiction of scientists in films?
What I find striking is you almost never see scientists on screen unless they are doing science. There are very few characters who happen to be scientists. And those scientists shown tend to be at best eccentric, at worst mad and/or evil.

How can we improve the image of scientists?
Even though the “draw a scientist” test started half a century ago, it was only in the 1980s that someone had the idea of introducing children to a real scientist after they had drawn one, and then asking them to have another go at drawing. One of my favourite examples is of the schoolgirl who initially drew a man with frizzy hair and a white coat, but afterwards depicted a smiling young woman holding a test tube. Above it is the word “me”. I still find myself choking up when I show it.

Profile
Quentin Cooper is a science journalist and presenter of the BBC radio programme Material World. He is hosting the Cabaret of the Elements at the British Science Festival in Bradford on 10 September.

Confronting the ‘Paradox of Progress’ (Yale Forum on Climate Change & The Media)

A First-Hand Perspective
Google’s Science Communication Fellows

Paul Higgins August 2, 2011

The ‘paradox of progress’ illustrated by climate change prompts a first-hand participant in a recent Google fellowship program to ponder how best to combine scientific and technological advances with improved public understanding for the benefit of society overall.

Scientific and technological advances are creating a challenging paradox for society, a paradox of progress.

Advances in the sciences and technical fields provide our society with tremendous capacity to overcome the numerous challenges we face. But those challenges in many cases are driven by the rapidly expanding scale of human activities, which are made possible in the first place by advances in science and technology.

Circumventing this paradox of progress — reaping the benefits that science and technology bring us, while avoiding the unintended negative consequences — will depend on using those advances more effectively throughout all of society.

Climate change illustrates the paradox of progress extremely well. The social and technological advances that powered the Industrial Revolution vastly improve our quality of life and well-being, but also drive our global disruption of the climate system. All the while, scientific and technological advances help us understand the causes, consequences, and potential risk management solutions to climate change.

It’s a serious concern that these massive advances in scientific knowledge have had little impact on public understanding of climate science, its implications, or society’s risk management efforts.

Given the importance of circumventing the paradox of progress, for climate change and more broadly, I was pleased to learn that Google was initiating a Science Communication Fellowship Program and thrilled to be named a member of the inaugural class of 21 fellows. Still, the question, to me, is this: Can the combination of the technological capabilities of one of the world’s leading IT companies and the expertise of the scientific community transform scientific communication for climate change and, more broadly, for all socially-relevant scientific disciplines?

The fellowship program centered on a workshop held in June at Google’s headquarters in Mountain View, California. There were, in my view, three specific goals of the workshop and the fellowship:

– to promote collaboration among the fellows;

– to develop transformative project ideas that could harness new media and information technology (IT) for more effective communication of the science of climate change; and

– to help develop communication approaches that would be broadly and generally transferable to other scientific disciplines.

The workshop included perspectives from outside experts, presentations from Googlers (the internal moniker for Google staff) on new media and advances in IT, and brainstorming activities designed to generate new ideas.

The Googlers’ presentations were impressive, perhaps even a little daunting. They brought home, in a way I hadn’t realized before (despite my heavy reliance on new technology), how rapidly the IT world is advancing and how much potential IT has to transform society. At times, seeing what these Googlers could do with information technology left me questioning what was left for me to contribute. Fortunately, the brainstorming provided an answer.

I had gone to Google with two ideas for climate science communication and I had two more ideas while there. That seemed fairly standard among the fellows so by the end we were awash in new, interesting, and potentially transformative ideas for communicating climate science.

Of course, with so many ideas and a need to winnow them to a tractable number of actual project proposals, everyone was bound to see some of their favorite ideas end up forlorn and abandoned on the bottom of a white board. I had two ideas that I was sorry to see stall during the vetting but that I intend to pursue separately nonetheless. (At this stage it is appropriate and consistent with the workshop protocol, in my view, to discuss only those ideas that were both my own and that are not moving forward formally within the fellowship program).

The first is a multi-media show featuring leading climate experts. Each show would follow a one-on-one interview format and showcase the expert's knowledge and understanding. The discussion would explore what the expert does, why their work is important, what the current state of knowledge is (what is known and understood, and with what level of confidence), what key questions remain unresolved or contentious, and the broader implications of their work for society.

The show would meet three critical needs: 1) it would help educate the public about climate change; 2) it would provide a new venue for rapid responses to important events (e.g., ground-breaking research findings and public controversies and misunderstandings), and 3) it would help develop the communication skills of climate experts.

The second idea involves development of an interactive game that would give users a chance to assess and manage climate change risks for themselves. Subjective preferences have major implications, good and bad, for policy choices, and this tool could help reveal and encourage reflection over those opinions. Are you risk averse? If so, how do you balance your risk aversion between policy choices that are too aggressive (e.g., that risk excessive increases in energy and transportation prices) or too weak (i.e., that risk disruption of key life-support services)? How do your answers change as you learn more about the nature of the risk management problem (i.e., with additional information from the physical, natural, and social sciences)?

Breakthroughs in Science and Public Understanding

Over the next few weeks, the 21 fellows will refine project ideas and submit proposals to Google for possible seed funding. Whether these ideas can ultimately transform science communication will take time to determine. Regardless, the process of generating new ideas during the workshop was profoundly successful. That’s a good first step because resolving the paradox of progress will depend on achieving breakthroughs not only in science but also in how society uses the knowledge and understanding that results.

With more effective use of scientific knowledge and understanding, we can make choices with the greatest chance to benefit society overall. So far the massive advances in scientific understanding of climate change appear to have had little impact on public understanding of climate science, its implications, or society's risk management efforts. But perhaps Google's Science Communication Fellowship Program over time can do for civic engagement with science what Google has done for information technology.

Championing Ideas … Your Own and Those of Others

The combination of talks and brainstorming made for an invigorating three days but also a grueling workshop experience. By the end of each day, many participants were clearly spent and more than a little confused about best paths forward. That is what happens when people’s horizons are expanding and they are confronting new challenges.

Fortunately, by the following morning, I had integrated what I’d learned the day before and found what I thought would be a good path forward.

For me, the most critical breakthrough was to recognize and accommodate two complementary approaches: 1) to champion the idea(s) that I thought most promising, regardless of whether others at the workshop liked my vision or not; and 2) to help, however possible, champions of other ideas successfully implement their visions.

This two-pronged approach for me captures the nature of scientific pursuit at its best. Science relies on personal autonomy, individual incentives, and unique contributions, but also depends on collaboration and cooperation to help make everyone’s work more effective. The first component reflects the importance of individual insight and ability, the second the importance of staying focused on broader, shared goals: the pursuit of knowledge and understanding in the case of scientific research, increased public understanding in the case of science communication.

Author
Paul Higgins is the Associate Director of the American Meteorological Society’s Policy Program in Washington, D.C.

Slow and steady (FSP)

JC e-mail 4317, August 8, 2011.

The "Slow Science" movement defends scientists' right to opt out of the race for high publication counts and to prioritize research quality.

A movement that began in Germany is gradually gaining ground in academic corridors. The cause is a noble one: more time for scientists to do research. Leading the idea is the organization "Slow Science" (http://slow-science.org), created by prominent German scientists.

Joining the movement means refusing to surrender to the unbridled production of articles in specialized journals, which counts for many points in systems that evaluate scientific output. Today, those who publish in journals widely read and cited by other scientists secure more research funding.

As a result, scientists end up centering their work on outputs (publications). "We are a guerrilla band of neuroscientists fighting to have the media-driven model of scientific production revised," neuroscientist Jonas Obleser of the Max Planck Institute, one of the creators of "Slow Science," told Folha. Late last year the group went so far as to issue a manifesto proclaiming: "We are scientists; we don't blog, we don't tweet, we take our time."

"Slow science has always existed over the centuries. Now it needs protection." The document hangs on the refrigerator door in the laboratory of the Brazilian physician Rachid Karam, a postdoctoral researcher at the University of California, San Diego.

"The manifesto makes sense. We have to check the data before jumping to hasty conclusions," he says. "'Slow Science' would give us time to analyze a hypothesis in depth and reach sound conclusions."

According to Obleser, the number of scientists sympathetic to the movement is growing, "especially in Latin America." "But there is no need to join formally. Just print out the manifesto and stand guard in your department," he says.

Slow Science is an offshoot of the well-known "Slow Food" movement, which advocates slower, healthier eating, in both the preparation and the consumption of food. In science, the idea is to champion research that is not guided solely by quick results.

Skepticism – "It is unlikely that the pace of research will slow by way of a worldwide agreement in which every scientist commits to decelerating their work," says Rogério Meneghini, a specialist in scientometrics (the measurement of scientific productivity). He is the scientific coordinator of the SciELO Project, which gathers open-access publications from Latin America.

For Meneghini, "Slow Science" is an "anemic" movement in a context where the rapid flow of ideas and information accelerates discovery. "It looks like the demand of an old movement in new clothing. It is certainly the feeling of someone who no longer has the legs for the race," he concludes.
(Folha de São Paulo)

TV 10 weather forecasts worse than a crap shoot (City Pulse)

Media Muckraker
November 12, 2002

The surprise 3.1-inch snowfall last Monday, Dec. 2, resulted in more than 100 Lansing-area accidents. Little did I know, as I chugged along U.S. 127 that morning, that over 20 of those accidents were taking place at the I-96 interchange just up around the bend. Fortunately, before I reached the ice slick, my instincts got the better of me and I averted a possible accident by turning off U.S. 127 early.

No thanks to the weathermen of WILX-10 (who pull double duty as the forecasters for the Lansing State Journal). They had forecast snow but never said how much, hinting at just an inch or so.

Then it happened again. On Tuesday, Rockcole, Provenzano and Drummond predicted a low temperature “near 10.” In fact the mercury fell to 18 degrees below zero, the day’s lowest temperature since 1869!

How could the weathermen be so wrong? I decided to do a little weather muckraking.

In Britain, earlier this year, Ben Magoo wondered about the accuracy of the BBC’s weather reporting after the sunny vacation day they predicted for him turned out soggy. “Is the super computer in the [BBC] office accurately modeling the world’s climate, or is it resting its brain and picking out sun and rain symbols at random? We will find the answer!” Magoo developed a computer program to automatically analyze their weather data at 10 sites, including York, the Tower of London and Cambridge. Here’s what he found at Cambridge:

Cambridge, England | Days Monitored: 126

Days Ahead:   1     2     3     4
Accuracy:    55%   50%   43%   35%

Incredibly, the chance of the next day’s forecast being right was just 55 percent. Note that Magoo ignored the same-day predictions, making “the assumption that predicting today’s weather is dead simple, so the BBC couldn’t possibly get this wrong.” Really now?

Turning to Lansing, I analyzed 14 days of WILX-LSJ forecasts between Nov. 24 and Dec. 7. I counted a forecast as an error if at least one of the following occurred: 1) the predicted temperature was off by 5 degrees or more (for either the high or the low); 2) precipitation did not occur as predicted (e.g., they predicted snow but there was none, or the converse); or 3) the precipitation amount was off by 100 percent or more (e.g., they predicted 1 inch of snow but it snowed 3 inches, a 200 percent difference).
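
For readers who want to replicate this kind of audit themselves, the three criteria above are easy to score by machine. Here is a minimal sketch in Python, using invented forecast/observation records for illustration (the real WILX-LSJ data lives in newspaper archives):

```python
# Scoring rules from the column: a forecast is an error if it fails
# any of the three tests below. The sample records are made up.

def forecast_in_error(forecast, actual):
    """Return True if the forecast fails any of the three tests."""
    # 1) High or low temperature off by 5 degrees or more.
    if abs(forecast["high"] - actual["high"]) >= 5:
        return True
    if abs(forecast["low"] - actual["low"]) >= 5:
        return True
    # 2) Precipitation predicted but none fell, or the converse.
    if (forecast["precip"] > 0) != (actual["precip"] > 0):
        return True
    # 3) Precipitation amount off by 100% or more of the prediction.
    if forecast["precip"] > 0:
        if abs(actual["precip"] - forecast["precip"]) >= forecast["precip"]:
            return True
    return False

def accuracy(pairs):
    """Fraction of (forecast, actual) pairs that pass all three tests."""
    correct = sum(1 for f, a in pairs if not forecast_in_error(f, a))
    return correct / len(pairs)

# Invented example: a 1-inch snow call before a 3.1-inch storm fails test 3.
pairs = [
    ({"high": 30, "low": 20, "precip": 1.0},
     {"high": 31, "low": 22, "precip": 3.1}),   # error: amount off by 210%
    ({"high": 30, "low": 20, "precip": 0.0},
     {"high": 28, "low": 18, "precip": 0.0}),   # correct
]
print(accuracy(pairs))  # 0.5
```

Run one such tally per forecast horizon (same day, one day ahead, and so on) and you get exactly the accuracy tables printed here.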

Lansing, MI | Days Monitored: 14

Days Ahead:  Same Day    1     2     3     4
Accuracy:      50%      38%   50%   55%   20%

Remarkably, my analysis showed that the WILX-LSJ forecasters were unable to predict the day’s weather – for the same day – a full seven of 14 days (50 percent)! The British chap had evidently presumed way too much. Distant predictions tended to be about 50/50, with the fifth day a poor 20 percent.

You’d figure that predicting the weather a few hours hence would be a breeze. But they missed 3.1 inches of snow on Dec. 2 and were off by 28 degrees on Dec. 3. On Nov 29, the LSJ predicted that day’s weather would have a high in “the upper 30s,” which was significantly lower than the actual high of 46. And on Dec. 4, the LSJ predicted a low temp in the “low teens,” which was a far cry (for the freezing news carriers delivering the newspaper to your doorstep) from the actual low of 4 degrees below zero.

All told, of 60 days forecast, the accuracy rate was just 43 percent. Don’t believe it? Check it out for yourself; the evidence is in the library (the other TV weathermen do not have evidence so accessible). Lansing’s numbers are remarkably close to the Cambridge study, suggesting that this level of miscalculation might be consistent over the entire year.

One moral is not to rely on the forecasts when planning time off work.

At the very least, weathermen should humbly state the truth: there is a 50/50 chance that our forecasts will be wrong in at least one important area. Incompetence? Arrogance? It goes much deeper than that.

In Oscar Wilde’s “The Importance of Being Earnest,” Jack comments on the weather thus, “Charming day it has been, Miss Fairfax.” To which Gwendolen Fairfax replies, “Pray don’t talk to me about the weather, Mr. Worthing. Whenever people talk to me about the weather, I always feel quite certain that they mean something else. And that makes me so nervous.”

It’s true. Weather forecasts are less about the weather than about cementing social relations – telling you who has authority. While weather seems so bloody innocuous, in fact, culturally speaking, the weather forecast is a covert agent of social control.

It doesn’t matter to the mainstream media bosses that weathermen are wrong most of the time (if they even know it). What’s important is that weathermen exude an aura of certainty (precision numbers) while expressing an undercurrent of fear (of the possible storm). Just like the IRS, the traffic cop or your boss, no matter how wrong, he’s the person in charge – with certainty. There’s no way out. That’s one hidden message.

The good news is that they’re wrong!

Here’s what needs to be done. Lose the “Stormtracker” and hire a muckraker. Don’t sidestep serious issues like the amount of PCBs in the morning’s snowfall, or the amount of soot in a Lansing fog. And tell the viewers/readers where the historic danger spots are (like U.S. 127 & I-96) before the next snowstorm.

Here’s my forecast. Under the current corporate structure, they’ll never do it.

Alex Peter Zenger is the pen name for the Media Muckraker. It is inspired by the work of John Peter Zenger, one of the founding fighters for press freedom in the United States.

Climate Chaos (Against the Grain)

Tues 6.28.11 | Climate Chaos

Christian Parenti speaking at a KPFA benefit on July 14th, on Tropic of Chaos: Climate Change and the New Geography of Violence, Nation Books, 2011

Listen to this Program here.

Download program audio (mp3, 49.82 Mbytes)

Residents of the Global North may be justly wringing their hands about flooding, droughts, and freak weather, but the most worrying effects of climate change are expected to hit the countries of the Global South, especially those in the broad regions on either side of the equator. Christian Parenti has reported from that vast area and discusses the shape that climate-related social dislocation is already taking, as well as the militarized plans of the rich countries to keep poor climate refugees out.

© Against the Grain, a program of KPFA Radio, 94.1fm Berkeley CA and online at KPFA.org.

Why Global Warming Slowed in the 2000’s: Another Possible Explanation (Climate Central)

Published: July 21st, 2011
By Michael D. Lemonick

The world is getting progressively warmer, and the vast majority of evidence points to greenhouse gases spewed into the atmosphere by humans — carbon dioxide (CO2), especially — as the main culprit. But while the buildup of greenhouse gases has been steadily increasing, the warming goes in fits and starts. From one year to the next it might get a little warmer or a lot warmer, or even cooler.

That’s because greenhouse gases aren’t the whole story. Natural variations in sunlight and ocean currents; concentrations of particles in the air, manmade and otherwise; and even plain old weather variations can speed the warming up or slow it down, even as the underlying temperature trend continues upward. And while none of those factors is likely to change that trend over the long haul, scientists really want to understand how they affect projections of where our climate is heading.

The latest attempt to do so just appeared in Science Express, the online counterpart of the journal Science, where a team of climate scientists is reporting on their investigations of airborne particles, or aerosols, in the stratosphere. It’s well known, says co-author John Daniel, of the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo., that these particles have a cooling effect, since they reflect sunlight that would otherwise warm the planet.

Mt. Pinatubo’s eruption in the Philippines, in 1991. Credit: USGS.

It’s also well known that major volcanic eruptions, like Mt. Pinatubo’s in the Philippines in 1991, can pump lots of aerosols into the stratosphere — and indeed, Pinatubo alone temporarily cooled the planet for about two years. The explosion of Mt. Tambora in 1815 had even more catastrophic effects, which you can imagine given that 1816 came to be known as “the year without a summer.” But what lots of people thought, says Daniel, “is that since there haven’t been any eruptions on that scale recently, aerosols have become relatively unimportant for climate.”

That, says the study, is not true: even without major eruptions, aerosols in the stratosphere increased by about 7 percent per year from 2000 to 2010. Plug that figure into climate models, and they predict a reduction in the warming you’d otherwise expect from the rise in greenhouse gases by up to 20 percent.
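
The cumulative effect of that steady rise is easy to underestimate. As a quick back-of-the-envelope check (my own arithmetic, not a figure from the study), a 7 percent annual increase sustained over the decade compounds to nearly a doubling:

```python
# Compound growth: a 7% annual rise in stratospheric aerosols
# sustained from 2000 to 2010 (ten years of compounding).
growth_factor = 1.07 ** 10
print(round(growth_factor, 2))  # 1.97 — close to a doubling
```

In other words, even without a Pinatubo-scale eruption, the stratosphere ended the decade with roughly twice the aerosol load it started with.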

In the real world, as it happens, the rise in temperature slowed during that same decade. “That,” says Daniel, “was the motivation for doing this research. It could have just been natural climate variability, but we wondered if it could be something else.” Some climate scientists attribute the slowdown to heat being temporarily stored in the deep oceans, but stratospheric aerosols could clearly be part of the answer as well.

Whether these aerosols are natural or manmade, however, is something the scientists didn’t address. Just last week, a paper in Proceedings of the National Academy of Sciences (PNAS) suggested the cause was a construction boom of coal-fired power plants in China over the same decade. The new study doesn’t necessarily contradict that. “Human emissions could play a role,” says Daniel, although the PNAS study was talking about aerosols in the lower atmosphere, not the stratosphere. “But even in the absence of colossal volcanic eruptions,” he says, “smaller eruptions could still add up.”

The other difference between the two studies is that the one from last week looked at the relatively slow temperature rise over the most recent decade and tried to tease out what might have changed since the previous decades, when the warming was faster. The new one took actual observations of aerosols and tried to predict what the temperature rise should be. That sort of approach tends to produce more credible results, since an incorrect prediction would stick out like a sore thumb.

Where the two studies emphatically agree is that if the level of aerosols goes down — due to a lull in eruptions, or a reduction in coal-plant pollution, or both — the pace of warming would likely pick up. That would mean that current projections for up to a 4.5°C increase in global average surface temperatures by the end of the century might turn out to be an underestimate. And if aerosol levels increase, the temperature in 2100 could be lower than everyone expects.

Science and truth have been cast aside by our desire for controversy (Guardian)

Last week’s report into media science coverage highlighted an over-reliance on pointless dispute

Robin McKie
The Observer, Sunday 24 July 2011

Thomas Huxley, the British biologist who so vociferously, and effectively, defended Darwin’s theory of natural selection in the 19th century, had a basic view of science. “It is simply common sense at its best – rigidly accurate in observation and merciless to fallacy in logic.”

It is as neat a description as you can get and well worth remembering when considering how science is treated by the UK media and by the BBC in particular. Last week, a study, written by geneticist Steve Jones, warned that far too often the corporation had failed to appreciate the nature of science and to make a distinction “between well-established fact and opinion”. In doing so, the corporation had given free publicity to marginal belief, he said.

Jones was referring to climate change deniers, anti-MMR activists, GM crop opponents and other fringe groups who have benefited from wide coverage despite the paucity of evidence that supports their beliefs. By contrast, scientists, as purveyors of common sense, have found themselves sidelined because producers wanted to create controversy and so skewed discussions to hide researchers’ near unanimity of views in these fields. In this way, the British public has been misled into thinking there is a basic division among scientists over global warming or MMR.

It is a problem that can be blamed on the media that believe, with some justification, that adversarial dispute is the best way to cover democracy in action. It serves us well with politics and legal affairs, but falls down badly when it comes to science because its basic processes, which rely heavily on internal criticism and disproof, are so widely misunderstood.

Yet there is nothing complicated about the business, says Robert May, the former UK government science adviser. “In the early stages of research, ideas are like hillocks on a landscape. So you design experiments to discriminate among them. Most hillocks shrink and disappear until, in the end, you are left with a single towering pinnacle of virtual certitude.”

The case of manmade climate change is a good example, adds May. “A hundred years ago, scientists realised carbon dioxide emissions could affect climate. Twenty years ago, we thought they were now having an impact. Today, after taking more and more measurements, we can see there is no other explanation for the behaviour of the climate. Humans are changing it. Of course, deniers disagree, but that’s because they hold fixed positions that have nothing to do with science.”

It is the scientist, not the denier, who is the real sceptic, adds Paul Nurse, president of the Royal Society. “When you carry out research, you cannot afford to cherry-pick data or ignore inconvenient facts. You have to be brutal. You also have to be sceptical about your own ideas and attack them. If you don’t, others will.”

When an idea reaches the stage where it’s almost ready to become a paper, it has therefore been subjected to savage scrutiny by its own authors and by their colleagues – and that is before writing has started. Afterwards, the paper goes to peer review where there is a further round of critical appraisal by a separate group of researchers. What emerges is a piece of work that has already been robustly tested – a point that is again lost in the media.

Over the centuries, this process has been honed to near perfection. By proposing and then attacking ideas and by making observations to test them, humanity has built up a remarkable understanding of the universe. The accuracy of Einstein’s theories of relativity, Crick and Watson’s double helix structure of DNA and plate tectonics were all revealed this way, though no scientist would claim these discoveries are the last word, as the palaeontologist Stephen Jay Gould once pointed out: “In science, ‘fact’ can only mean ‘confirmed to such a degree that it would be perverse to withhold provisional assent’,” he admitted.

Certainly, things can go wrong, as Huxley acknowledged. Science may be organised common sense but all too often a beautiful theory created this way has been skewered by “a single ugly fact”, as he put it. Think of Fred Hoyle’s elegant concept of a steady state universe that is gently expanding and eternal. The idea was at one time considered to be philosophically superior to its rival, the big bang theory that proposed the cosmos erupted into existence billions of years ago. The latter idea explained the expansion of the universe by recourse to a vast explosion. The former accounted for this expansion in more delicate, intriguing terms.

The steady state theory continued to hold its own until, in 1964, radio-astronomers Arno Penzias and Robert Woodrow Wilson noted interference on their radio telescope at the Bell Labs in New Jersey and tried to eliminate it. The pair went as far as shovelling out the pigeon droppings in the telescope and had the guilty pigeons shot (each blamed the other for giving the order). Yet the noise persisted. Only later did the two scientists realise what they were observing. The static hiss they were picking up was caused by a microwave radiation echo that had been set off when the universe erupted into existence after its big bang birth.

That very ugly fact certainly ruined Hoyle’s beautiful theory and, no doubt, his breakfast when he read about it in his newspaper. But then the pursuit of truth has always been a tricky and cruel business. “It is true that some things come along like that to throw scientists into a tizz but it doesn’t happen very often,” adds Jones. “The trouble is, the BBC thinks it happens every day.”

And this takes us to the nub of the issue: how should science be reported and recorded? How can you take a topic such as climate change, about which there is virtual unanimity of views among scientists, and keep it in the public eye? The dangers of rising greenhouse gas emissions have dramatic implications after all. But simply reporting every tiny shrinkage in polar ice sheets or rise in sea levels will only alienate readers or viewers, a point acknowledged by May. “Newspapers, radio and TV have a duty to engage and there is no point in doing a lot of excellent reporting on a scientific issue if it is boring or trivial. The alternative is to trivialise or distort, thus subordinating substance in the name of attraction. It is a paradox for which I can see no answer.”

Jones agrees. “What we don’t want to do is go back to the days when fawning reporters asked great figures to declaim on scientific issues – or political ones, for that matter. On the other hand, we cannot continue to distort views in the name of balance.” It is a tricky business, but as former Times editor Charlie Wilson once told a member of staff upset at a task’s complexity: “Of course, it’s hard. If it was easy we would get an orang-utan to do it.”

Jones, in highlighting a specific problem for the BBC, has opened up a far wider, far more important issue – the need to find ways to understand how science works and to appreciate its insights and complexities. It certainly won’t be easy.

Can a Candid Climate Modeler Convince Contrarians? (Scientific American)

Intrepid British climate scientist sets out to win over global warming doubters

By Jeremy Lovell and ClimateWire | July 19, 2011

CONVINCING CONTRARIANS: Scientists attempt to win over climate change doubters. Image: Courtesy of NOAA

LONDON — David Stainforth is a brave man. His mission is to try to remove some of the confusion over the climate debate by explaining why uncertainty has to be a part of the computerized climate models that scientists use to forecast the expected impacts of climate change, including more violent storms as well as more flooding and droughts.

Stainforth, a climate modeler and senior research fellow at the London School of Economics, hopes that by coming clean on the degree of difficulty in making such predictions, he and his fellow climate scientists will find it easier to make — and win — the argument that prompt action now is not only necessary but the far cheaper alternative to inaction.

“Governments and people want certainty about what will happen with climate change, so scientists tend to turn to climate modeling. But the models are wrong in so many ways because there are so many uncertainties and unknowns built into them,” Stainforth told ClimateWire here at the Royal Academy’s recent annual Summer Science Exhibition.

“The reason is that they are just that, models, not reality. The bottom line is that they give a quite useful message from science to the adaptation community. But it is all relative and hedged about with qualifications. They give likelihoods not certainties, ranges of probabilities, not absolutes. That is where the discussion then must start, not end,” he added.

It is a bold step to take at a time when the climate skeptics appear to be making the most of the continuing public confusion and denial over the issues shown in repeated polls in the United States and United Kingdom. Skeptics have taken advantage of the revelations of scientific infighting with the leaked emails from the United Kingdom’s University of East Anglia in late 2009. They have also pointed to evidence of some sloppy science by the Intergovernmental Panel on Climate Change to assert that the feared results of climate change may be more fiction than science.

Take that, add the diplomatic bickering and backsliding in international climate change talks, then fold in the news of the continuing global economic crisis and reports that renewable energy will drive up energy costs. You will get a sense that what Stainforth is attempting is a very hard sell.

The ‘trouble’ with climate models

“You can explain in five or 10 minutes why we need to do something about climate change — and do it without using climate models. But it is far harder to persuade people of the degree and speed of what needs to be done without the models, and that is where the trouble starts,” said Stainforth.

“Governments and the media demand certainty. They don’t want uncertainties and probabilities. For example, all our models predict wetter winters and warmer summers, but they are far less certain about wetter or drier summers, and that has major implications for the siting and size of flood defenses,” he explained, referring to dams and levees.

“Climate scientists have moved a long way beyond discussing whether climate change is a threat to our societies and economies. That is settled. But that is not to say they do not still disagree about a lot of things like the design of the models and the degree of change,” he added.

He remains hopeful that the non-scientific public will understand the strong consensus among climate scientists that makes the remaining bickering look small. “There is uncertainty, but there is also probability. By showing and discussing the degree of each in public and with the public, we hope to involve them and therefore get out of the loop and move forward.”

Stainforth’s mission is backed by an array of groups including the United Kingdom’s Natural Environment Research Council, the Economic and Social Research Council and the Centre for Climate Change Economics and Policy as well as the London School of Economics. There is also the Grantham Research Institute on Climate Change and the Environment — headed by Lord Nicholas Stern, whose report on the economics of climate change in 2006 electrified governments worldwide on the issue.

Trying some interactive games

Using literature and interactive games at the Confidence in Climate website, the project sets out to show how probabilities work and why different models may come up with quite widely differing predictions. It then applies this to a composite of theories and observations on the climate conundrum.

“When you make a decision about the future — whether it is based on theory or observation — it is a sort of gamble. You can never know what is going to happen. When we make decisions about how to tackle climate change it is no different,” the website says.

“Because of the uncertainty we can’t be sure exactly what degree of challenge we will face. None the less, some things are clear — uncertainty doesn’t mean ignorance. … We also know that bigger increases in atmospheric greenhouse gas levels are likely to lead to much bigger impacts; the impact of a 4 degree warming is likely to be more than twice the impact of a 2 degree warming,” it adds.

As for Stainforth, he thinks the debate urgently needs to be widened considerably from the rather restricted inner core of scientists, modelers, meteorologists and statisticians who have monopolized it to date.

“We need ecologists, farmers, doctors, anthropologists, sociologists, engineers, psychologists, hydrologists, social scientists. The climate change problem involves everyone and should therefore include everyone,” he said.

“We have to grasp the nettle here and communicate openly the uncertainty, to explain what is uncertain, where, why and to what degree. We don’t want it split into ‘believers’ and ‘unbelievers’; we want people to understand.”

Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC. http://www.eenews.net, 202-628-6500

On Experts and Global Warming (N.Y. Times)

July 12, 2011, 4:01 PM
By GARY GUTTING

Experts have always posed a problem for democracies. Plato scorned democracy, rating it the worst form of government short of tyranny, largely because it gave power to the ignorant many rather than to knowledgeable experts (philosophers, as he saw it). But, if, as we insist, the people must ultimately decide, the question remains: How can we, nonexperts, take account of expert opinion when it is relevant to decisions about public policy?

Once we accept the expert authority of climate science, we have no basis for supporting the minority position.

To answer this question, we need to reflect on the logic of appeals to the authority of experts. First of all, such appeals require a decision about who the experts on a given topic are. Until there is agreement about this, expert opinion can have no persuasive role in our discussions. Another requirement is that there be a consensus among the experts about points relevant to our discussion. Precisely because we are not experts, we are in no position to adjudicate disputes among those who are. Finally, given a consensus on a claim among recognized experts, we nonexperts have no basis for rejecting the truth of the claim.

These requirements may seem trivially obvious, but they have serious consequences. Consider, for example, current discussions about climate change, specifically about whether there is long-term global warming caused primarily by human activities (anthropogenic global warming or A.G.W.). All creditable parties to this debate recognize a group of experts designated as “climate scientists,” whom they cite either in support of or in opposition to their claims about global warming. In contrast to enterprises such as astrology or homeopathy, there is no serious objection to the very project of climate science. The only questions are about the conclusions this project supports about global warming.

There is, moreover, no denying that there is a strong consensus among climate scientists on the existence of A.G.W. — in their view, human activities are warming the planet. There are climate scientists who doubt or deny this claim, but even they show a clear sense of opposing a view that is dominant in their discipline. Nonexpert opponents of A.G.W. usually base their case on various criticisms that a small minority of climate scientists have raised against the consensus view. But nonexperts are in no position to argue against the consensus of scientific experts. As long as they accept the expert authority of the discipline of climate science, they have no basis for supporting the minority position. Critics within the community of climate scientists may have a cogent case against A.G.W., but, given the overall consensus of that community, we nonexperts have no basis for concluding that this is so. It does no good to say that we find the consensus conclusions poorly supported. Since we are not experts on the subject, our judgment has no standing.

It follows that a nonexpert who wants to reject A.G.W. can do so only by arguing that climate science lacks the scientific status needed to be taken seriously in our debates about public policy. There may well be areas of inquiry (e.g., various sub-disciplines of the social sciences) open to this sort of critique. But there does not seem to be a promising case against the scientific authority of climate science. As noted, opponents of the consensus on global warming themselves argue from results of the discipline, and there is no reason to think that they would have had any problem accepting a consensus of climate scientists against global warming, had this emerged.

Some nonexpert opponents of global warming have made much of a number of e-mails written and circulated among a handful of climate scientists that they see as evidence of bias toward global warming. But unless this group is willing to argue from this small (and questionable) sample to the general unreliability of climate science as a discipline, they have no alternative but to accept the consensus view of climate scientists that these e-mails do not undermine the core result of global warming.

I am not arguing the absolute authority of scientific conclusions in democratic debates. It is not a matter of replacing Plato’s philosopher-kings with scientist-kings in our polis. We the people still need to decide (perhaps through our elected representatives) which groups we accept as having cognitive authority in our policy deliberations. Nor am I denying that there may be a logical gap between established scientific results and specific policy decisions. The fact that there is significant global warming due to human activity does not of itself imply any particular response to this fact. There remain pressing questions, for example, about the likely long-term effects of various plans for limiting CO2 emissions, the more immediate economic effects of such plans, and, especially, the proper balance between actual present sacrifices and probable long-term gains. Here we still require the input of experts, but we must also make fundamental value judgments, a task that, pace Plato, we cannot turn over to experts.

The essential point, however, is that once we have accepted the authority of a particular scientific discipline, we cannot consistently reject its conclusions. To adapt Schopenhauer’s famous remark about causality, science is not a taxi-cab that we can get in and out of whenever we like. Once we board the train of climate science, there is no alternative to taking it wherever it may go.

Our Extreme Future: Predicting and Coping with the Effects of a Changing Climate (Scientific American)

Adapting to extreme weather calls for a combination of restoring wetlands and building drains and sewers that can handle the water. But leaders and the public are slow to catch on. Final part of a three-part series

By John Carey | Thursday, June 30, 2011

Image: Fikret Onal/Flickr

Editor’s note: This article is the last of a three-part series by John Carey. Part 1, “Storm Warning: Extreme Weather Is a Product of Climate Change,” was posted on June 28. Part 2, “Global Warming and the Science of Extreme Weather,” was posted on June 29.

Extreme weather events have become both more common and more intense. And increasingly, scientists have been able to pin at least part of the blame on humankind’s alteration of the climate. What’s more, the growing success of this nascent science of climate attribution (finding the telltale fingerprints of climate change in extreme events) means that researchers have more confidence in their climate models—which predict that the future will be even more extreme.

Are we prepared for this future? Not yet. Indeed, the trend is in the other direction, especially in Washington, D.C., where a number of members of Congress even argue that climate change itself is a hoax.

Scientists hope that rigorously identifying climate change’s contribution to individual extreme events can indeed wake people up to the threat. As the research advances, it should be possible to say that two extra inches (five centimeters) of rain poured down in a Midwestern storm because of greenhouse gases, or that a California heat wave was 10 times more likely to occur thanks to humans’ impacts on climate. So researchers have set up rapid response teams to assess climate change’s contribution to extreme events while the events are still fresh in people’s minds. In addition, the Intergovernmental Panel on Climate Change (IPCC) is preparing a special report on extreme events and disasters, due out by the end of 2011. “It is important for us to emphasize that climate change and its impacts are not off in the future, but are here and now,” explained Rajendra Pachauri, chair of the IPCC, during a briefing at United Nations climate talks in Cancún last December.

The message is beginning to sink in. The Russian government, for instance, used to doubt the existence of climate change, or argue that it might be beneficial for Russia. But now, government officials have realized that global warming will not bring a gradual and benign increase in temperatures. Instead, they’re likely to see more crippling heat waves. As Russian President Dmitry Medvedev told the Security Council of the Russian Federation last summer: “Everyone is talking about climate change now. Unfortunately, what is happening now in our central regions is evidence of this global climate change, because we have never in our history faced such weather conditions.”

Doubts persist despite evidence

Among the U.S. public, the feeling is different. Opinion polls and anecdotal reports show that most Americans do not perceive a threat from climate change. And a sizable number of Americans, including many newly elected members of Congress, do not even believe that climate change exists. Extreme weather? Just part of nature, they say. After all, disastrous floods and droughts go back to the days of Noah and Moses. Why should today’s disasters be any different? Was the July 23, 2010, storm that spawned Les Scott’s record hailstone evidence of a changing climate, for instance? “Not really,” Scott says. “It was just another thunderstorm. We get awful bad blizzards that are a lot worse.”

And yes, 22 of Maryland’s 23 counties were declared natural disaster areas after record-setting heat and drought in 2010. “It was the worst corn crop I ever had,” says fourth-generation farmer Earl “Buddy” Hance. But was it a harbinger of a more worrisome future? Probably not, says Hance, the state’s secretary of agriculture. “As farmers we are skeptical, and we need to see a little more. And if it does turn out to be climate change, farmers would adapt.” By then, adaptation could be really difficult, frets Minnesota organic farmer Jack Hedin, whose efforts to raise the alarm are “falling on deaf ears,” he laments.

Many scientists share Hedin’s worry. “The real honest message is that while there is debate about how much extreme weather climate change is inducing now, there is very little debate about its effect in the future,” says Michael Wehner, staff scientist at Lawrence Berkeley National Laboratory and member of the lead author teams of the interagency U.S. Climate Change Science Program’s Synthesis and Assessment reports on climate extremes. For instance, climate models predict that by 2050 Russia will have warmed up so much that every summer will be as warm as the disastrous heat wave it just experienced, says Richard Seager of Columbia University’s Lamont–Doherty Earth Observatory. In other words, many of today’s extremes will become tomorrow’s everyday reality. “Climate change will throw some significant hardballs at us,” says Martin Hoerling, a research meteorologist at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo. “There will be a lot of surprises that we are not adapted to.”

A dusty future

One of the clearest pictures of this future is emerging for the U.S. Southwest and a similar meteorological zone that stretches across Italy, Greece and Turkey. Work by Tim Barnett of the Scripps Institution of Oceanography, Seager and others predicts that these regions will get hotter and drier—and, perhaps more important, shows that the change has already begun. “The signal of a human influence on climate pops up in 1985, then marches on getting stronger and stronger,” Barnett says. By the middle of the 21st century, the models predict, the climate will be as dry as the seven-year-long Dust Bowl drought of the 1930s or the damaging 1950s drought centered in California and Mexico, Seager says: “In the future the drought won’t last just seven years. It will be the new norm.”

That spells trouble. In the Southwest the main worry is water—water that makes cities like Los Angeles and Las Vegas possible and that irrigates the enormously productive farms of California’s Central Valley. Supplies are already tight. During the current 11-year dry spell, the demand for water from the vast Colorado River system, which provides water to 30 million people and irrigates four million acres (1.6 million hectares) of cropland, has exceeded the supply. The result: water levels in the giant Lake Mead reservoir dropped to a record low in October (before climbing one foot, or 30 centimeters, after torrential winter rains in California reduced the demand for Colorado River water). Climate change will just make the problem worse. “The challenge will be great,” says Terry Fulp, deputy regional director of the U.S. Department of the Interior’s Bureau of Reclamation’s Lower Colorado Region. “I rank climate change as probably my largest concern. When I’m out on my boat on Lake Mead, it’s on my mind all the time.”

The Southwest is just a snapshot of the challenges ahead. Imagine the potential peril to regions around the world, scientists say. “Our civilization is based on a stable base climate—it doesn’t take very much change to raise hell,” Scripps’s Barnett says. And given the lag in the planet’s response to the greenhouse gases already in the atmosphere, many of these changes are coming whether we like them or not. “It’s sort of like that Kung Fu guy who said, ‘I’m going to kick your head off now, and there’s not a damn thing you can do about it,'” Barnett says.

Grassroots action

Although efforts to fight climate change are now stalled in Washington, many regions do see the threat and are taking action both to adapt to the future changes and to try to limit the amount of global warming itself. The Bureau of Reclamation’s Lower Colorado Region office, for instance, has developed a plan to make “manageable” cuts in the amounts of water that the river system supplies, which Fulp hopes will be enough to get the region through the next 15 years. In Canada, after experiencing eight extreme storms (of more than one-in-25-year intensity) between 1986 and 2006, Toronto has spent hundreds of millions of dollars to upgrade its sewer and storm water system for handling deluges. “Improved storm drains are the cornerstone of our climate adaptation policy,” explains Michael D’Andrea, Toronto’s director of water infrastructure management.

In Iowa, even without admitting that climate change is real, farmers are acting as if it is, spending millions of dollars to alter their practices. They are adding tile drainage to their fields to cope with increased floods, buying bigger machinery to move more quickly because their planting window has become shorter, planting a month earlier than they did 50 years ago, and sowing twice as many corn plants per acre to exploit the additional moisture, says Gene Takle, professor of meteorology at Iowa State University in Ames. “Iowa’s floods are in your face—and in your basement—evidence that the climate has changed, and the farmers are adapting,” he says.

Local officials have seen the connection, too. After the huge floods of 2008, the Iowa town of Cedar Falls passed an ordinance requiring that anyone who lives in the 500-year flood plain must have flood insurance—up from the previous 200-year flood requirement. State Sen. Robert Hogg wants to make the policy statewide. He also is pushing to restore wetlands that can help soak up floodwaters before they devastate cities. “Wetland restoration costs money, but it’s cheaper than rebuilding Cedar Rapids,” he says. “I like to say that dealing with climate change is not going to require the greatest sacrifices, but it is going to require the greatest foresight Americans have ever had.”

Right now, that foresight looks more like myopia, many scientists worry. So when and how will people finally understand that far more is needed? It may require more flooded basements, more searing heat waves, more water shortages or crop failures, more devastating hurricanes or other examples of the increases in extreme weather that climate change will bring. “I don’t want to root for bad things to happen, but that’s what it will take,” says one government scientist who asked not to be identified. Or as Nashville resident Rich Hays says about his own experience with the May 2010 deluge: “The flood was definitely a wake-up call. The question is: How many wake-up calls do we need?”

Reporting for this story was funded by the Pew Center on Global Climate Change.

Global Warming and the Science of Extreme Weather (Scientific American)

How rising temperatures change weather and produce fiercer, more frequent storms. Second of a three-part series

By John Carey | Wednesday, June 29, 2011

HURRICANE KATRINA battered New Orleans in 2005. Image: NOAA

Editor’s note: This article is the second of a three-part series by John Carey. Part 1, posted on June 28, is “Storm Warning: Extreme Weather Is a Product of Climate Change”.

Extreme floods, prolonged droughts, searing heat waves, massive rainstorms and the like don’t just seem like they’ve become the new normal in the last few years—they have become more common, according to data collected by reinsurance company Munich Re (see Part 1 of this series). But has this increase resulted from human-caused climate change or just from natural climatic variations? After all, recorded floods and droughts go back to the earliest days of mankind, before coal, oil and natural gas made the modern industrial world possible.

Until recently scientists had only been able to say that more extreme weather is “consistent” with climate change caused by greenhouse gases that humans are emitting into the atmosphere. Now, however, they can begin to say that the odds of having extreme weather have increased because of human-caused atmospheric changes—and that many individual events would not have happened in the same way without global warming. The reason: The signal of climate change is finally emerging from the “noise”—the huge amount of natural variability in weather.

Scientists compare the normal variation in weather with rolls of the dice. Adding greenhouse gases to the atmosphere loads the dice, increasing the odds of such extreme weather events. It’s not just that the weather dice are altered, however. As Steve Sherwood, co-director of the Climate Change Research Center at the University of New South Wales in Australia, puts it, “it is more like painting an extra spot on each face of one of the dice, so that it goes from 2 to 7 instead of 1 to 6. This increases the odds of rolling 11 or 12, but also makes it possible to roll 13.”
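Sherwood’s painted-die analogy can be checked by simple enumeration. The following is an illustrative sketch (the helper function and variable names are ours, not from the article): one die keeps its usual faces, the other gets an extra spot painted on every face.

```python
import itertools
from fractions import Fraction

normal = range(1, 7)  # a fair die: faces 1-6
loaded = range(2, 8)  # one extra spot painted on every face: 2-7

def odds_at_least(die_a, die_b, threshold):
    """Exact probability that the sum of two dice meets the threshold."""
    outcomes = list(itertools.product(die_a, die_b))
    hits = sum(1 for a, b in outcomes if a + b >= threshold)
    return Fraction(hits, len(outcomes))

print(odds_at_least(normal, normal, 11))  # 1/12 with two fair dice
print(odds_at_least(normal, loaded, 11))  # 1/6 once one die is painted
print(odds_at_least(normal, loaded, 13))  # 1/36 -- impossible with fair dice
```

The doubling of the odds of rolling 11 or higher, plus the newly possible 13, is exactly the shape of the claim: familiar extremes become more frequent, and previously impossible extremes enter the range of outcomes.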

Why? Basic physics is at work: The planet has already warmed roughly 1 degree Celsius since preindustrial times, thanks to CO2 and other greenhouse gases emitted into the atmosphere. And for every 1-degree C (1.8 degrees Fahrenheit) rise in temperature, the amount of moisture that the atmosphere can contain rises by 7 percent, explains Peter Stott, head of climate monitoring and attribution at the U.K. Met Office’s Hadley Center for Climate Change. “That’s quite dramatic,” he says. In some places, the increase has been much larger. Data gathered by Gene Takle, professor of meteorology at Iowa State University in Ames, show a 13 percent rise in summer moisture over the past 50 years in the state capital, Des Moines.
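Stott’s 7-percent figure compounds with each additional degree of warming. A back-of-the-envelope sketch (the function is ours; the 7 percent per degree is the approximate scaling quoted above):

```python
def moisture_capacity_increase(delta_t_c, rate_per_degree=0.07):
    """Fractional increase in the atmosphere's water-vapor capacity
    after delta_t_c degrees C of warming, assuming ~7% per degree."""
    return (1 + rate_per_degree) ** delta_t_c - 1

# One degree of warming: about 7% more moisture capacity.
print(round(moisture_capacity_increase(1.0) * 100, 1))  # 7.0
# Two degrees: the increases compound to about 14.5%, not just 14%.
print(round(moisture_capacity_increase(2.0) * 100, 1))  # 14.5
```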

The physics of too much rain

The increased moisture in the atmosphere inevitably means more rain. That’s obvious. But not just any kind of rain, the climate models predict. Because of the large-scale energy balance of the planet, “the upshot is that overall rainfall increases only 2 to 3 percent per degree of warming, whereas extreme rainfall increases 6 to 7 percent,” Stott says. The reason again comes from physics. Rain happens when the atmosphere cools enough for water vapor to condense into liquid. “However, because of the increasing amount of greenhouse gases in the troposphere, the radiative cooling is less efficient, as less radiation can escape to space,” Stott explains. “Therefore the global precipitation increases less, at about 2 to 3 percent per degree of warming.” But because of the extra moisture, when precipitation does occur (in both rain and snow), it’s more likely to be in bigger events.

Iowa is one of many places that fits the pattern. Takle documented a three- to seven-fold increase in high rainfall events in the state, including the 500-year Mississippi River flood in 1993, the 2008 Cedar Rapids flood as well as the 500-year event in 2010 in Ames, which inundated the Hilton Coliseum basketball court in eight feet (2.5 meters) of water. “We can’t say with confidence that the 2010 Ames flood was caused by climate change, but we can say that the dice are loaded to bring more of these events,” Takle says.

And more events seem to be in the news every month, from unprecedented floods in Riyadh, Saudi Arabia, to massive snowstorms that crippled the U.S. Northeast in early 2011, to the November 2010 to January 2011 torrents in Australia that flooded an area the size of Germany and France. This “disaster of biblical proportions,” as local Australian officials called it, even caused global economic shock waves: The flooding of the country’s enormously productive coal mines sent world coal prices soaring.

More stormy weather

More moisture and energy in the atmosphere, along with warmer ocean temperatures, also mean more intense hurricanes, many scientists say. In fact, 2010 was the first year in decades in which two simultaneous category 4 hurricanes, Igor and Julia, formed in the Atlantic Ocean. In addition, the changed conditions bring an increased likelihood of more powerful thunderstorms with violent updrafts, like a July 23, 2010, tempest in Vivian, S.D., that produced hailstones that punched softball-size holes through roofs—and created a behemoth ball of ice measured at a U.S. record 8 inches (20 centimeters) in diameter even after it had partially melted. “I’ve never seen a storm like that before—and hope I’ll never go through anything like it,” says Les Scott, the Vivian farmer and rancher who found the hailstone.

Warming the planet alters large-scale circulation patterns as well. Scientists know that the sun heats moist air at the equator, causing the air to rise. As it rises, the air cools and sheds most of its moisture as tropical rain. Once six to 10 miles (9.5 to 16 kilometers) aloft, the now dry air travels toward the poles, descending when it reaches the subtropics, normally at the latitude of the Baja California peninsula. This circulation pattern, known as a Hadley cell, contributes to desertification, trade winds and the jet stream.

On a warmer planet, however, the dry air will travel farther north and south from the equator before it descends, climate models predict, making areas like the U.S. Southwest and the Mediterranean even drier. Such an expanded Hadley cell would also divert storms farther north. Are the models right? Richard Seager of Columbia University’s Lamont–Doherty Earth Observatory has been looking for a climate change–induced drying trend in the Southwest, “and there seems to be some tentative evidence that it is beginning to happen,” he says. “It gives us confidence in the models.” In fact, other studies show that the Hadley cells have not only expanded, they’ve expanded more than the models predicted.

Such a change in atmospheric circulation could explain both the current 11-year drought in the Southwest and Minnesota’s status as the number one U.S. state for tornadoes last year. On October 26, 2010, the Minneapolis area even experienced record low pressure in what Paul Douglas, founder and CEO of WeatherNation in Minnesota, dubbed a “landicane”—a hurricanelike storm that swept across the country. “I thought the windows of my home would blow in,” Douglas recalls. “I’ve chased tornados and flown into hurricanes but never experienced anything like this before.” Yet it makes sense in the context of climate change, he adds. “Every day, every week, another piece of the puzzle falls into place,” he says. “More extreme weather seems to have become the rule, not just in the U.S. but in Europe and Asia.”

The rise of climate attribution

Is humankind really responsible? That’s where the burgeoning field of climate attribution, pioneered by Hadley’s Peter Stott and other scientists, comes in. The idea is to look for trends in the temperature or precipitation data that provide evidence of overall changes in climate. When those trends exist, it then becomes possible to calculate how much climate change has contributed to extreme events. Or in more technical terms, the probability of a particular temperature or rainfall amount is shaped roughly like a bell curve. A change in climate shifts the whole curve. That, in turn, increases the likelihood of experiencing the more extreme weather at the tail end of the bell curve. Whereas day-to-day weather remains enormously variable, the underlying human-caused shift in climate increases the power and number of the events at the extreme. The National Oceanic and Atmospheric Administration’s (NOAA) Deke Arndt puts it more colorfully: “Weather throws the punches, but climate trains the boxer,” he says. By charting the overall shift, then, it’s possible to calculate the increased chances of extreme events due to global warming.
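The bell-curve argument can be made concrete with a toy calculation. Here is a sketch using Python’s standard-library `statistics.NormalDist`; the mean, spread, and threshold are invented for illustration and are not real climate data:

```python
from statistics import NormalDist

# Hypothetical summer daily-high temperatures, in degrees C.
baseline = NormalDist(mu=15.0, sigma=1.5)
shifted = NormalDist(mu=16.0, sigma=1.5)  # same variability, curve moved 1 degree

threshold = 19.0  # an extreme "heat wave" value out in the right tail
p_before = 1 - baseline.cdf(threshold)
p_after = 1 - shifted.cdf(threshold)

# Shifting the whole curve by one degree makes this particular tail
# event roughly six times more likely, even though day-to-day
# variability (sigma) is unchanged.
print(p_after / p_before)
```

This is why a seemingly small shift in average climate shows up first, and most dramatically, in the frequency of extremes.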

This idea was already in the air in 2003 when Stott traveled through the worst heat wave in recorded European history on a wedding anniversary trip to Italy and Switzerland. One of the striking consequences he noticed was that the Swiss mountains were missing their usual melodious tinkling of cowbells. “There was no water in the mountains, and the farmers had to take all their cows down in the valley,” he says. He decided to see if he could pin part of the blame on climate change after he returned to his office in Exeter, England. “I didn’t expect to get a positive result,” he says.

But he did. In fact, the signal of a warming climate was quite clear in Europe, even using data up to only 2000. In a landmark paper in Nature, Stott and colleagues concluded that the chances of a heat wave like the 2003 event have more than doubled because of climate change. (Scientific American is part of Nature Publishing Group.) Data collected since then show that the odds are at least four times higher compared with pre-industrial days. “We are very aware of the risks of misattribution,” Stott says. “We don’t want to point to specific events and say that they are part of climate change when they really are due to natural variability. But for some events, like the 2003 heat wave, we have the robust evidence to back it up.”

Case in point: Hurricane Katrina

Another event with a clear global warming component, says Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research (NCAR) in Boulder, Colo., was Hurricane Katrina. Trenberth calculated that the combination of overall planetary warming, elevated moisture in the atmosphere, and higher sea-surface temperatures meant that “4 to 6 percent of the precipitation—an extra inch [2.5 centimeters] of rain—in Katrina was due to global warming,” he says. “That may not sound like much, but it could be the straw that breaks the camel’s back or causes a levee to fail.” It was also a very conservative estimate. “The extra heat produced as moisture condenses can invigorate a storm, and at a certain point, the storm just takes off,” he says. “That would certainly apply to Nashville.” So climate change’s contribution to Katrina could have been twice as high as his calculations show, he says. Add in higher winds to the extra energy, and it is easy to see how storms can become more damaging.

This science of attribution is not without controversies. Another case in point: the 2010 Russian heat wave, which wiped out one quarter of the nation’s wheat crop and darkened the skies of Moscow with smoke from fires. The actual meteorological cause is not in doubt. “There was a blocking of the atmospheric circulation,” explains Martin Hoerling, a research meteorologist at NOAA’s Earth System Research Laboratory, also in Boulder. “The jet stream shifted north, bringing a longer period of high pressure and stagnant weather conditions.” But what caused the blocking? Hoerling looked for an underlying long-term temperature trend in western Russia that might have increased the odds of a heat wave, as Stott had done for the 2003 European event. He found nothing. “The best explanation is a rogue black swan—something that came out of the blue,” he says.

Wrong, retorts NCAR’s Trenberth. He sees a clear expansion of the hot, dry Mediterranean climate into western Russia that is consistent with climate change predictions—and that also intensified the Pakistan monsoon. “I completely repudiate Marty—and it doesn’t help to have him saying you can’t attribute the heat wave to climate change,” he says. “What we can say is that, as with Katrina, this would not have happened the same way without global warming.”

Yet even this dispute is smaller than it first appears. What is not in doubt is that the Russian heat wave is a portent—a glimpse of the future predicted by climate models. Even Hoerling sees it as a preview of coming natural disasters. By 2080, such events are expected to happen, on average, once every five years, he says: “It’s a good wake-up call. This type of phenomenon will become radically more common.”

Storm Warnings: Extreme Weather Is a Product of Climate Change (Scientific American)

More violent and frequent storms, once merely a prediction of climate models, are now a matter of observation. Part 1 of a three-part series

By John Carey | Tuesday, June 28, 2011

DROWNING: The Souris River overflowed levees in Minot, N.D., as seen here on June 23. Image: Patrick Moes/U.S. Army Corps of Engineers

In North Dakota the waters kept rising. Swollen by more than a month of record rains in Saskatchewan, the Souris River topped its all-time record high, set back in 1881. The floodwaters poured into Minot, North Dakota’s fourth-largest city, and spread across thousands of acres of farms and forests. More than 12,000 people were forced to evacuate. Many lost their homes to the floodwaters.

Yet the disaster unfolding in North Dakota might be bringing even bigger headlines if such extreme events hadn’t suddenly seemed more common. In this year alone massive blizzards have struck the U.S. Northeast, tornadoes have ripped through the nation, mighty rivers like the Mississippi and Missouri have flowed over their banks, and floodwaters have covered huge swaths of Australia as well as displaced more than five million people in China and devastated Colombia. And this year’s natural disasters follow on the heels of a staggering litany of extreme weather in 2010, from record floods in Nashville, Tenn., and Pakistan, to Russia’s crippling heat wave.

These patterns have caught the attention of scientists at the National Climatic Data Center in Asheville, N.C., part of the National Oceanic and Atmospheric Administration (NOAA). They’ve been following the recent deluges’ stunning radar pictures and growing rainfall totals with concern and intense interest. Normally, floods of the magnitude now being seen in North Dakota and elsewhere around the world are expected to happen only once in 100 years. But one of the predictions of climate change models is that extreme weather—floods, heat waves, droughts, even blizzards—will become far more common. “Big rain events and higher overnight lows are two things we would expect with [a] warming world,” says Deke Arndt, chief of the center’s Climate Monitoring Branch. Arndt’s group had already documented a stunning rise in overnight low temperatures across the U.S. So are the floods and spate of other recent extreme events also examples of predictions turned into cold, hard reality?

Increasingly, the answer is yes. Scientists used to say, cautiously, that extreme weather events were “consistent” with the predictions of climate change. No more. “Now we can make the statement that particular events would not have happened the same way without global warming,” says Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research (NCAR) in Boulder, Colo.

That’s a profound change—the difference between predicting something and actually seeing it happen. The reason is simple: The signal of climate change is emerging from the “noise”—the huge amount of natural variability in weather.

Extreme signals

There are two key lines of evidence. First, it’s not just that we’ve become more aware of disasters like North Dakota or last year’s Nashville flood, which caused $13 billion in damage, or the massive 2010 summer monsoon in Pakistan that killed 1,500 people and left 20 million more homeless. The data show that the number of such events is rising. Munich Re, one of the world’s largest reinsurance companies, has compiled the world’s most comprehensive database of natural disasters, reaching all the way back to the eruption of Mount Vesuvius in A.D. 79. Researchers at the company, which obviously has a keen financial interest in trends that increase insurance risks, add 700 to 1,000 natural catastrophes to the database each year, explains Mark Bove, senior research meteorologist in Munich Re’s catastrophe risk management office in Princeton, N.J. The data indicate a small increase in geologic events like earthquakes since 1980 because of better reporting. But the increase in the number of climate disasters is far larger. “Our figures indicate a trend towards an increase in extreme weather events that can only be fully explained by climate change,” says Peter Höppe, head of Munich Re’s Geo Risks Research/Corporate Climate Center: “It’s as if the weather machine had changed up a gear.”

The second line of evidence comes from a nascent branch of science called climate attribution. The idea is to examine individual events like a detective investigating a crime, searching for telltale fingerprints of climate change. Those fingerprints are showing up—in the autumn floods of 2000 in England and Wales that were the worst on record, in the 2003 European heat wave that caused 14,000 deaths in France, in Hurricane Katrina—and, yes, probably even in Nashville. This doesn’t mean that the storms or hot spells wouldn’t have happened at all without climate change, but as scientists like Trenberth say, they wouldn’t have been as severe if humankind hadn’t already altered the planet’s climate.

This new science is still controversial. There’s an active debate among researchers about whether the Russian heat wave bears the characteristic signature of climate change or whether it was just natural variability, for instance. Some scientists worry that trying to attribute individual events to climate change is counterproductive in the larger political debate, because it’s so easy to dismiss the claim by saying that the planet has always experienced extreme weather. And some researchers who privately are convinced of the link are reluctant to say so publicly, because global warming has become such a target of many in Congress.

But the evidence is growing for a link between the emissions of modern civilization and extreme weather events. And that has the potential to profoundly alter the perception of the threats posed by climate change. No longer is global warming an abstract concept, affecting faraway species, distant lands or generations far in the future. Instead, climate change becomes personal. Its hand can be seen in the corn crop of a Maryland farmer ruined when soaring temperatures shut down pollination, or in the $13 billion in damage in Nashville, with the Grand Ole Opry flooded and sodden homes reeking of rot. “All of a sudden we’re not talking about polar bears or the Maldives any more,” says Nashville-based author and environmental journalist Amanda Little. “Climate change translates into mold on my baby’s crib. We’re talking about homes and schools and churches and all the places that got hit.”

Drenched in Nashville

Indeed, the record floods in Nashville in May 2010 show how quickly extreme weather can turn ordinary life into a nightmare. The weekend began innocuously. The forecast was a 50 percent chance of rain. Musician Eric Normand and his wife Kelly were grateful that the weather event they feared, a tornado, wasn’t anticipated. Eric’s Saturday concert in a town south of Nashville should go off without a hitch, he figured.

He was wrong. On Saturday, it rained—and rained. “It was a different kind of rain than any I had experienced in my whole life,” says Nashville resident Rich Hays. Imagine the torrent from an intense summer thunderstorm, the sort of deluge that prompts you to duck under an underpass for a few minutes until the rain stops and it’s safe to go on, Little says. It was like that, she recalls—except that on this weekend in May 2010 it didn’t stop. Riding in the bus with his fellow musicians, Normand “looked through a window at a rain-soaked canopy of green and gray,” he wrote later. Scores of cars were underwater on the roads they had just traveled. What should have been a short bus gig stretched into 14 hours, “one of the most stressful and terrifying we had ever experienced,” Normand says.

And still it rained—more than 13 inches (33 centimeters) that weekend. The water rose in Little’s basement—one foot, two feet, three feet (one meter) deep. “You get this panicky feeling that things are out of control,” she says. Over at Hays’s home, fissures appeared in the basement floor, and streams of water turned into a “full-on river,” Hays recalls. Then in the middle of the night, “I heard this massive crack, almost like an explosion,” he says. The force of the water had fractured the house’s concrete foundation. He and his wife spent the rest of the night in fear that the house might collapse.

Sunday morning, Normand went out in the deluge to ask his neighbor if he knew when the power might go back on—it was then he realized that his normal world had vanished. A small creek at the bottom of the hill was now a lake one-half mile (0.8 kilometer) wide, submerging homes almost up to their second stories. “My first reaction was disbelief,” Normand says. He and his family were trapped, without power and surrounded by flooded roads. “We were just freaked out,” he recalls.

And all across the flooded city the scenes were surreal, almost hallucinatory, Little says. “There were absurdities heaped upon absurdities. Churches lifted off foundations and floating down streets. Cars floating in a herd down highways.” In her own basement her family’s belongings bobbed like debris in a pond.

By the time the deluge ended, more than 13 inches (33 centimeters) of rain had fallen, as recorded at Nashville’s airport. The toll: 31 people dead, more than $3 billion in damage—and an end to the cherished perception that Nashville was safe from major weather disasters. “A community that had never been vulnerable to this incredible force of nature was literally taken by storm,” Little says.

But can the Nashville deluge, the North Dakota floods and the many other extreme weather events around the world be connected with the greenhouse gases that humans have spewed into the atmosphere? Increasingly the answer seems to be yes. Whereas it will never be possible to say that any particular event was caused by climate change, new science is teasing out both the contributions that it makes to individual events—and the increase in the odds of extreme weather occurring as a result of climate change.

Reviewing the Nisbet ‘Climate Shift’ Report and Controversial Claims of Media Progress (Yale Forum on Climate Change & The Media)

by John Wihbey | July 11, 2011

Matt Nisbet’s ‘Climate Shift’ research report raised headline-grabbing points on fundraising successes by those advocating action on climate change. But it’s what lies behind those headlines — and relating specifically to media coverage — that also warrants further review and analysis.

Few pieces of recent academic research on climate change have stirred up as much controversy as American University professor Matthew Nisbet’s April 2011 report “Climate Shift: Clear Vision for the Next Decade of Public Debate.”

The report’s biggest headline-grabbing finding — that the environmental lobby is now holding its own in the money race with industry groups opposing carbon regulations — doubtless will generate further analysis, and one can imagine more such annual scorecards assessing this power struggle. And the questions “Climate Shift” raises about the relative political wisdom — or lack of same — in pushing the failed cap-and-trade bill in Congress may well be debated by historians for years to come.

Perhaps the most underappreciated facet of the scholarship that Nisbet put forth, however, involves his analysis of media coverage in the years 2009-2010, contained in his provocatively titled chapter 3, “The Death of a Norm: Evaluating False Balance in News Coverage.”

According to Nisbet’s story-by-story analysis that covers the vertiginous period involving Copenhagen, the so-called “climategate” hacked e-mails, and federal cap-and-trade, the mainstream media — represented in his analysis by The New York Times, CNN.com, The Wall Street Journal, Politico, and The Washington Post — basically moved past the oft-criticized journalistic mode of “he said, she said,” or “false balance.” In its place, those media generally reflected the “consensus science” as backed by organizations such as the U.N.’s IPCC and the National Academy of Sciences and most of its international counterparts. (The opinion pages of the Journal are bracketed as an exception, and Nisbet’s analysis shows that its editorials do indeed continue to cast doubt on climate science.)

Nisbet’s assertion is a profound one, with significant implications. His stated goal with “Climate Shift” is to help reorient the priorities of groups trying to combat global change through the promotion of science and smart messaging to the public. (See companion posting based on author’s extensive e-mail interview with Nisbet.)

“[I]f trend-setting national media have overwhelmingly portrayed the consensus views on the fundamentals of climate science (as the report’s findings indicate),” Nisbet wrote in a recent e-mail interview with The Yale Forum, “then we should be turning to other types of media organizations in our engagement efforts and focusing on other dimensions of coverage, including … subsidizing the ability of local and regional media to cover climate change and energy insecurity as these challenges relate to their region and communities.” These are ideas Nisbet has raised also in previous reports.

Lines of Criticism

Bloggers at Media Matters do criticize how Nisbet interprets his data around the “climategate” period — one of the few on-the-numbers critiques. Nisbet responds that changes in coverage since then are either not “statistically significant” or “not meaningful.”

Other than that, few have questioned the particulars of Nisbet’s labor-intensive analysis of how those five outlets performed. Their selection — and the exclusion of others — though, is the subject of debate.

Nisbet says he chose those specific news outlets because they set the news agenda and have high-volume traffic, as reflected in Nielsen-tabulated figures. CNN.com, the Post and the Times ranked numbers 4, 5 and 9, respectively, in terms of web traffic in 2009. But given that news aggregators such as Yahoo, AOL, and Google ranked 1, 3, and 6, respectively, one might think that Nisbet’s universe of analysis did not capture the true flow of public news information.

The combined traffic of the aggregators is nearly twice that of the news sites Nisbet focused on. Admittedly, these aggregators would be a moving target, and an empirical analysis of the quality of the news they link to would be difficult. But that is where some huge portion of the public gets its news and information, and therefore its impressions and opinions.

(One other quibble, about the selection of Politico: Nisbet calls it “the paper ‘the White House wakes up to,’ as memorably headlined in a profile at The New York Times.” In fact, the article he cites is really just a profile of Politico reporter Mike Allen and his important day calendar “Playbook” blog. Though Politico is powerful and prolific, what constitutes “the paper of record for members of Congress,” as Nisbet puts it, may be an issue of reasonable disagreement among media watchers.)

Climate communications expert and University of Colorado-Boulder professor Max Boykoff was one of the formal reviewers for the “Climate Shift” report. He told The Yale Forum in an e-mail interview, “Overall, I found [Nisbet’s] work in Chapter 3 to be good. As he assembled it I spoke with Matt multiple times. (Chapter 3 was the part of the report I most focused on). We discussed how to replicate the methods and approaches that I undertook in my work on empirically testing the accuracy of coverage about human contributions to climate change (aka, the ‘balance as bias’ thesis). His methods and findings (re: WSJ op-ed divergence etc.) appeared valid and reliable.”

Still, Boykoff stated a potentially striking limitation of this type of analysis in his reviewer comments submitted back to Nisbet: Such analysis “still isn’t equipped to gauge how one particular carefully/prominently/well- or ill-timed article or commentary could have a much greater influence on public perceptions and views than consistently inaccurate treatment. In other words, the sometimes haphazard nature of media consumption — from skimming articles to just hearing/watching portions of a segment — isn’t accounted for through this approach. At the end of the day, these studies … struggle to account for ‘selective listening’ or ‘selective reading’ that we actually engage in during our daily lives.”

Boykoff also said he told Nisbet that his (Nisbet’s) research had not provided sufficient support for the “Climate Shift” report’s contention that “even in a world of blogs and fragmented audiences, the coverage appearing at these outlets strongly shapes the news decisions made at the broadcast and cable networks and informs the decisions of policymakers.”

The Fox News Question

Other notable criticisms of Nisbet’s approach in Chapter 3 of his report have focused on his exclusion of television sources, particularly Fox News. Prolific blogger and energy/climate expert Joseph Romm, who leveled ferocious criticism of Nisbet on his “Climate Progress” blog, makes much of this point. This dispute is a tricky one, resting on a difficult-to-resolve social science debate about how “persuade-able” the Fox News audience is, and just how best to measure the impacts of its huge ratings and online readership as part of American political consciousness.

In his comments to The Yale Forum, Nisbet replied, “As I discuss in the report, the audience for Fox News and political talk radio tend to be strongly self-selecting with consumption of these media tending to reinforce the views of those already doubtful or dismissive of climate change (approximately 25 percent of Americans).” Moreover, he says it “is not clear how these unsurprising findings would help us to move forward since any level of engagement with Fox News producers or talk radio hosts is unlikely to lead to changes in their coverage patterns. We can complain about and criticize these outlets, but much of the criticism and anger, I would argue, often ends up distracting us from initiatives where we can make a difference with journalists, editors, and with different publics.”

This latter point, of course, highlights an important facet of Nisbet’s project, namely that it has a particular goal, an “agenda” even, that puts an emphasis on both utility, or making a “difference,” and on truth as criteria for inquiry. (It’s possible this is where he opens the door for controversy, as it leaves him open to criticisms that he is downplaying conservative media and thereby painting an unduly positive picture of the U.S. media as a whole on climate issues.)

Columbia Journalism Review science editor Curtis Brainard told The Yale Forum recently that he thinks the spirit of Nisbet’s report is basically right in Chapter 3, at least as it relates to “news reporters and news articles.” For Nisbet and Brainard both, broad accusations that public ignorance is the media’s “fault” are no longer well-founded.

“There is this conventional wisdom floating around out there that journalists are inept, rarely able to get their facts straight or explain or deliver an accurate account of events,” Brainard wrote in an e-mail. “They’re not. But it’s much easier for activists and other policy or program stakeholders to blame the media when things don’t go their way than to analyze the much more complicated interplay of multiple factors.”

(As an aside, Brainard notes that he wrote about precisely this dynamic in his recent article, “Tornadoes and Climate Change,” which pushes back against such charges leveled by environmental writer and activist Bill McKibben. Brainard says McKibben is too quick to condemn the media as a whole for not making connections between various extreme weather events.)

We’re past those earlier days, Brainard told The Yale Forum, when the basic questions about climate science were portrayed in most mainstream news media as unsettled: “The coverage has become so much more sophisticated since then, delving into the specific consequences of climate change, from sea level rise, to changing precipitation and drought patterns, to consequences for flora and fauna. Many reporters struggle to accurately explain the highly uncertain and nuanced science underlying these phenomena, but the flaws in the coverage are quite different from the false balance that was on exhibit before, say, 2006. First of all, there is nowhere near as much scientific consensus about these finer points of climate science as there is about the fundamentals (i.e., the Earth is warming, and humans are most likely to blame), so today’s stories are really apples compared with yesterday’s oranges.”

Work Ahead for Media, Scholars

If Nisbet’s report has an underlying flaw, it may be in its packaging, particularly in its “Move On”-style message and its ambition to deliver a definitive verdict. Its real virtue is that, whether or not one buys it all, it has very effectively started a different kind of conversation. And given that just five outlets were analyzed in the report, there is certainly much more conversation to be had.

As mentioned, Nisbet has said he is already carrying out new research and further study on local and regional media. (See his latest thoughts on this issue as they relate to Chicago.) It’s a cause on which all academics and media professionals and critics might agree, as the business model for such outlets continues to erode. Local information ecosystems are changing, shifting, and in many cases decaying. But many observers point out how essential they remain.

“It would also be good to look at the practically countless number of local TV network affiliates across the country since, collectively, they are where most Americans still get their news,” Brainard also noted.

“Local newspapers, as Pew has documented, remain at the center of the local media ecosystem, with the overwhelming number of regional/local issues covered by local TV news and at local blogs originating from local newspaper coverage,” Nisbet said. “In this sense, on climate change and energy, we should think about local and regional newspapers as being part of the central communication infrastructure that regions and communities need to learn, connect, plan and make collective choices on the issue.”

Perhaps, through further studies by Nisbet and others, this important work on local and regional media — their shortcomings and needs — can shed additional light.

John Wihbey is a regular contributor to the Yale Forum. He is a journalist and researcher, and he can be reached at jpwihb@yahoo.com.

La Niña’s Exit Leaves Climate Forecasts in Limbo (NASA)

06.29.11

The latest satellite data of Pacific Ocean sea surface heights from the NASA/European Ocean Surface Topography Mission/Jason-2 satellite show near-normal conditions in the equatorial Pacific. The image is based on the average of 10 days of data centered on June 18, 2011. Higher (warmer) than normal sea surface heights are indicated by yellows and reds, while lower (cooler) than normal sea surface heights are depicted in blues and purples. Green indicates near-normal conditions. Image credit: NASA/JPL Ocean Surface Topography Team

It’s what Bill Patzert, a climatologist and oceanographer at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., likes to call a “La Nada” – that puzzling period between cycles of the El Niño-Southern Oscillation climate pattern in the Pacific Ocean when sea surface heights in the equatorial Pacific are near average.

The comings and goings of El Niño and La Niña are part of a long-term, evolving state of global climate, for which measurements of sea surface height are a key indicator. For the past three months, since last year’s strong La Niña event dissipated, data collected by the U.S.-French Ocean Surface Topography Mission (OSTM)/Jason-2 oceanography satellite have shown that the equatorial Pacific sea surface heights have been stable and near average. Elsewhere, however, the northeastern Pacific Ocean remains quite cool, with sea levels much lower than normal. The presence of cool ocean waters off the U.S. West Coast has also been a factor in this year’s cool and foggy spring there.

The current state of the Pacific is shown in this OSTM/Jason-2 image, based on the average of 10 days of data centered on June 18, 2011. The image depicts places where Pacific sea surface height is higher (warmer) than normal as yellow and red, while places where the sea surface is lower (cooler) than normal are shown in blue and purple. Green indicates near-normal conditions. Sea surface height is an indicator of how much of the sun’s heat is stored in the upper ocean.

For oceanographers and climate scientists like Patzert, “La Nada” conditions can bring with them a high degree of uncertainty. While some forecasters (targeting the next couple of seasons) have suggested La Nada will bring about “normal” weather conditions, Patzert cautions that previous protracted La Nadas have often delivered unruly jet stream patterns and wild weather swings.

In addition, some climatologists are pondering whether a warm El Niño pattern (which often follows La Niña) may be lurking over the horizon. Patzert says that would be perfectly fine for the United States.

“For the United States, there would be some positives to the appearance of El Niño this summer,” Patzert said. “The parched and fire-ravaged southern tier of the country would certainly benefit from a good El Niño soaking. Looking ahead to late August and September, El Niño would also tend to dampen the 2011 hurricane season in the United States. We’ve had enough wild and punishing weather this year. Relief from the drought across the southern United States and a mild hurricane season would be very welcome.”

Jason-2 scientists will continue to monitor Pacific Ocean sea surface heights for signs of El Niño, La Niña or prolonged neutral conditions.

JPL manages the U.S. portion of the OSTM/Jason-2 mission for NASA’s Science Mission Directorate, Washington, D.C.

For more information on NASA’s ocean surface topography missions, visit: http://sealevel.jpl.nasa.gov/missions/.

To view the latest Jason-1 and OSTM/Jason-2 data, visit: http://sealevel.jpl.nasa.gov/science/elninopdo/latestdata/.

Alan Buis 818-354-0474
Jet Propulsion Laboratory, Pasadena, Calif.
Alan.buis@jpl.nasa.gov

2011-199

IPCC Sharpens Scientific Rigor and Communication Strategies (FAPESP)

S&T POLICY
In a Climate of Dialogue

Carlos Fioravanti
Print Edition 184 – June 2011

From the Andes to the Amazon: bartonellosis bacteria spread. © EDUARDO CESAR

The Intergovernmental Panel on Climate Change (IPCC) is in the midst of a reformulation. It is expected to strengthen the scientific rigor with which its team of scientists works and to become more responsive to the concerns of international negotiators such as Sir John Beddington, chief scientific adviser to the government of the United Kingdom (see interview). On May 11, the first day of a workshop of the FAPESP Research Program on Global Climate Change (PFPMCG), Beddington warned of the probably dramatic consequences of climate change, urbanization, and food and water scarcity around the world. Two days later, on May 13, in Abu Dhabi, capital of the United Arab Emirates, the IPCC’s leaders announced that they would adopt the recommendations on changes to working methods and communication strategies proposed by the InterAcademy Council (IAC), which underpin the changes now under way.

In April 2010 the United Nations, which maintains the IPCC, had asked the IAC to form an independent committee to review the IPCC’s procedures, after the panel had lost credibility following the release of a series of e-mail messages indicating that some predictions about the effects of climate change had been hasty. One of them was that the Himalayan glaciers would disappear by 2035. “The errors, though small, had an immense effect,” Robbert Dijkgraaf, a member of the IAC, president of the Royal Netherlands Academy of Arts and Sciences and professor at the University of Amsterdam, the Netherlands, told Pesquisa FAPESP. “They should have been corrected immediately, but the IPCC did not think there was any need for communication or explanation, since the measures it presented were consensual.”

Dijkgraaf followed the work of the IAC committee, which brought together 12 experts from science academies and research councils of several countries, among them Brazil, represented by Carlos Henrique de Brito Cruz, scientific director of FAPESP. “The IPCC’s leaders accepted most of our recommendations and suggestions,” commented economist Harold Shapiro, professor and former president of Princeton University, in the United States, and chair of the committee.

The IAC committee’s recommendations call for changes in governance and management, in the methods for reviewing scientific work, in the characterization and communication of scientific uncertainties, and in communication strategies. “Any organization needs to review itself from time to time, because times change,” Shapiro told Pesquisa FAPESP. The IAC committee suggested that the IPCC chair serve only one term and that the panel’s entire working approach be reviewed every four to six years.

The IAC suggested that the IPCC spell out more clearly how its technical documents are to be reviewed and that it present a wider range of scientific views, including those subject to controversy. Another relevant point: making scientific uncertainties explicit. “The IPCC and climate scientists must acknowledge more clearly what they know and also what they do not know,” said Dijkgraaf. Another recommendation followed to the letter: implementing a communication strategy that emphasizes transparency and quick, satisfactory responses to anyone interested. “The IPCC must become more interactive, and climate scientists more critical of what they do.”

Collaborations – Opening the workshop of the FAPESP Research Program on Global Climate Change (PFPMCG), Shaun Quegan, a researcher at the University of Sheffield, United Kingdom, commented: “Annual estimates of deforested areas in tropical forests are precise and highly reliable, at least in Brazil.” However, he added, “using these data to assess carbon emissions from land-use change involves great uncertainties, mainly because the mapping of forest biomass is precarious.” The goal of the meeting was to encourage integration among the teams of the various research projects that make up the PFPMCG, now coordinated by Reynaldo Luiz Victoria, a researcher at the University of São Paulo, who replaced climatologist Carlos Nobre.

In one of the presentations on the second day, physician Manuel Cesario, a researcher at the University of Franca (Unifran), reported on his study of the spread of infectious diseases in the Amazon (amplified by the changes in land use brought about by road paving, deforestation and urbanization) and of climate change in South America. Researchers from the University of São Paulo, São Paulo State University, the Federal University of Bahia, the Federal University of Santa Catarina and the Oswaldo Cruz Foundation take part in this work.

Cesario believes that bartonellosis, a bacterial disease with symptoms similar to those of malaria, previously restricted to Andean regions at altitudes of 500 to 3,200 meters, may have expanded geographically and adapted to lower regions in the wake of growing migration and climate change. In his view, this disease, first detected in 2004 in the Madre de Dios region of southeastern Peru, could easily cross the border into Acre, in Brazil, and Pando, in Bolivia. The cities of this region are increasingly interconnected by the extension of the BR-317 highway, the Interoceanic Highway, also known as the Estrada do Pacífico (Pacific Road), already in operation and almost entirely paved.

Leishmaniasis is also advancing. “The two forms of leishmaniasis in Brazil, visceral and cutaneous, used to be diseases associated with deforestation, transmitted by typically forest-dwelling vectors, but today they are linked to urbanization as well as deforestation,” he said. Bartonellosis and leishmaniasis are transmitted by insects of the genus Lutzomyia, which are abundant in the region. In 2008 Cesario and his team traveled through the municipality of Assis Brasil, setting traps from six in the evening to six in the morning to capture insects. In one week they collected more than 3,000 insects belonging to 56 species of Lutzomyia. “Houses with gaps in the walls, close to the forest and with livestock nearby,” he said, “form the ideal environment for insects that leave their natural habitats, use leftover organic material to reproduce and suck the blood of animals, coming closer to people and transmitting the diseases.”

Academic Adaptation and “The New Communications Climate” (Open the Echo Chamber blog)

Posted by Edward R. Carr (31 May 2011)

[Original post here].

Andrew Revkin has a post up on Dot Earth that suggests some ways of rethinking scientific engagement with the press and the public.  The post is something of a distillation of a more detailed piece in the WMO Bulletin.  Revkin was kind enough to solicit my comments on the piece, as I have appeared in Dot Earth before in an effort to deal with this issue as it applies to the IPCC, and this post draws on my initial rapid response.

First, I liked the message of these two pieces a lot, especially the push for a more holistic engagement with the public through different forms of media, including the press.  As Revkin rightly states, we need to “recognize that the old model of drafting a press release and waiting for the phone to ring is not the path to efficacy and impact.” Someone please tell my university communications office.

A lot of the problem stems from our lack of engagement with professionals in the messaging and marketing world.  As I said to the very gracious Rajendra Pachauri in an email exchange back when we had the whole “don’t talk to the media” controversy:

I am in no way denigrating your [PR] efforts. I am merely suggesting that there are people out there who spend their lives thinking about how to get messages out there, and control that message once it is out there. Just as we employ experts in our research and in these assessment reports precisely because they bring skills and training to the table that we lack, so too we must consider bringing in those with expertise in marketing and outreach.

I assume that a decent PR team would be thinking about multiple platforms of engagement, much as Revkin is suggesting.  However, despite the release of a new IPCC communications strategy, I’m not convinced that the IPCC (or much of the global change community more broadly) yet understands how desperately we need to engage with professionals on this front.  In some ways, there are probably good reasons for the lack of engagement with pros, or with the “new media.” For example, I’m not sure Twitter will help with managing climate change rumors/misinformation as it is released, if only because we are now too far behind the curve – things are so politicized that it is too late for “rapid response” to misinformation. I wish we’d been on this twenty years ago, though . . .

But this “behind the curve” mentality does not explain our lack of engagement.  Instead, I think there are a few other things lurking here.  For example, there is the issue of institutional politics. I love the idea of using new media/information and communication technologies for development (ICT4D) to gather and communicate information, but perhaps not in the ways Revkin suggests.  I have a section later in Delivering Development that outlines how, using existing mobile tech in the developing world, we could both get better information about what is happening to the global poor (the point of my book is that, as I think I demonstrate in great detail, we actually have a very weak handle on what is going on in most parts of the developing world) and could empower the poor to take charge of efforts to address the various challenges, environmental, economic, political and social, that they face every day.  It seems to me, though, that the latter outcome is a terrifying prospect for some in development organizations, as this would create a much more even playing field of information that might force these organizations to negotiate with and take seriously the demands of the people with whom they are working.  Thus, I think we get a sort of ambiguity about ICT4D in development practice, where we seem thrilled by its potential, yet continue to ignore it in our actual programming.  This is not a technical problem – after all, we have the tech, and if we want to do this, we can – it is a problem of institutional politics.  I did not wade into a detailed description of the network I envision in the book because I meant to present it as a political challenge to a continued reticence on the part of many development organizations and practitioners to really engage the global poor (as opposed to telling them what they need and dumping it on them).  But my colleagues and I have a detailed proposal for just such a network . . . and I think we will make it real one day.

Another, perhaps more significant barrier to major institutional shifts with regard to outreach is a chicken-and-egg situation of limited budgets and a dominant academic culture that does not understand media/public engagement or politics very well and sees no incentive for engagement.  Revkin nicely hits on the funding problem as he moves past simply beating up on old-school models of public engagement:

As the IPCC prepares its Fifth Assessment Report, it does so with what, to my eye, appears to be an utterly inadequate budget for communicating its findings and responding in an agile way to nonstop public scrutiny facilitated by the Internet.

However, as much as I agree with this point (and I really, really agree), the problem here is not funding unto itself – it is the way in which a lack of funding erases an opportunity for cultural change that could have a positive feedback effect on the IPCC, global assessments, and academia more generally, radically altering all three. The bulk of climate science, as well as social impact studies, comes from academia – which has a very particular culture of rewards.  Virtually nobody in academia is trained to understand that they can get rewarded for being a public intellectual, for making one’s work accessible to a wide community – and if I am really honest, there are many places that actively discourage this engagement.  But there is a culture change afoot in academia, at least among some of us, that could be leveraged right now – and this is where funding could trigger a positive feedback loop.

Funding matters because once you get a real outreach program going, productive public engagement would result in significant personal, intellectual and financial benefits for the participants that I believe could result in very rapid culture change.  My twitter account has done more for the readership of my blog, and for my awareness of the concerns and conversations of the non-academic development world, than anything I have ever done before – this has been a remarkable personal and intellectual benefit of public engagement for me.  As universities continue to retrench, faculty find themselves ever-more vulnerable to downsizing, temporary appointments, and a staggering increase in administrative workload (lots of tasks distributed among fewer and fewer full-time faculty).  I fully expect that without some sort of serious reversal soon, I will retire thirty-odd years hence as an interesting and very rare historical artifact – a professor with tenure.  Given these pressures, I have been arguing to my colleagues that we must engage with the public and with the media to build constituencies for what we do beyond our academic communities.  My book and my blog are efforts to do just this – to become known beyond the academy such that I, as a public intellectual, have leverage over my university, and not the other way around.  And I say this as someone who has been very successful in the traditional academic model.  I recognize that my life will need to be lived on two tracks now – public and academic – if I really want to help create some of the changes in the world that I see as necessary.

But this is a path I started down on my own, for my own idiosyncratic reasons – to trigger a wider change, we cannot assume that my academic colleagues will easily shed the value systems in which they were intellectually raised, and to which they have been held for many, many years.  Without funding to get outreach going, and demonstrate to this community that changing our model is not only worthwhile, but enormously valuable, I fear that such change will come far more slowly than the financial bulldozers knocking on the doors of universities and colleges across the country.  If the IPCC could get such an effort going, demonstrate how public outreach improved the reach of its results, enhanced the visibility and engagement of its participants, and created a path toward the progressive politics necessary to address the challenge of climate change, it would be a powerful example for other assessments.  Further, the participants in these assessments would return to their campuses with evidence for the efficacy and importance of such engagement . . . and many of these participants are senior members of their faculties, in a position to midwife major cultural changes in their institutions.

All this said, this culture change will not be birthed without significant pains.  Some faculty and members of these assessments will want nothing to do with the murky world of politics, and prefer to continue operating under the illusion that they just produce data and have no responsibility for how it is used.  And certainly the assessments will fear “politicization” . . . to which I respond “too late.”  The question is not if the findings of an assessment will be politicized, but whether or not those who best understand those findings will engage in these very consequential debates and argue for what they feel is the most rigorous interpretation of the data at hand.  Failure to do so strikes me as dereliction of duty.  On the other hand, just as faculty might come to see why public engagement is important for their careers and the work they do, universities will be gripped with contradictory impulses – a publicly-engaged faculty will serve as a great justification for faculty salaries, increased state appropriations, new facilities, etc.  Then again, nobody likes to empower the labor, as it were . . .

In short, in thinking about public engagement and the IPCC, Revkin is dredging up a major issue related to all global assessments, and indeed the practices of academia.  I think there is opportunity here – and I feel like we must seize this opportunity.  We can either guide a process of change to a productive end, or ride change driven by others wherever it might take us.  I prefer the former.