Tag Archives: scientific communication

How to Talk About Climate Change So People Will Listen (The Atlantic)


Environmentalists warn us that apocalypse awaits. Economists tell us that minimal fixes will get us through. Here’s how we can move beyond the impasse. 

Josh Cochran

Not long ago, my newspaper informed me that glaciers in the western Antarctic, undermined by the warmer seas of a hotter world, were collapsing, and their disappearance “now appears to be unstoppable.” The melting of these great ice sheets would make seas rise by at least four feet—ultimately, possibly 12—more than enough to flood cities from New York to Tokyo to Mumbai. Because I am interested in science, I read the two journal articles that had inspired the story. How much time do we have, I wondered, before catastrophe hits?

One study, in Geophysical Research Letters, provided no guidance; the authors concluded only that the disappearing glaciers would “significantly contribute to sea level rise in decades to centuries to come.” But the other, in Science, offered more-precise estimates: during the next century, the oceans will surge by as much as a quarter of a millimeter a year. By 2100, that is, the calamity in Antarctica will have driven up sea levels by almost an inch. The process would get a bit faster, the researchers emphasized, “within centuries.”
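The Science paper's figure is easy to sanity-check with back-of-envelope arithmetic (the 86-year window below is my own assumption, counting from roughly when the studies appeared to 2100):

```python
# Back-of-envelope check of the cited estimate: oceans surging by
# "as much as a quarter of a millimeter a year" through 2100.
rate_mm_per_year = 0.25          # upper-bound rate from the Science paper
years = 2100 - 2014              # assumed window, publication to 2100
rise_mm = rate_mm_per_year * years
rise_inches = rise_mm / 25.4     # 25.4 mm per inch
print(f"{rise_mm:.1f} mm ≈ {rise_inches:.2f} inches")  # 21.5 mm ≈ 0.85 inches
```

Which is how "as much as a quarter of a millimeter a year" becomes "almost an inch" by century's end.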

How is one supposed to respond to this kind of news? On the one hand, the transformation of the Antarctic seems like an unfathomable disaster. On the other hand, the disaster will never affect me or anyone I know; nor, very probably, will it trouble my grandchildren. How much consideration do I owe the people it will affect, my 40-times-great-grandchildren, who, many climate researchers believe, will still be confronted by rising temperatures and seas? Americans don’t even save for their own retirement! How can we worry about such distant, hypothetical beings?


Worse, confronting climate change requires swearing off something that has been an extraordinary boon to humankind: cheap energy from fossil fuels. In the 3,600 years between 1800 B.C. and 1800 A.D., the economic historian Gregory Clark has calculated, there was “no sign of any improvement in material conditions” in Europe and Asia. Then came the Industrial Revolution. Driven by the explosive energy of coal, oil, and natural gas, it inaugurated an unprecedented three-century wave of prosperity. Artificial lighting, air-conditioning, and automobiles, all powered by fossil fuels, swaddle us in our giddy modernity. In our ergonomic chairs and acoustical-panel cubicles, we sit cozy as kings atop 300 years of flaming carbon.

In the best of times, this problem—given its apocalyptic stakes, bewildering scale, and vast potential cost—would be difficult to resolve. But we are not in the best of times. We are in a time of legislative paralysis. In an important step, the Obama administration announced in June its decision to cut power-plant emissions 30 percent by 2030. Otherwise, this country has seen strikingly little political action on climate change, despite three decades of increasingly high-pitched chatter by scientists, activists, economists, pundits, and legislators.

The chatter itself, I would argue, has done its share to stall progress. Rhetorical overreach, moral miscalculation, shouting at cross-purposes: this toxic blend is particularly evident when activists, who want to scare Americans into taking action, come up against economists, with their cool calculations of acceptable costs. Eco-advocates insist that only the radical transformation of society—the old order demolished, foundation to roof—can fend off the worst consequences of climate change. Economists argue for adapting to the most-likely consequences; cheerleaders for industrial capitalism, they propose quite different, much milder policies, and are ready to let nature take a bigger hit in the short and long terms alike. Both envelop themselves in the mantle of Science, emitting a fug of charts and graphs. (Actually, every side in the debate, including the minority who deny that humans can affect the climate at all, claims the backing of Science.) Bewildered and battered by the back-and-forth, the citizenry sits, for the most part, on its hands. For all the hot air expended on the subject, we still don’t know how to talk about climate change.

As an issue, climate change was unlucky: when nonspecialists first became aware of it, in the 1990s, environmental attitudes had already become tribal political markers. As the Yale historian Paul Sabin makes clear in The Bet, it wasn’t always this way. The votes for the 1970 Clean Air Act, for example, were 374–1 in the House, 73–0 in the Senate. Sabin’s book takes off from a single event: a bet between the ecologist Paul R. Ehrlich and the economist Julian Simon a decade later. Ehrlich’s The Population Bomb (1968), which decried humankind’s rising numbers, was a foundational text in the environmental movement. Simon’s Ultimate Resource (1981) was its antimatter equivalent: a celebration of population growth, it awakened opposition to the same movement.

Activists led by Bill McKibben, the founder of 350.org, protest the building of the Keystone XL pipeline at the White House, February 2013. (AP)

Ehrlich was moderately liberal in his politics but unrestrained in his rhetoric. The second sentence of The Population Bomb promised that “hundreds of millions of people” would starve to death within two decades, no matter what “crash programs” the world launched to feed them. A year later, Ehrlich gave even odds that “England will not exist in the year 2000.” In 1974, he told Congress that “a billion or more people” could starve in the 1980s “at the latest.” When the predictions didn’t pan out, he attacked his critics as “incompetent” and “ignorant,” “morons” and “idiots.”

Simon, who died in 1998, argued that “human resourcefulness and enterprise” will extricate us from our ecological dilemma. Moderately conservative in his politics, he was exuberantly uninhibited in his scorn for eco-alarmists. Humankind faces no serious environmental problems, he asserted. “All long-run trends point in exactly the opposite direction from the projections of the doomsayers.” (All? Really?) “There is no convincing economic reason why these trends toward a better life should not continue indefinitely.” Relishing his role as a spoiler, he gave speeches while wearing red plastic devil horns. Unsurprisingly, he attracted disagreement, to which he responded with as much bluster as Ehrlich. Critics, motivated by “blatant intellectual dishonesty” and indifference to the poor, were “corrupt,” their ideas “ignorant and wrongheaded.”

In 1980, the two men wagered $1,000 on the prices of five metals 10 years hence. If the prices rose, as Ehrlich predicted, it would imply that these resources were growing scarcer, as Homo sapiens plundered the planet. If the prices fell, this would be a sign that markets and human cleverness had made the metals relatively less scarce: progress was continuing. Prices dropped. Ehrlich paid up, insisting disingenuously that he had been “schnookered.”

Schnookered, no; unlucky, yes. In 2010, three Holy Cross economists simulated the bet for every decade from 1900 to 2007. Ehrlich would have won 61 percent of the time. The results, Sabin says, do not prove that these resources have grown scarcer. Rather, metal prices crashed after the First World War and spent most of a century struggling back to their 1918 levels. Ecological issues were almost irrelevant.

The bet demonstrated little about the environment but much about environmental politics. The American landscape first became a source of widespread anxiety at the beginning of the 20th century. Initially, the fretting came from conservatives, both the rural hunters who established the licensing system that brought back white-tailed deer from near-extinction and the Ivy League patricians who created the national parks. So ineradicable was the conservative taint that decades later, the left still scoffed at ecological issues as right-wing distractions. At the University of Michigan, the radical Students for a Democratic Society protested the first Earth Day, in 1970, as elitist flimflam meant to divert public attention from class struggle and the Vietnam War; the left-wing journalist I. F. Stone called the nationwide marches a “snow job.” By the 1980s, businesses had realized that environmental issues had a price tag. Increasingly, they balked. Reflexively, the anticorporate left pivoted; Earth Day, erstwhile snow job, became an opportunity to denounce capitalist greed.


The result, as the Emory historian Patrick Allitt demonstrates in A Climate of Crisis, was a political back-and-forth that became ever less productive. Time and again, Allitt writes, activists and corporate executives railed against each other. Out of this clash emerged regulatory syntheses: rules for air, water, toxins. Often enough, businesspeople then discovered that following the new rules was less expensive than they had claimed it would be; environmentalists meanwhile found out that the problems were less dire than they had claimed.


Throughout the 1980s, for instance, activists charged that acid rain from midwestern power-plant emissions was destroying thousands of East Coast lakes. Utilities insisted that anti-pollution equipment would be hugely expensive and make homeowners’ electric bills balloon. One American Electric Power representative predicted that acid-rain control could lead to the “destruction of the Midwest economy.” A 1990 amendment to the Clean Air Act, backed by both the Republican administration and the Democratic Congress, set up a cap-and-trade mechanism that reduced acid rain at a fraction of the predicted cost; electric bills were barely affected. Today, most scientists have concluded that the effects of acid rain were overstated to begin with—fewer lakes were hurt than had been thought, and acid rain was not the only cause.

Rather than learning from this and other examples that, as Allitt puts it, “America’s environmental problems, though very real, were manageable,” each side stored up bitterness, like batteries taking on charge. The process that had led, however disagreeably, to successful environmental action in the 1970s and ’80s brought on political stasis in the ’90s. Environmental issues became ways for politicians to signal their clan identity to supporters. As symbols, the issues couldn’t be compromised. Standing up for your side telegraphed your commitment to take back America—either from tyrannical liberal elitism or right-wing greed and fecklessness. Nothing got done.

As an issue, climate change is perfect for symbolic battle, because it is as yet mostly invisible. Carbon dioxide, its main cause, is not emitted in billowing black clouds, like other pollutants; nor is it caustic, smelly, or poisonous. A side effect of modernity, it has for now a tiny practical impact on most people’s lives. To be sure, I remember winters as being colder in my childhood, but I also remember my home then as a vast castle and my parents as godlike beings.

In concrete terms, Americans encounter climate change mainly in the form of three graphs, staples of environmental articles. The first shows that atmospheric carbon dioxide has been steadily increasing. Almost nobody disputes this. The second graph shows rising global temperatures. This measurement is trickier: carbon dioxide is spread uniformly in the air, but temperatures are affected by a host of factors (clouds, rain, wind, altitude, the reflectivity of the ground) that differ greatly from place to place. Here the data are more subject to disagreement. A few critics argue that for the past 17 years warming has mostly stopped. Still, most scientists believe that in the past century the Earth’s average temperature has gone up by about 1.5 degrees Fahrenheit.

Rising temperatures per se are not the primary concern. What matters most is their future influence on other things: agricultural productivity, sea levels, storm frequency, infectious disease. As the philosopher Dale Jamieson points out in the unfortunately titled Reason in a Dark Time, most of these effects cannot be determined by traditional scientific experiments—white-coats in laboratories can’t melt a spare Arctic ice cap to see what happens. (Climate change has no lab rats.) Instead, thousands of researchers refine ever bigger and more complex mathematical models. The third graph typically shows the consequences such models predict, ranging from worrisome (mainly) to catastrophic (possibly).

Such charts are meaningful to the climatologists who make them. But for the typical citizen they are a muddle, too abstract—too much like 10th-grade homework—to be convincing, let alone to motivate action. In the history of our species, has any human heart ever been profoundly stirred by a graph? Some other approach, proselytizers have recognized, is needed.

To stoke concern, eco-campaigners like Bill McKibben still resort, Ehrlich-style, to waving a skeleton at the reader. Thus the first sentence of McKibben’s Oil and Honey, a memoir of his climate activism, describes 2011–12, the period covered by his book, as “a time when the planet began to come apart.” Already visible “in almost every corner of the earth,” climate “chaos” is inducing “an endless chain of disasters that will turn civilization into a never-ending emergency response drill.”


The only solution to our ecological woes, McKibben argues, is to live simpler, more local, less resource-intensive existences—something he believes is already occurring. “After a long era of getting big and distant,” he writes, “our economy, and maybe our culture, has started to make a halting turn toward the small and local.” Not only will this shift let us avoid the worst consequences of climate change, it will have the happy side effect of turning a lot of unpleasant multinational corporations to ash. As we “subside into a workable, even beautiful, civilization,” we will lead better lives. No longer hypnotized by the buzz and pop of consumer culture, narcotized couch potatoes will be transformed into robust, active citizens: spiritually engaged, connected to communities, appreciative of Earth’s abundance.

For McKibben, the engagement is full throttle: The Oil half of his memoir is about founding 350.org, a group that seeks to create a mass movement against climate change. (The 350 refers to the theoretical maximum safe level, in parts per million, of atmospheric carbon dioxide, a level we have already surpassed.) The Honey half is about buying 70 acres near his Vermont home to support an off-the-grid beekeeper named Kirk Webster, who is living out McKibben’s organic dream in a handcrafted, solar-powered cabin in the woods. Webster, McKibben believes, is the future. We must, he says, “start producing a nation of careful, small-scale farmers such as Kirk Webster, who can adapt to the crazed new world with care and grace, and who don’t do much more damage in the process.”

Poppycock, the French philosopher Pascal Bruckner in effect replies in The Fanaticism of the Apocalypse. A best-selling, telegenic public intellectual (a species that hardly exists in this country), Bruckner is mainly going after what he calls “ecologism,” of which McKibbenites are exemplars. At base, he says, ecologism seeks not to save nature but to purify humankind through self-flagellating asceticism.

To Bruckner, ecologism is both ethnocentric and counterproductive. Ethnocentric because eco-denunciations of capitalism simply give new, green garb to the long-standing Euro-American fear of losing dominance over the developing world (whose recent growth derives, irksomely, from fossil fuels). Counterproductive because ecologism induces indifference, or even hostility to environmental issues. In the quest to force humanity into a puritanical straitjacket of rural simplicity, ecologism employs what should be neutral, fact-based descriptions of a real-world problem (too much carbon dioxide raises temperatures) as bludgeons to compel people to accept modes of existence they would otherwise reject. Intuiting moral blackmail underlying the apparently objective charts and graphs, Bruckner argues, people react with suspicion, skepticism, and sighing apathy—the opposite of the reaction McKibbenites hope to evoke.

The ranchers and farmers in Tony Horwitz’s Boom, a deft and sometimes sobering e-book, suggest Bruckner may be on to something. Horwitz, possibly best known for his study of Civil War reenactors, Confederates in the Attic, travels along the proposed path of the Keystone XL, a controversial pipeline intended to take oil from Alberta’s tar-sands complex to refineries in Steele City, Nebraska—and the project McKibben has used as the rallying cry for 350.org. McKibben set off on his anti-Keystone crusade after the climatologist-provocateur James Hansen charged in 2011 that building the pipeline would be “game over” for the climate. If Keystone were built, Hansen later wrote, “civilization would be at risk.” Everyone Horwitz meets has heard this scenario. But nobody seems to have much appetite for giving up the perks of industrial civilization, Kirk Webster–style. “You want to go back to the Stone Age and use only wind, sun, and water?” one person asks. A truck driver in the tar-sands project tells Horwitz, “This industry is giving me a future, even if it’s a short one and we’re all about to toast together.” Given the scale of the forces involved, individual action seems futile. “It’s going to burn up anyhow at the end,” explains a Hutterite farmer, matter-of-factly. “The world will end in fire.”


Whereas McKibbenites see carbon dioxide as an emblem of a toxic way of life, economists like William Nordhaus of Yale tend to view it as simply a by-product of the good fortune brought by capitalism. Nordhaus, the president of the American Economic Association, has researched climate issues for four decades. His The Climate Casino has an even, unhurried tone; a classic Voice of Authority rumbles from the page. Our carbon-dioxide issues, he says, have a “simple answer,” one “firmly based in economic theory and history”:

The best approach is to use market mechanisms. And the single most important market mechanism that is missing today is a high price on CO2 emissions, or what is called “carbon prices” … The easiest way is simply to tax CO2 emissions: a “carbon tax” … The carbon price [from the tax] will be passed on to the consumer in the form of higher prices.

Nordhaus provides graphs (!) showing how a gradually increasing tax—or, possibly, a market in emissions permits—would slowly and steadily ratchet down global carbon-dioxide output. The problem, as he admits, is that the projected reduction “assumes full participation.” Translated from econo-speak, “full participation” means that the Earth’s rich and populous nations must simultaneously apply the tax. Brazil, China, France, India, Russia, the United States—all must move in concert, globally cooperating.


Alas, nothing like Nordhaus’s planetary carbon tax has ever been enacted. The sole precedent is the Montreal Protocol, the 1987 treaty banning substances that react with atmospheric ozone and reduce its ability to absorb the sun’s harmful ultraviolet radiation. Signed by every United Nations member and successfully updated 10 times, the protocol is a model of international eco-cooperation. But it involves outlawing chemicals in refrigerators and spray cans, not asking nations to revamp the base of their own prosperity. Nordhaus’s declaration that a global carbon tax is a simple answer is like arguing that the simple answer to death is repealing the Second Law of Thermodynamics.

Does climate change, as Nordhaus claims, truly slip into the silk glove of standard economic thought? The dispute is at the center of Jamieson’s Reason in a Dark Time. Parsing logic with the care of a raccoon washing a shiny stone, Jamieson maintains that economists’ discussions of climate change are almost as problematic as those of environmentalists and politicians, though for different reasons.

Remember how I was complaining that all discussions of climate change devolve into homework? Here, sadly, is proof. To critique economists’ claims, Jamieson must drag the reader through the mucky assumptions underlying cost-benefit analysis, a standard economic tool. In the case of climate change, the costs of cutting carbon dioxide are high. What are the benefits? If the level of carbon dioxide in the atmosphere rises only slightly above its current 400 parts per million, most climatologists believe, there is (roughly) a 90 percent chance that global temperatures will eventually rise between 3 and 8 degrees Fahrenheit, with the most likely jump being between 4 and 5 degrees. Nordhaus and most other economists conclude that humankind can slowly constrain this relatively modest rise in carbon without taking extraordinary, society-transforming measures, though neither decreasing the use of fossil fuels nor offsetting their emissions will be cheap or easy. But the same estimates show (again in rough terms) a 5 percent chance that letting carbon dioxide rise much above its current level would set off a domino-style reaction leading to global devastation. (No one pays much attention to the remaining 5 percent chance that the carbon rise would have very little effect on temperature.)

In our daily lives, we typically focus on the most likely result: I decide whether to jaywalk without considering the chance that I will trip in the street and get run over. But sometimes we focus on the extreme: I lock up my gun and hide the bullets in a separate place to minimize the chance that my kids will find and play with them. For climate change, should we focus on adapting to the most probable outcome or averting the most dangerous one? Cost-benefit analyses typically ignore the most-radical outcomes: they assume that society has agreed to accept the small but real risk of catastrophe—something environmentalists, to take one particularly vehement section of society, have by no means done.
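The tension here can be made concrete with a toy expected-damage calculation. The rough probabilities echo the estimates above, but every damage figure is invented purely for illustration:

```python
# Toy expected-damage sketch (probabilities loosely from the text;
# damage values, in arbitrary units, are purely illustrative).
scenarios = [
    ("moderate warming", 0.90, 5),    # the most likely outcome
    ("catastrophe",      0.05, 500),  # the domino-style tail risk
    ("little effect",    0.05, 0),
]
expected = sum(p * d for _, p, d in scenarios)
tail_share = (0.05 * 500) / expected
print(f"expected damage: {expected}")            # 29.5
print(f"share from the tail: {tail_share:.0%}")  # 85%
```

A cost-benefit analysis that sets aside the 5 percent tail can miss most of the expected damage, which is why the choice between planning for the likely and insuring against the worst is no mere technicality.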

On top of this, Jamieson argues, there is a second problem in the models economists use to discuss climate change. Because the payoff from carbon-dioxide reduction will occur many decades from now, Nordhausian analysis suggests that we should do the bare minimum today, even if that means saddling our descendants with a warmer world. Doing the minimum is expensive enough already, economists say. Because people tomorrow will be richer than we are, as we are richer than our grandparents were, they will be better able to pay to clean up our emissions. Unfortunately, this is an ethically problematic stance. How can we weigh the interests of someone born in 2050 against those of someone born in 1950? In this kind of trade-off between generations, Jamieson argues, “there is no plausible value” for how much we owe the future.

Given their moral problems, he concludes, economic models are much less useful as guides than their proponents believe. For all their ostensible practicality—for all their attempts to skirt the paralysis-inducing specter of the apocalypse—economists, too, don’t have a good way to talk about climate change.

Years ago, a colleague and I spoke with the physicist Richard Feynman, later a national symbol of puckish wit and brash truth-telling. At the frontiers of science, he told us, hosts of unclear, mutually contradictory ideas are always swarming about. Researchers can never agree on how to proceed or even on what is important. In these circumstances, Feynman said, he always tried to figure out what would take him forward no matter which theory eventually turned out to be correct. In this agnostic spirit, let’s assume that rising carbon-dioxide levels will become a problem of some magnitude at some time and that we will want to do something practical about it. Is there something we should do, no matter what technical arcana underlie the cost-benefit analyses, no matter when we guess the bad effects from climate change will kick in, no matter how we value future generations, no matter what we think of global capitalism? Indeed, is there some course of action that makes sense even if we think that climate change isn’t much of a problem at all?

As my high-school math teacher used to say, let’s do the numbers. Roughly three-quarters of the world’s carbon-dioxide emissions come from burning fossil fuels, and roughly three-quarters of that comes from just two sources: coal in its various forms, and oil in its various forms, including gasoline. Different studies produce slightly different estimates, but they all agree that coal is responsible for more carbon dioxide than oil is—about 25 percent more. That number is likely to increase, because coal consumption is growing much faster than oil consumption.
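Chaining those fractions gives rough global shares. This is a sketch under the article's own approximations; the "about 25 percent more" ratio is used to split the combined coal-and-oil share:

```python
# Rough global CO2 shares, chaining the fractions in the text.
fossil_share  = 0.75                  # ~3/4 of emissions come from fossil fuels
coal_plus_oil = 0.75 * fossil_share   # ~3/4 of that comes from coal and oil
# Coal emits "about 25 percent more" CO2 than oil: coal = 1.25 * oil.
oil_share  = coal_plus_oil / 2.25     # since oil + 1.25*oil = coal_plus_oil
coal_share = 1.25 * oil_share
print(f"coal ≈ {coal_share:.0%} of global CO2, oil ≈ {oil_share:.0%}")
```

So, by these round numbers, coal alone accounts for something like three-tenths of all carbon-dioxide emissions, the single biggest slice.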


Although coal and oil are both fossil fuels, they are used differently. In this country, for example, the great majority of oil—about three-quarters—is consumed by individuals, as they heat their homes and drive their cars. Almost all U.S. coal (93 percent) is burned not in homes but by electric-power plants; the rest is mainly used by industry, notably for making cement and steel. Cutting oil use, in other words, requires huge numbers of people to change their houses and automobiles—the United States alone has 254 million vehicles on the road. Reducing U.S. coal emissions, by contrast, means regulating 557 big power plants and 227 steel and cement factories. (Surprisingly, many smaller coal plants exist, some at hospitals and schools, but their contributions are negligible.) I’ve been whacking poor old Nordhaus for his ideas about who should pay for climate change, but he does make this point, and precisely: “The most cost-effective way to reduce CO2 emissions is to reduce the use of coal first and most sharply.” Note, too, that this policy comes with a public-health bonus: reining in coal pollution could ultimately avoid as many as 6,600 premature deaths and 150,000 children’s asthma attacks per year in the United States alone.


Different nations have different arrangements, but almost everywhere the basic point holds true: a relatively small number of industrial coal plants—perhaps 7,000 worldwide—put out an amazingly large amount of carbon dioxide, more than 40 percent of the global total. And that figure is rising; last year, coal’s share of energy production hit a 44-year high, because Asian nations are building coal plants at a fantastic rate (and, possibly, because demand for coal-fired electricity will soar as electric cars become popular). No matter what your views about the impact and import of climate change, you are primarily talking about coal. To my mind, at least, retrofitting 7,000 industrial facilities, however mind-boggling, is less mind-boggling than, say, transforming the United States into “a nation of careful, small-scale farmers” or enacting a global carbon tax with “full participation.” It is, at least, imaginable.

The Obama administration’s focus on reducing coal emissions suggests that it has followed this logic. If the pattern of the late 20th century still held, industry would reply with exaggerated estimates of the cost, and compromises would be worked out. But because the environment has become a proxy for a tribal battle, an exercise in power politics will surely ensue. I’ve given McKibben grief for his apocalyptic rhetoric, but he’s exactly correct that without a push from a popular movement—without something like 350.org—meaningful attempts to cut back coal emissions are much less likely to yield results.

Regrettably, 350.org has fixated on the Keystone pipeline, which the Congressional Research Service has calculated would raise this nation’s annual output of greenhouse gases by 0.05 to 0.3 percent. (James Hansen, in arguing that the pipeline would be “game over” for the climate, erroneously assumed that all of the tar-sands oil could be burned rapidly, instead of dribbling out in relatively small portions year by year, over decades.) None of this is to say that exploiting tar sands is a good idea, especially given the apparent violation of native treaties in Canada. But a popular movement focused on symbolic goals will have little ability to win practical battles in Washington.

If politics fail, the only recourse, says David Keith, a Harvard professor of public policy and applied physics, will be a technical fix. And soon—by mid-century. Keith is talking about geo-engineering: fighting climate change with more climate change. A Case for Climate Engineering is a short book arguing that we should study spraying the stratosphere with tiny glittering droplets of sulfuric acid that bounce sunlight back into space, reducing the Earth’s temperature. Physically speaking, the notion is feasible. The 1991 eruption of Mount Pinatubo, in the Philippines, created huge amounts of airborne sulfuric acid—and lowered the Earth’s average temperature that year by about 1 degree.

Keith is candid about the drawbacks. Not only does geo-engineering involve tinkering with planetary systems we only partially understand, it can’t cancel out, even in theory, greenhouse problems like altered rainfall patterns and increased ocean acidity. The sulfur would soon fall to the Earth, a toxic rain of pollution that could kill thousands of people every year. The carbon dioxide that was already in the air would remain. To continue to slow warming, sulfur would have to be lofted anew every year. Still, Keith points out, without this relatively crude repair, unimpeded climate change could be yet more deadly.

Planet-hacking does have an overarching advantage: it’s cheap. “The cost of geoengineering the entire planet for a decade,” Keith writes, “could be less than the $6 billion the Italian government is spending on dikes and movable barriers to protect a single city, Venice, from climate change–related sea level rise.”

That advantage is also dangerous, he points out. A single country could geo-engineer the whole planet by itself. Or one country’s geo-engineering could set off conflicts with another country—a Chinese program to increase its monsoon might reduce India’s monsoon. “Both are nuclear weapons states,” Keith reminds us. According to Forbes, the world has 1,645 billionaires, several hundred of them in nations threatened by climate change. If their businesses or homes were at risk, any one of them could single-handedly pay for a course of geo-engineering. Is anyone certain none of these people would pull the trigger?

Few experts think that relying on geo-engineering would be a good idea. But no one knows how soon reality will trump ideology, and so we may finally have hit on a useful form of alarmism. One of the virtues of Keith’s succinct, scary book is to convince the reader that unless we find a way to talk about climate change, planes full of sulfuric acid will soon be on the runway.

CONCLIMA 2013 – watch videos of all the talks (Rede Clima)


Videos of all the presentations given during the 1st CONCLIMA – the National Conference of Rede CLIMA, the INCT for Climate Change (INCT-MC), and the Fapesp Research Program on Global Climate Change (PFPMCG), held September 9–13 in São Paulo – are now available online. Rede CLIMA has also produced a 30-minute synthesis of the entire conference.

The goal of CONCLIMA was to present the research results and the knowledge generated by these major programs and projects – an ambitious scientific undertaking created by the federal and São Paulo state governments to provide high-quality information on climate studies, the detection of climate variability and climate change, and their impacts on key sectors in Brazil.

Watch the videos:

Video of CONCLIMA – 1st National Conference on Global Climate Change:

Presentations – PDF files

Full presentations – VIDEOS

Opening session


Paulo Nobre – INPE

Iracema Cavalcanti – INPE

Léo Siqueira – INPE

Marcos Heil Costa – UFV

Sérgio Correa – UERJ


Tércio Ambrizzi – USP 

Eduardo Assad – Embrapa

Mercedes Bustamante – UnB


Agriculture – Hilton Silveira Pinto – Embrapa

Water Resources – Alfredo Ribeiro Neto – UFPE

Renewable Energy – Marcos Freitas – COPPE/UFRJ

Biodiversity and Ecosystems – Alexandre Aleixo – MPEG

Natural Disasters – Regina Rodrigues – UFSC

Coastal Zones – Carlos Garcia – FURG

Urbanization and Cities – Roberto do Carmo – Unicamp

Economics – Eduardo Haddad – USP

Health – Sandra Hacon – Fiocruz

Regional Development – Saulo Rodrigues Filho – UnB


The INCT for Climate Change – José Marengo – INPE

Detection, attribution, and natural climate variability – Simone Ferraz – UFSM

Land-use change – Ana Paula Aguiar – INPE

Global Biogeochemical Cycles and Biodiversity – Mercedes Bustamante – UnB

Oceans – Regina Rodrigues – UFSC

REDD – Osvaldo Stella – IPAM

Future Climate Scenarios and Uncertainty Reduction – José Marengo – INPE

Greenhouse Gases – Plínio Alvalá – INPE

Studies of science, technology, and public policy – Myanna Lahsen – INPE

Biosphere–atmosphere interactions – Gilvan Sampaio – INPE

Amazonia – Gilberto Fisch – IAE/DCTA


Early Warning System for Emerging Infectious Diseases in the Western Amazon – Manuel Cesario – Unifran

Climate and population in a region strained by high urbanization and high biodiversity: social and ecological dimensions of climate change – Lucia da Costa Ferreira – Unicamp

Scenarios of climate change impacts on ethanol production to inform public policy – Jurandir Zullo – Unicamp

Hydrological and carbon fluxes – cases from the Amazon Basin and watershed reforestation – Humberto Rocha – USP

The role of rivers in the regional carbon balance – Maria Victoria Ballester – USP

Atmospheric aerosols, radiation balance, clouds, and trace gases associated with land-use change in Amazonia – Paulo Artaxo – USP

Socio-economic impacts of climate change in Brazil: quantitative inputs for the design of public policies – Joaquim José Martins Guilhoto and Rafael Feltran Barbieri – USP

Carbon dioxide emissions from soils in sugarcane areas under different management strategies – Newton La Scala Jr – Unesp

Impact of the Southwest Atlantic Ocean on South America's climate over the 20th and 21st centuries – Tércio Ambrizzi – USP


Presentation by Sergio Margulis – SAE – Office of the President of Brazil

Presentation by Gustavo Luedemann (MCTI)

Presentation by Carlos Klink (SMCQ/MMA)

Presentation by Couto Silva (MMA): status of the preparation of the National Adaptation Plan; operation of the Adaptation Working Group and its thematic networks; proposed timetable; proposed structure for the plan.

Presentation by Alexandre Gross (FGV): thematic scope of the National Adaptation Plan: presentation of the report on the temporal, spatial, and thematic dimensions of adaptation to climate change (Deliverable 4), process and results of the Adaptation Working Group, collection of contributions, and discussion.

Round table: Climate change, extremes, and natural disasters

Presentation by Rafael Schadeck – CENAD

Presentation by Marcos Airton de Sousa Freitas – ANA

Round table: The relationship between science and sectoral plans; public policy

Presentation by Carlos Nobre – SEPED/MCTI

Presentation by Luiz Pinguelli Rosa (COPPE UFRJ, FBMC)

Presentation by Eduardo Viola – UnB

Round table: Inventories and monitoring of GHG emissions and removals

Presentation by Gustavo Luedemann – MCTI


Presentation by Patrícia Pinho – IGBP/INPE

Presentation by Paulo Artaxo – USP

John Oliver Does Science Communication Right (I Fucking Love Science)

May 15, 2014 | by Stephen Luntz

Photo credit: Last Week Tonight With John Oliver (HBO). Satirist John Oliver shows how scientific pseudo-debates should be covered.

One of the most frustrating experiences for scientists, science communicators, and anyone who cares about science is the sight of media outlets giving equal time to positions held by a tiny minority of researchers.

This sort of behavior turns up for all sorts of concocted "controversies", satirized as "Opinions differ on the shape of the Earth". The most egregious examples, however, occur in reporting on climate change. Thousands of carefully researched, peer-reviewed papers are weighed in the balance and judged equal to a handful of shoddily written, numerically flaky publications whose flaws take less than a day to come to light.

That is, of course, if you ignore the places where the anti-science side pretty much gets free rein.

So it is a delight to see John Oliver show how it should be done.

We have only one problem with Oliver's work. He repeats the claim that 97% of climate scientists agree that humans are warming the planet. In fact, the study he referred to found that 97.1% of peer-reviewed papers on climate change endorse this position. But those papers were usually produced by large research teams, while the opposing minority were often cooked up by a couple of kooks in their garage. Count the scientists involved, and the split is actually 98.4% to 1.2%, with the rest undecided. That might not sound like a big difference, but it would make Oliver's tame "skeptic" look even more lonely.
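The distinction between the share of papers and the share of scientists can be made concrete with a toy calculation. All counts below are invented for illustration (they are not the study's figures); the point is only that large endorsing teams and small dissenting ones pull the two percentages apart:

```python
# Toy illustration: paper-share vs. scientist-share of a consensus.
# All counts are hypothetical, chosen only to show the mechanism.

endorsing_papers, authors_per_endorsing = 97, 6   # big research teams
rejecting_papers, authors_per_rejecting = 3, 2    # small dissenting teams

paper_share = endorsing_papers / (endorsing_papers + rejecting_papers)

endorsing_authors = endorsing_papers * authors_per_endorsing
rejecting_authors = rejecting_papers * authors_per_rejecting
author_share = endorsing_authors / (endorsing_authors + rejecting_authors)

print(f"share of papers endorsing:     {paper_share:.1%}")   # 97.0%
print(f"share of scientists endorsing: {author_share:.1%}")  # 99.0%
```

Whenever the endorsing side averages more authors per paper, the scientist-level consensus comes out higher than the paper-level one, which is the effect the article describes.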
HT Vox, with a nice summary of the evidence

Read more at http://www.iflscience.com/environment/john-oliver-does-science-communication-right

Brazil hosts conference on the public communication of science (Fapesp)

Issues in both the practice and the study of science communication are on the program of the event, to be held in Salvador; discounted registration runs until April 7 (PCST)


Agência FAPESP – For the first time in Latin America, the International Network on Public Communication of Science and Technology (PCST) will hold its international conference, from May 5 to 8. Among its main goals is fostering debate on public engagement with science and technology.

Made up of people from around the world who work in and study the public communication of science and technology, the PCST Network sponsors conferences, online discussions, and other activities that promote new ideas and perspectives.

This year's central theme is "Science communication for social inclusion and political engagement". The idea is to show that, despite investment in science and technology, most of the world still faces social exclusion and uneven development, pushing rich and poor countries ever further apart. The role of science communication is to create opportunities for citizen action that can help narrow that gap.

In the plenary session "Science communication and social media", participants will hear talks by Dominique Brossard, of the University of Wisconsin, in the United States, a member of the PCST Network's scientific committee, and by Mohammed Yahia, editor of Nature Middle East, from Egypt.

Besides the PCST Network, the event is organized by the Museu da Vida, of the Fundação Oswaldo Cruz (Fiocruz), and by the Laboratory for Advanced Studies in Journalism (Labjor) at the University of Campinas (Unicamp).

Discounted registration is open until April 7. The conference will take place at the Hotel Pestana Bahia, Rua Fonte do Boi, 216, in Salvador.

More information: http://www.pcst-2014.org/index.php/pt-BR/

Against storytelling of scientific results (Nature Methods)

Yarden Katz

Nature Methods 10, 1045 (2013) doi:10.1038/nmeth.2699 – Published online

30 October 2013

To the Editor:

Krzywinski and Cairo1 beautifully illustrate the widespread view that scientific writing should follow a journalistic 'storytelling' model, wherein the choice of what data to plot, and how, is tailored to the message the authors want to deliver. However, they do not discuss the pitfalls of the approach, which often results in a distorted and unrepresentative display of data—one that does not do justice to experimental complexities and their myriad interpretations.

If we project the features of great storytellers onto a scientist, the result is a portrait of a scientist far from ideal. Great storytellers embellish and conceal information to evoke a response in their audience. Inconvenient truths are swept away, and marginalities are spun to make a point more spectacular. A storyteller would plot the data in the way most persuasive rather than most informative or representative.

Storytelling encourages the unrealistic view that scientific projects fit a singular narrative. Biological systems are difficult to measure and control, so nearly all experiments afford multiple interpretations—but storytelling actively denies this fact of science.

The ‘story-told’ scientific paper is a constrictive mapping between figures and text. Figures produced by masters of scientific storytelling are so tightly controlled to match the narrative that the reader is left with little to ponder or interpret. Critical reading of such papers becomes a detective’s game, in which one reads between the lines for clues of data relegated to a supplement for their deviance from ‘the story’.

Dissecting the structure of scientific papers, Bruno Latour explains the utility of the storytelling approach in giving readers the sense that they are evaluating the data along with the authors while simultaneously persuading them of the story. The storytelling way to achieve this is “to lay out the text so that wherever the reader is there is only one way to go”2—or as Krzywinski and Cairo put it, “Inviting readers to draw their own conclusions is risky”1. Authors prevent this by “carefully stacking more black boxes, less easily disputable arguments”2. This is consistent with the visualization advice that Krzywinski and Cairo give: the narrower and more processed the display of the data is to fit the story, the more black boxes are stacked, making it harder for the reader to access data raw enough to support alternative models or ‘stories’.

Readers and authors know that complex experiments afford multiple interpretations, and so such deviances from the singular narrative must be present somewhere. It would be better for both authors and readers if these could be discussed openly rather than obfuscated. For those who plan to follow up on the results, these discrepancies are often the most important. Storytelling therefore impedes communication of critical information by restricting the scope of the data to that agreeable with the story.

Problems arise when experiments are driven within a storytelling framework. In break rooms of biology research labs, one often hears: “It’d be a great story if X regulated Y by novel mechanism Z.” Experiments might be prioritized by asking, “Is it important for your story?” Storytelling poses a dizzying circularity: before your findings are established, you should decide whether these are the findings you would like to reach. Expectations of a story-like narrative can also be demoralizing to scientists, as most experimental data do not easily fold into this framing.

Finally, a great story in the journalistic sense is a complete one. Papers that make the unexplained observations transparent get penalized in the storytelling framework as incomplete. This prevents the communal puzzle-solving that arises by piecing together unexplained observations from multiple papers.

The alternative to storytelling is the usual language of evidence and arguments that are used—with varying degrees of certainty—to support models and theories. Speaking of models and their evidence goes back to the oldest of scientific discourse, and this framing is also standard in philosophy and law. This language allows authors to discuss evidence for alternative models without imposing a singular journalistic-like story.

There might be other roles for storytelling. Steven McKnight’s lab recently found, entirely unexpectedly, that a small molecule can be used to purify a complex of RNA-binding proteins in the cell, revealing a wide array of striking biological features3. It is that kind of story of discovery—what François Jacob called “night science”—that is often best suited for storytelling, though these narratives are often deemed by scientists as irrelevant ‘fluff’.

As practiced, storytelling shares more with journalism than with science. Journalists seek a great story, and the accompanying pressures sometimes lead to distortion in the portrayal of events in the press. When exerted on scientists, these pressures can yield similar results. Storytelling encourages scientists to design experiments according to what constitutes a ‘great story’, potentially closing off unforeseen avenues more exciting than any story imagined a priori. For the alternative framing to be adopted, editors, reviewers and authors (particularly at the higher-profile journals) will have to adjust their evaluation criteria and reward authors who choose representative displays while discussing alternative models to their own.


  1. Krzywinski, M. & Cairo, A. Nat. Methods 10, 687 (2013).
  2. Latour, B. Science in Action (Harvard Univ. Press, 1987).
  3. Baker, M. Nat. Methods 9, 639 (2012).

When journalists talk about science (Fapesp)

Martin Bauer was one of the speakers on the FAPESP Week London panel on scientific culture (photo: LSE)


By Fernando Cunha, in London

Agência FAPESP – How journalists portray science, the tension that arises when reporters try to explain to the public what science does, and the way science appears in fiction – sometimes stereotypically – were among the issues discussed in the panel on scientific culture held on Friday (September 27), the last day of FAPESP Week London 2013.

Martin Bauer, of the London School of Economics, spoke about a research project he coordinates that will create indicators of how mobilized the scientific world is to communicate science. According to Bauer, controversial topics that provoke political mobilization, such as transgenics and climate change, always come with a teachable moment.

The project proposes to build, by the end of 2015, a system of indicators based on parameters such as public attention to scientific information, readers' aspirations in terms of well-being, the perceived contribution of science to public culture, and the relative distance between the recipient of scientific information and science itself.

"We are interested in content analysis to understand the effectiveness of the language used and, in particular, in the concept of the authority of science in conveying research content to the public," Bauer said.

The planned indicators fall into three groups: public perception of science; the quantitative presence of science in the press and in the media's cultural output – spanning the various sections of newspapers as well as radio and TV programming, including telenovelas; and the scientific topics covered.

The research and data collection will be carried out mainly in European countries and in India, and will also include Brazil, in collaboration with the Laboratory for Advanced Studies in Journalism at the University of Campinas (Labjor/Unicamp) and the Fundação Oswaldo Cruz (Fiocruz).

Public interest

Marcelo Leite, a reporter at the newspaper Folha de S. Paulo, discussed science journalism and the public perception of science in Brazil, drawing on an analysis of published stories about global climate change.

"I am convinced that science journalism can serve as a model that all forms of journalism could and should follow, because it goes beyond opinions, beliefs, and ideologies to get as close as we can hope to the truth," Leite said. "Science journalism teaches readers about the process of scientific research, trying to undo the misperception that scientists hold eternal truths."

For Leite, the presence of climate change in the media – and in particular the press coverage of the fifth assessment report of the United Nations' Intergovernmental Panel on Climate Change (IPCC), released in late September – offers an interesting opportunity to analyze coverage and how its content is perceived.

A survey by Brazil's Ministry of Science, Technology and Innovation, completed in 2010, showed that Brazilian public attention to science has been growing, reaching 65% of the citizens polled.

"On the environment, interest rose from 58% in 2006 to 82% in 2010, probably driven by issues related to the Amazon and climate change, which seem to attract most people," he said.

As for press coverage of climate-related topics, the survey found a concentration of stories on politics – focused on negotiations and on mitigating the consequences of greenhouse gas emissions (50% of the stories) – and on deforestation (23%).

The panel also featured Philip Macnaghten, of Durham University and a visiting professor at Unicamp, who discussed the prospects for responsible research and innovation in Brazil, and Maria Immacolata Vassalo de Lopes, a professor at the School of Communications and Arts (ECA) of the University of São Paulo (USP), who spoke about the creation of Obitel, an Ibero-American network for the study of television fiction formed in 2005 in Colombia, with researchers from 12 countries, including Brazil.

The project was supported by FAPESP, USP, the National Council for Scientific and Technological Development (CNPq), Globo Universidade (TV Globo), and Ibope.

Held by FAPESP in the British capital from September 25 to 27, with support from the Royal Society and the British Council, FAPESP Week London promoted discussion of advanced research topics to expand opportunities for collaboration between Brazilian and European scientists in the fields of biodiversity, climate change, health sciences, bioenergy, nanotechnology, and communication.

Drought in the semi-arid region set to worsen in the coming years – and other Conclima-related articles (Fapesp)

Researchers warn of the need for urgent adaptation and mitigation measures against the climate change impacts projected for the region (photo: Fred Jordão/Acervo ASACom)


By Elton Alisson

Agência FAPESP – The prolonged drought currently afflicting Brazil's semi-arid region is expected to worsen in the coming years because of global climate change. Urgent adaptation and mitigation measures are therefore needed, along with a rethink of the kinds of economic activity suited to the region.

That was the assessment of researchers who took part in the discussions on regional development and natural disasters held on September 10 during the 1st National Conference on Global Climate Change (Conclima).

Organized by FAPESP in partnership with the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), the event runs until Friday (September 13) at Espaço Apas in São Paulo.

According to the National Center for Risk and Disaster Management (Cenad), in the past two years alone 1,466 alerts were recorded for semi-arid municipalities that declared a state of emergency or public calamity because of drought – the most recurrent natural disaster in Brazil, according to the agency.

The First National Assessment Report of the Brazilian Panel on Climate Change (PBMC) – whose executive summary was released on Conclima's opening day – estimates that such extreme events will increase mainly in the Amazon, Cerrado, and Caatinga biomes, and that the changes will intensify from mid-century through the end of the 21st century. The semi-arid region will thus suffer even more in the future from the water scarcity it already faces, the researchers warned.

"If the situation is already serious today, the models of Brazil's future climate scenarios indicate it will get even worse. So all the adaptation and mitigation measures planned for the coming years actually have to be carried out now," said Marcos Airton de Sousa Freitas, a water resources specialist at the National Water Agency (ANA).

According to Freitas, the semi-arid region – which spans Bahia, Sergipe, Alagoas, Pernambuco, Rio Grande do Norte, Paraíba, Ceará, Piauí, and northern Minas Gerais – is now in the second year of a drought that began in 2011 and could last indefinitely.

A study by the agency, based on streamflow data from the region's river basins, found that droughts in the semi-arid region last 4.5 years on average. States such as Ceará, however, have endured droughts lasting nearly nine years, followed by long periods of below-average rainfall.

According to Freitas, the region's main reservoirs – those holding more than 10 million cubic meters of water, enough to supply the main municipalities for up to three years – are currently at about 40% of capacity on average, and the trend through the end of the year is for them to keep emptying.

"If there is no substantial inflow into these large reservoirs in 2013, the drought we see in the semi-arid region today, which is mostly rural, could turn into an 'urban' drought – one that would hit the population of cities supplied by pipelines from these reservoir systems," Freitas warned.

Adaptation measures

One adaptation measure implemented in the semi-arid region in recent years – one that, according to the researchers, has markedly reduced the vulnerability of water access, especially for the dispersed rural population – is the One Million Cisterns Program (P1MC).

Launched in 2003 by Articulação Semiárido Brasileiro (ASA) – a network of more than a thousand non-governmental organizations working on policies for living with the semi-arid region – the program installs a system in rural communities in which rainwater is captured by gutters on house roofs and stored in covered, semi-buried cisterns. The cisterns, built by the communities themselves from precast cement plates, can hold up to 16,000 liters of water.

The program has helped harvest rainwater in places that receive up to 600 millimeters of rain a year – comparable to rainfall in parts of Europe – water that would otherwise evaporate and be lost quickly without a mechanism to retain it, the researchers noted.
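A rough back-of-the-envelope calculation shows how the article's two figures fit together. The roof area and collection efficiency below are assumptions for illustration, not numbers from the program:

```python
# Rough illustration: roof area needed to fill one P1MC-size cistern
# in a year. Cistern capacity (16,000 L) and annual rainfall (600 mm)
# are figures cited in the article; the 80% collection efficiency is
# an assumption for this sketch.

CISTERN_LITERS = 16_000   # cistern capacity cited in the article
RAINFALL_MM = 600         # upper bound of annual rainfall cited
EFFICIENCY = 0.8          # assumed fraction of rain actually captured

# 1 mm of rain falling on 1 m^2 of roof yields 1 liter of water.
liters_per_m2 = RAINFALL_MM * EFFICIENCY        # 480 L per m^2 per year
roof_area_m2 = CISTERN_LITERS / liters_per_m2   # ~33 m^2

print(f"Roof area needed: {roof_area_m2:.1f} m^2")
```

Under these assumptions a modest house roof of roughly 33 m² suffices, which is consistent with the program targeting ordinary rural dwellings.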

"Even with the extreme drought of the past two years, we have seen that water for the dispersed rural population has been guaranteed by the program, which has already installed about 500,000 cisterns and is a policy of adaptation to extreme climate events. Together with social programs such as Bolsa Família, the One Million Cisterns Program has helped soften the negative impacts of prolonged droughts in the region," said Saulo Rodrigues Filho, a professor at the University of Brasília (UnB).

Since water is likely to become an ever scarcer natural resource in the semi-arid region in the coming years, Rodrigues argued for rethinking which economic activities are best suited to it.

"Agriculture may not be the most sustainable activity for the semi-arid region, and there is evidence that productive activities there need to diversify rather than depend solely on family farming, which already faces labor shortages as rising education levels draw the region's young people from the countryside to the cities," Rodrigues said.

"Through more sustainable energy policies, such as solar and wind power, and by fostering activities such as crafts and tourism, we can help increase these populations' resilience to droughts and severe dry spells," he said.

Other necessary measures, Freitas noted, include reallocating water among the economic sectors that use it and selecting crops more resistant to the water scarcity the region faces.

"There are crops in the semi-arid region, such as grass for cattle feed, that depend on sprinkler irrigation. It makes no sense to grow such water-hungry crops in a region that will be hit hard by climate change," Freitas said.

Diverting the São Francisco River

Freitas also argued that the São Francisco River diversion project has become far more necessary now – given that water scarcity is expected to be a growing problem in the semi-arid region over the coming decades – and is essential to complement the region's other efforts to reduce the risk of water shortages.

The project, a frequent target of criticism and slated for completion in 2015, will carry São Francisco water to the basins of the Jaguaribe River, which supplies Ceará, and the Piranhas-Açu River, which supplies Rio Grande do Norte and Paraíba.

According to a study by ANA, funded by the World Bank with the participation of researchers from the Federal University of Ceará, among other institutions, the water availability of these two basins is expected to decline markedly in the coming years, further aggravating the semi-arid region's water deficit.

"The São Francisco diversion has become far more necessary and should be accelerated, because it would help ease the semi-arid region's water deficit now – a deficit expected to worsen with the projected decline in water availability in the Jaguaribe and Piranhas-Açu basins," Freitas told Agência FAPESP.

The PBMC's First National Assessment Report, however, indicates that the São Francisco's flow could fall by up to 30% by the end of the century, which would put the diversion project under threat.

Freitas countered that 70% of the São Francisco's water comes from basins in the Southeast, for which climate models project increased flow in the coming decades. Moreover, he said, the total volume to be diverted to the Jaguaribe and Piranhas-Açu basins amounts to only 2% of the São Francisco basin's mean flow.

"It is a completely different situation from the Cantareira System, for example, in which practically 90% of the water of the Piracicaba, Jundiaí, and Capivari rivers is diverted to supply the São Paulo metropolitan region," he said.

"One can argue about the costs of the São Francisco diversion. But in terms of water needs, the project will reinforce the operation of the semi-arid region's existing reservoir systems," he said.

According to Freitas, water is unevenly distributed across Brazil. While 48% of the rain that falls on the Amazon runs off through the Amazon Basin, in the semi-arid region only about 7% of the water that falls over three to four months reaches the Jaguaribe and Piranhas-Açu basins, and much of that volume is lost to evaporation. "That is why we need to store the remaining water for the months when none will be available," he explained.

The presentations given at the conference, which ends on September 13, will be available at: www.fapesp.br/conclima


Changes in Brazil's climate through 2100


By Elton Alisson*

More heat, less rain in the country's North and Northeast, and more rain in the South and Southeast are among the projections of the National Assessment Report of the Brazilian Panel on Climate Change (photo: Eduardo Cesar/FAPESP)

Agência FAPESP – Brazil's climate in the coming decades is expected to be hotter, with the mean temperature in every region of the country rising gradually and variably by 1 °C to 6 °C by 2100, compared with the end of the 20th century.

Over the same period, rainfall is expected to decrease significantly across much of the central, North, and Northeast regions, while the South and Southeast should see more precipitation.

These are conclusions of the First National Assessment Report (RAN1) of the Brazilian Panel on Climate Change (PBMC), whose executive summary was released on Monday (September 9) during the 1st National Conference on Global Climate Change (Conclima). Organized by FAPESP together with the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), the event runs until Friday (September 13) at Espaço Apas in São Paulo.

According to the report, since climate change and its impacts on populations and economic sectors will not be uniform across the country in the coming years, Brazil must take regional differences into account when designing adaptation and mitigation measures and agricultural, energy, and water supply policies for these different regions.

Divided into three parts, the report – now in the final stages of preparation – presents regionalized projections of the climate changes expected in Brazil's six biomes through 2100, and indicates their estimated impacts and possible ways to mitigate them.

The projections are based on reviews of studies conducted between 2007 and early 2013 by 345 researchers from various fields who make up the PBMC, and on scientific results from global and regional climate modeling.

"The report is being prepared along the same lines as the reports of the Intergovernmental Panel on Climate Change [IPCC], which does not conduct research but assesses published studies," said José Marengo, a researcher at the National Institute for Space Research (Inpe) and coordinator of the meeting.

"After much work and interaction, we arrived at the main results of the three working groups [the scientific basis of climate change; impacts, vulnerabilities, and adaptation; and climate change mitigation]," he said.

Main conclusions

One of the report's conclusions is that extreme events of prolonged drought, mainly in the Amazon, Cerrado, and Caatinga biomes, are expected to increase, with the changes intensifying from mid-century to the end of the 21st century.

Temperatures in the Amazon are projected to rise progressively: by 1 °C to 1.5 °C by 2040, with a 25% to 30% drop in rainfall; by 3 °C to 3.5 °C from 2041 to 2070, with rainfall down 40% to 45%; and by 5 °C to 6 °C from 2071 to 2100.

While the climate shifts associated with global change may compromise the biome in the long run, deforestation driven by intensive land use poses a more immediate threat to the Amazon, the report's authors note.

The researchers stress that observational and numerical modeling studies suggest that if deforestation reaches 40% of the region, the hydrological cycle will change drastically, with rainfall falling 40% from July through November – lengthening the dry season and warming the biome's surface by up to 4 °C.

The regional changes driven by deforestation would thus add to those from global change, creating conditions conducive to the savannization of the Amazon – a problem likely to be most acute in the eastern part of the region, the researchers note.

“As projeções permitirão analisar melhor esse problema de savanização da Amazônia, que, na verdade, percebemos que poderá ocorrer em determinados pontos da floresta, e não no bioma como um todo, conforme previam alguns estudos”, destacou Tércio Ambrizzi, um dos autores coordenadores do sumário executivo do grupo de trabalho sobre a base científica das mudanças climáticas.

A temperatura da Caatinga também deverá aumentar entre 0,5 ºC e 1 ºC e as chuvas no bioma diminuirão entre 10% e 20% até 2040. Entre 2041 e 2070 o clima da região deverá ficar de 1,5 ºC a 2,5 ºC mais quente e o padrão de chuva diminuir entre 25% e 35%. Até o final do século, a temperatura do bioma deverá aumentar progressivamente entre 3,5 ºC e 4,5 ºC  e a ocorrência de chuva diminuir entre 40% e 50%. Tais mudanças podem desencadear o processo de desertificação do bioma.

Por sua vez, a temperatura no Cerrado deverá aumentar entre 5 ºC e 5,5 ºC e as chuvas diminuirão entre 35% e 45% no bioma até 2100. No Pantanal, o aquecimento da temperatura deverá ser de 3,5ºC a 4,5ºC até o final do século, com diminuição acentuada dos padrões de chuva no bioma – com queda de 35% a 45%.

For the Mata Atlântica, because the biome stretches from the South of the country through the Southeast and up to the Northeast, the projections point to two distinct regimes of climate change.

In the northeastern portion, a relatively modest temperature increase of 0.5 °C to 1 °C and a decrease in precipitation of around 10% are expected by 2040. Between 2041 and 2070, the region should warm by 2 °C to 3 °C, with rainfall down 20% to 25%. For the end of the century (2071 to 2100), intense warming of 3 °C to 4 °C and a 30% to 35% decrease in rainfall are projected.

In the southern and southeastern portions, the projections indicate relatively modest warming of 0.5 °C to 1 °C by 2040, with rainfall increasing 5% to 10%. Between 2041 and 2070, the gradual trends should continue, with temperatures rising 1.5 °C to 2 °C and rainfall increasing 15% to 20%.

These trends should intensify further by the end of the century, when the climate is expected to be 2.5 °C to 3 °C warmer and 25% to 30% wetter.

Finally, for the Pampa, the projections indicate that by 2040 the region's climate will be 5% to 10% wetter and up to 1 °C warmer. Between 2041 and 2070, the biome should warm by 1 °C to 1.5 °C, with rainfall intensifying 15% to 20%. The projections for 2071 to 2100 are more severe, with warming of 2.5 °C to 3 °C and rainfall 35% to 40% above normal.

"What we see, broadly speaking, is that in Brazil's North and Northeast the trend over the century is toward higher temperatures and less rainfall," Ambrizzi summarized.

"In the regions farther south, that trend reverses: there is a tendency toward both higher temperatures, though less intense, and more precipitation," he compared.

Impacts and adaptation

The changes in precipitation patterns across the country's regions, driven by climate change, are expected to have direct impacts on agriculture, on energy generation and distribution, and on water resources, since water should become scarcer in the North and Northeast and more abundant in the South and Southeast, the researchers warn.

Brazil will therefore need to develop region-specific adaptation and mitigation actions and to revisit investment decisions, such as the construction of hydroelectric dams in the eastern Amazon, where river flows could fall by as much as 20%, the researchers cautioned.

"These variations in impacts show that any planned strategy for energy generation in the eastern Amazon is under threat, because there are a number of vulnerabilities," said Eduardo Assad, a researcher at the Brazilian Agricultural Research Corporation (Embrapa).

"There will be water to count on. But for how long, and where to find it in these regions, remain open questions," said the researcher, one of the coordinators of the report's Working Group 2, on impacts, vulnerabilities, and adaptation.

According to Assad, climate change adaptation measures in Brazil are very costly because of the vulnerabilities the country presents, both natural (its wide variety of landscapes) and socioeconomic.

"Most of the Brazilian population, especially those living in the country's coastal regions, is vulnerable to the impacts of climate change. Solving that will not be easy," he estimated.

Among the country's economic sectors, according to Assad, agriculture is one of the few that has been moving ahead to adapt to the impacts of climate change.

"We have been working on adaptation measures for more than eight years. It is possible to develop cultivars tolerant of high temperatures or of water-deficient soils," said Assad.

The researcher also stressed that the population groups with the lowest income, education, and housing conditions will suffer the impacts of climate change most intensely. "We will have to make quick decisions to keep tragedies from happening."


Mercedes Bustamante, a professor at the University of Brasília (UnB) and one of the coordinators of Working Group 3, on climate change mitigation, presented a synthesis of studies and research on the topic, identifying knowledge gaps and future directions in a global warming scenario.

Bustamante noted that the drop in deforestation rates between 2005 and 2010, from 2.03 billion tonnes of CO2 equivalent to 1.25 billion tonnes, has already had positive effects in reducing greenhouse gas (GHG) emissions from land use.

"Emissions from energy generation and agriculture, however, rose in both absolute and relative terms, indicating changes in the profile of Brazil's emissions," she said.

If current policies are maintained, emissions from the energy and transportation sectors are projected to rise 97% by 2030. Reversing that picture will require greater energy efficiency, more technological innovation, and policies encouraging the use of renewable energy.

In transportation, the recommendations range from transforming the modal mix, which relies heavily on road transport, to adopting new fuel technologies. "We need to shift from individual to collective transport, investing, for example, in waterway systems and in electric and hybrid vehicles," Bustamante stressed.

The new GHG emissions profile shows a growing share of methane, from livestock, and of nitrous oxide, linked to fertilizer use. "Despite these results, agriculture has advanced in developing mitigation and adaptation strategies," she observed.

For industry, responsible for 4% of GHG emissions, the list of mitigation recommendations includes recycling, the use of renewable biomass, and energy cogeneration, among other measures.

Climate change mitigation strategies also require rethinking urban planning to ensure the sustainability of buildings, for example by controlling timber consumption and achieving greater energy efficiency in construction.

Information for society

The researchers involved in drafting the report highlighted, among its virtues, that it gathers previously scattered data from scientific studies conducted in Brazil over recent years and makes credible technical and scientific information available to society and to decision makers, information capable of supporting the development of adaptation and mitigation strategies for the possible impacts of climate change.

"We scientists face the challenge of conveying the seriousness and gravity of the moment, and the opportunities that global climate change holds for society. We know that inaction is the least intelligent course society can take," said Paulo Nobre, coordinator of Rede Clima.

In turn, Celso Lafer, president of FAPESP, stressed at the opening of the event that the Foundation has a special interest in climate change research, expressed in the FAPESP Research Program on Global Climate Change (PFPMCG), which the institution maintains.

"One of FAPESP's basic concerns is to research and assess the impact of global climate change on what specifically affects Brazil and the State of São Paulo," he said.

Also taking part in the opening were Bruno Covas, São Paulo State Secretary of the Environment; Carlos Nobre, Secretary of Policies and Programs for Research and Development (Seped) at the Ministry of Science, Technology and Innovation (MCTI); and Paulo Artaxo, a member of the PFPMCG coordination.

Carlos Nobre stressed that the report will be the main source of information guiding the National Climate Change Plan, which is currently under revision.

"It is very important that the results of this study guide the work in Brasília and across Brazil, at a critical moment for reorienting national policy, which has to move toward making the economy, society, and the environment more resilient to the inevitable climate changes to come," he said.

According to him, Brazil has already signaled its commitment to mitigation, embodied in the National Climate Change Policy, which provides for emissions reductions of 10% and 15%, respectively, between 2010 and 2020, relative to 2005.

"In 2009, São Paulo launched an ambitious program to cut emissions by 20%, since land-use change is not such an important issue in the state; what matters there is technological progress in energy generation and in production processes. Brazil is the only developing country with voluntary emissions reduction targets."

He noted, however, that "adaptation has been neglected." "It is not just about mitigating; we also need to adapt to climate change. The three research networks (Clima, INCT, and FAPESP) are advancing on adaptation, which is the guide for sustainable development."

* With contributions from Claudia Izique and Noêmia Lopes


A model for understanding climate variations in Brazil


By Noêmia Lopes

The Brazilian Earth System Model is considered fundamental to climate research (photo: NASA)

Agência FAPESP – Deforestation in the Amazon, forest fires, and the interaction processes between the Atlantic Ocean and the atmosphere are some of the climate issues particular to Brazil that the Brazilian Earth System Model (BESM) takes into account and intends to address in a way that even the world's best models cannot.

The model was presented in detail on Monday (September 9) at the opening of the 1st National Conference on Global Climate Change, held in São Paulo.

Determining how much the climate has already changed, how much its characteristics will still shift, and where this will occur is the basis for planning public policies to adapt to the impacts of global climate change, said Paulo Nobre, general coordinator of the BESM and of the Brazilian Research Network on Global Climate Change (Rede Clima), at the opening of the event.

"The BESM is a structural axis of climate change research in Brazil and provides input for the FAPESP Research Program on Global Climate Change (PFPMCG), for Rede Clima, and for the National Institute of Science and Technology for Climate Change (INCT-MC)," said Nobre.

Aimed at improving predictions of the consequences of global warming and the resulting increase in the frequency of extreme events, the BESM runs on the Rede Clima/PFPMCG supercomputer Tupã, coupling atmospheric, oceanic, land-surface and, soon, chemical fluxes. Acquired by the Ministry of Science, Technology and Innovation (MCTI) with support from FAPESP, Tupã is installed at the National Institute for Space Research (Inpe) unit in Cachoeira Paulista.

Although not yet in its final, complete version, the model can already, for example, reconstruct the occurrence of recent El Niño events and estimate the phenomenon's return, explain rainfall in the South Atlantic Convergence Zone, and even estimate global ice variation satisfactorily.

"The most recent scenarios, from 2013, are available on the supercomputer. We are taking steps to open access to the general public by the end of the year," Nobre said.

Thirty researchers are directly involved in developing the BESM, and another 40 are involved indirectly. According to Nobre, that number should double in five years and double again in ten. "For that, we need more and more young PhDs to join this challenge."

Nobre also highlighted the serious consequences that the increasing frequency of extreme events, such as droughts, floods, and tornadoes, has on people's lives: "Our cities and our countryside were not designed to live with this new climate, which makes Brazil's entry into climate change modeling essential," he said.

The four components

Iracema Cavalcanti, a researcher at Inpe, presented on the first day of the conference the results already obtained with the BESM's atmospheric component.

Compared with observed data, the model achieves good simulations of seasonal variability (climatological changes across the seasons), moisture fluxes, interannual variability (differences between pressure anomalies), and precipitation variability, among other features. "There are still shortcomings, as in most global models. But the general characteristics are captured," said Cavalcanti.

As for the oceans, they cannot be separated from what happens in the atmospheric model, which is why coupled modeling is strategic to the BESM's effective operation. That is the assessment of Leo Siqueira, also of Inpe, who presented the challenges of this component.

"We already have good representations, but we want to improve the simulations of ocean temperatures, especially in the tropics, and of sea ice. We also want to open a healthy discussion about which land-ice model to adopt within the BESM," said Siqueira.

The modeling of the third component, the land surface, was presented by Marcos Heil Costa of the Federal University of Viçosa (UFV). "The main function of a model that integrates surface processes is to supply fluxes of energy and water vapor between vegetation and the atmosphere. That is the basics. But improvements to the system over recent years have allowed other processes to be incorporated," he said.

Some of these processes are: fluxes of radiation, energy, and mass; terrestrial carbon and nitrogen cycles; recovery of abandoned areas; fires in natural vegetation; agricultural crops; river discharge and seasonally flooded areas; human land use (deforestation); ecosystem-specific representation; and soil fertility, among others.

Chemistry is the fourth and most recent component. "Without a chemistry model, the others have to be constantly adjusted," said Sérgio Correa, a researcher at Rio de Janeiro State University (UERJ). Correa leads studies in atmospheric chemistry that will refine the BESM's data once this component is coupled to the model, one of the next planned phases.

Collaborations with researchers at foreign institutions and training programs are other strategies under way to improve the representations of Brazil and South America, with global reach as well.

More information about the 1st National Conference on Global Climate Change: www.fapesp.br/conclima

The Science of Why We Don’t Believe Science (Mother Jones)

How our brains fool us on climate, creationism, and the end of the world.

By  | Mon Apr. 18, 2011 3:00 AM PDT

“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger [1] (PDF), in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study [2] in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

Read also: the truth about Climategate. [3]

At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie on the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning [5]” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president [6] (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience [7] (PDF): Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia [8] of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber [9] of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt [10]: We may think we’re being scientists, but we’re actually being lawyers [11] (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.

Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment [12] (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes [13], and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan [14] and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research [15] (PDF), individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study [16] (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family [16]) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan [17].

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles [18] (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad [19] and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study [20] (PDF) in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org [21], a team at Ohio State presented subjects [22] (PDF) with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick [23]. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to correct the myth [6] (PDF) that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study [24] (PDF) showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast [25]” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

A predictor of whether you accept the science of global warming? Whether you’re a Republican or a Democrat.

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey [26], for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college-educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz [27], director of the Yale Project on Climate Change Communication [28]. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

Is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism.

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr. [29]) and numerous Hollywood celebrities (most notably Jenny McCarthy [30] and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin [31], author of the new book The Panic Virus [32], notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined [33] by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper [34]—which originated the current vaccine scare—was retracted and he subsequently lost his license [35] (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers have created their own partisan media, such as the website Age of Autism, which instantly blasts out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

We all have blinders in some situations. The question then becomes: What can be done to counteract human nature?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study [36], he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.

[1] https://motherjones.com/files/lfestinger.pdf
[2] http://www.powells.com/biblio/61-9781617202803-1
[3] http://motherjones.com/environment/2011/04/history-of-climategate
[4] http://motherjones.com/environment/2011/04/field-guide-climate-change-skeptics
[5] http://www.ncbi.nlm.nih.gov/pubmed/2270237
[6] http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf
[7] https://motherjones.com/files/descartes.pdf
[8] http://www-personal.umich.edu/~lupia/
[9] http://www.stonybrook.edu/polsci/ctaber/
[10] http://people.virginia.edu/~jdh6n/
[11] https://motherjones.com/files/emotional_dog_and_rational_tail.pdf
[12] http://synapse.princeton.edu/~sam/lord_ross_lepper79_JPSP_biased-assimilation-and-attitude-polarization.pdf
[13] http://psp.sagepub.com/content/23/6/636.abstract
[14] http://www.law.yale.edu/faculty/DKahan.htm
[15] https://motherjones.com/files/kahan_paper_cultural_cognition_of_scientific_consesus.pdf
[16] http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers
[17] http://seagrant.oregonstate.edu/blogs/communicatingclimate/transcripts/Episode_10b_Dan_Kahan.html
[18] http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf
[19] http://www.sociology.northwestern.edu/faculty/prasad/home.html
[20] http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf
[21] http://www.factcheck.org/
[22] http://www.comm.ohio-state.edu/kgarrett/FactcheckMosqueRumors.pdf
[23] http://communication.stanford.edu/faculty/krosnick/
[24] http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf
[25] http://en.wikipedia.org/wiki/Narrowcasting
[26] http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming
[27] http://environment.yale.edu/profile/leiserowitz/
[28] http://environment.yale.edu/climate/
[29] http://www.huffingtonpost.com/robert-f-kennedy-jr-and-david-kirby/vaccine-court-autism-deba_b_169673.html
[30] http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html
[31] http://sethmnookin.com/
[32] http://www.powells.com/biblio/1-9781439158647-0
[33] http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on/article_print
[34] http://www.thelancet.com/journals/lancet/article/PIIS0140673697110960/fulltext
[35] http://www.gmc-uk.org/Wakefield_SPM_and_SANCTION.pdf_32595267.pdf
[36] http://www.scribd.com/doc/3446682/The-Second-National-Risk-and-Culture-Study-Making-Sense-of-and-Making-Progress-In-The-American-Culture-War-of-Fact

“Scientists can tell a story without losing accuracy” (Fapesp)

Dan Fay, director of Microsoft Research Connections, says technology can help researchers showcase their articles and attract more readers, whether peers, policymakers, or funding agencies (photo: Edu Cesar/FAPESP)



By Frances Jones

Agência FAPESP – With more than 20 years at Microsoft, engineer Dan Fay belongs to a select group of professionals who scour the scientific world for partnerships between the computing giant and the authors of projects related to Earth sciences, energy, and the environment.

These include projects such as the World Wide Telescope software, developed in partnership with researchers at Johns Hopkins University, which lets scientists across the world access images of celestial objects collected by space telescopes, observatories, and research institutions, and manipulate and share those data.

“Our challenge is to find researchers whose work we can add value to, rather than just supplying more machines,” said Fay, who, besides directing the Earth, Energy, and Environment division of Microsoft Research Connections, Microsoft’s research arm, sits on the industrial advisory board for Computing and Information Technology at Purdue University, in the U.S. state of Indiana.

Fay was in São Paulo to take part in the Latin American eScience Workshop 2013, held by FAPESP and Microsoft Research from May 13 to 15, where he gave two talks to researchers and students from several countries on the advances that improved capacity to analyze large volumes of information has made possible across many fields of knowledge. In one, he spoke about how science can use cloud computing; in the other, about “how to publicize your research internationally.” Afterward, he spoke with Agência FAPESP. Excerpts from the interview follow.

Agência FAPESP – What do you expect for the future of eScience, the use of computing technologies and tools in doing science?
Fay – What is interesting about eScience is combining computational data and new techniques across the most varied fields. What we see now is greater use of computing, but the next step will be crossing different domains and using that information in combination, for example biology and the environment together, in a cross-cutting analysis. The two domains speak different languages. Moving between fields will be one of the next challenges. You cannot assume that a piece of data means something. You have to be sure.

Agência FAPESP – How can researchers use cloud computing in their studies?
Fay – Cloud computing provides a new paradigm for tackling the challenges of computing and data analysis across the most varied areas of scientific knowledge. Unlike traditional supercomputers, which are isolated and centralized, the cloud is everywhere and can support different styles of computing suited to data analysis and collaboration. Over the past three years we have worked with academic researchers to explore the potential of this new platform. We have more than 90 research projects using Windows Azure [Microsoft’s cloud platform], and we have learned a great deal. As with any new technology, there are always people who start earlier and others who start later. Several researchers have begun using it to understand how it can change the way they do research.

Agência FAPESP – Do you think the way science is communicated will change in the future?
Dan Fay – I told the students at the workshop that there will always be traditional, peer-reviewed journal articles. With the amount of information we have today, however, to make it more visible to other people and to the general public, the data can also be presented in other ways. An article that dives deep into the details can also be made accessible to people from different domains. Producing photos and videos, among other resources, can help as well.

Agência FAPESP – Is that a role for scientists?
Fay – You can use the same accurate scientific data and tell stories about them. And you can let people hear that story directly from you, in visual form. That interaction is very powerful. It is like going to a museum and seeing works of art: people connect with them. Scientists want their colleagues and the public at large to connect with their information, data, or articles in that way. I think it is good to find other mechanisms that can explain the data. We are at a fascinating moment in which there is real concern with presenting information in an interesting way. That is true not only when you want to speak to your peers, but especially if you want policymakers, the government, or funding agencies to understand what your work is. Sometimes it has to be presented in a form that can be consumed more easily.

Agência FAPESP – Does technology help on that front?
Fay – Yes. Partly because these formats get people to read the articles in more depth. There is a researcher at Harvard, an astronomer, who created one of the virtual tours we offer of the galaxies. She estimates that more people have watched her tours than have read her scientific papers. That kind of understanding is growing. If I can help someone end up reading a paper because they saw it somewhere else first, that is progress.

Agência FAPESP – Do you think scientists should join digital social networks, such as Facebook, to showcase their work?
Fay – Yes. They should always rest on scientific rigor, but it helps to employ design or marketing techniques. Even in their posters. It is a way of communicating the information better.

Agência FAPESP – And will new technologies promote the opening of scientific data?
Fay – Beyond open data, there is a social problem, in which people feel they own their ideas or information. Finding ways to share data while giving due credit to the people who collected, processed, and made them available is important. That, I believe, is where several of the challenges lie. There is also the issue of medical data, and of other data that you do not want floating around.

Why Sandy Has Meteorologists Scared, in 4 Images (The Atlantic)

By Alexis Madrigal

OCT 28 2012, 12:23 PM ET 126

She’s huge. She’s strong and might get stronger. She’s strange. She’s directing the might of her storm surge right at New York City.

Update 10/29, 4:49pm: The Eastern seaboard has battened down the hatches. Hurricane Sandy is expected to make landfall in New Jersey in the next few hours, but flooding has been reported in Atlantic City and pieces of New York during this morning’s high tide cycle. The Metropolitan Transportation Authority already shut down rail, bus, and subway service in NYC, as did Washington DC’s authorities. All eyes are on the 8 o’clock hour, when the storm surge from Sandy will combine with a very high tide to create maximum water levels. In the worst case scenario, the storm surge will hit precisely at the moment the tide peaks at 8:53pm. In that scenario, New York City, in particular, could sustain substantial damage, especially to its transportation infrastructure.

The good news, if there is any, is that the forecast hasn’t worsened much. It is what it has been, which is grim. Meteorologist Jeff Masters put it in simple terms. “As the core of Sandy moves ashore, the storm will carry with it a gigantic bulge of water that will raise water levels to the highest storm tides ever seen in over a century of record keeping, along much of the coastline of New Jersey and New York,” Masters wrote today. “The peak danger will be between 7 pm – 10 pm, when storm surge rides in on top of the high tide.”
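
The timing point Masters makes, that danger peaks when the surge rides in on top of high tide, can be illustrated with a toy superposition of the two signals. This is a minimal sketch, not a forecast model; the amplitudes, the Gaussian pulse shape, and the times below are assumed for illustration only.

```python
import math

# Toy model: total water level ~ astronomical tide + storm surge.
# All numbers are illustrative assumptions, not forecast values.

def tide(t_hours, amplitude_ft=2.5, period_hours=12.42, high_tide_at=20.9):
    """Semidiurnal tide, peaking at ~8:53 pm (hour 20.9)."""
    return amplitude_ft * math.cos(2 * math.pi * (t_hours - high_tide_at) / period_hours)

def surge(t_hours, peak_ft=6.0, peak_at=20.9, width_hours=3.0):
    """Storm surge modeled as a Gaussian pulse around its crest."""
    return peak_ft * math.exp(-((t_hours - peak_at) / width_hours) ** 2)

def peak_water_level(surge_peak_at):
    """Maximum combined water level over 48 hours, at 6-minute steps."""
    times = [h / 10 for h in range(0, 481)]
    return max(tide(t) + surge(t, peak_at=surge_peak_at) for t in times)

# Surge cresting exactly at high tide vs. six hours earlier, near low tide:
worst = peak_water_level(20.9)
offset = peak_water_level(14.7)
print(f"surge at high tide: {worst:.1f} ft; surge near low tide: {offset:.1f} ft")
```

In this toy model, shifting the surge crest six hours, to near low tide, cuts the peak water level by several feet, which is why forecasters were fixated on the 8 o’clock hour.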

Here’s the latest map of the prospective storm surge tonight. You can compare it to the image at the bottom, which shows what the forecast was yesterday.


* * *

Hurricane Sandy has already caused her first damage in New York: the subway system will be shut as of 7pm tonight. Meteorologists are scared, so city planners are scared.

For many, the hullabaloo raises memories of Irene, which despite causing $15.6 billion worth of damages in the United States, did not live up to its pre-arrival hype.

By almost all measures, this storm looks like it could be worse: higher winds, a path through a more populated area, worse storm surge, and a greater chance it’ll linger. The atmospherics, you might say, all point to this being the worst storm in recent history.

I’ve been watching weather nerds freak out about a few different graphs over the last several days, which they’ve sent around like sports fans would tweet a particularly vicious hit in the NFL. You don’t want to look, but you also can’t help it.

Dr. Ryan Maue, a meteorologist at WeatherBELL, put out this animated GIF of the storm’s approach yesterday. “This is unprecedented –absolutely stunning upper-level configuration pinwheeling #Sandy on-shore like ping-pong ball,” he tweeted. It shows how cold air to the north and west of the storm spins Sandy into the mid-Atlantic coastline. (Nota bene: his models also show very high winds at skyscraper altitudes.)


This morning, the Wall Street Journal’s Eric Holthaus (@WSJweather) tweeted the following map. “Oh my…. I have never seen so much purple on this graphic. By far. Never,” he said. “Folks, please take this storm seriously.” The storm is strong *and* huge. And when it encounters the cold air from the north and west, it will develop renewed strength thanks to that interaction, a process known as “baroclinic enhancement.”


I created this last graphic from National Oceanic and Atmospheric Administration data that has weather watchers worried. It shows the probability of a greater-than-six-foot storm surge in and around New York City. Hurricane Irene, by comparison, caused a four-foot surge.

Note that the highest probabilities are focused tightly around New York City, which also happens to be the most densely populated area in the country. That’s a very bad combination. Jeff Masters, author of the must-read storm blog Wunderground, laid out the general problem.

“[According to last night’s forecast], the destructive potential of the storm surge was exceptionally high: 5.7 on a scale of 0 to 6,” he wrote. “This is a higher destructive potential than any hurricane observed between 1969 – 2005, including Category 5 storms like Katrina, Rita, Wilma, Camille, and Andrew.”

Specifically, New York City’s infrastructure may take an unprecedented hit. The subway narrowly escaped flooding during Irene, and Sandy (for all the reasons above) is expected to be worse. So…

“According to the latest storm surge forecast for NYC from NHC, Sandy’s storm surge is expected to be several feet higher than Irene’s. If the peak surge arrives near Monday evening’s high tide at 9 pm EDT, a portion of New York City’s subway system could flood, resulting in billions of dollars in damage,” Masters concluded. “I give a 50% chance that Sandy’s storm surge will end up flooding a portion of the New York City subway system.”

Update 1:06pm: To get a taste of how forecasters are feeling, here is The Weather Channel’s senior meteorologist, Stu Ostro:

History is being written as an extreme weather event continues to unfold, one which will occupy a place in the annals of weather history as one of the most extraordinary to have affected the United States.

On Twitter, Alan Robinson pointed out that I left out another scary map, the rainfall forecast, which shows the storm “sitting over the Delaware and Susquehanna watersheds.” Much of the damage that Irene caused came from flooding rivers. However, there is one key factor militating against similar damage, Jeff Masters of Wunderground says. Irene hit when the ground was already very wet. Sandy is striking when ground moisture is roughly average. Here’s Masters’ whole statement:

Hurricane Irene caused $15.8 billion in damage, most of it from river flooding due to heavy rains. However, the region most heavily impacted by Irene’s heavy rains had very wet soils and very high river levels before Irene arrived, due to heavy rains that occurred in the weeks before the hurricane hit. That is not the case for Sandy; soil moisture is near average over most of the mid-Atlantic, and is in the lowest 30th percentile in recorded history over much of Delaware and Southeastern Maryland. One region of possible concern is the Susquehanna River Valley in Eastern Pennsylvania, where soil moisture is in the 70th percentile, and river levels are in the 76th – 90th percentile. This area is currently expected to receive 3 – 6 inches of rain (Figure 4), which is probably not enough to cause catastrophic flooding like occurred for Hurricane Irene. I expect that river flooding from Sandy will cause less than $1 billion in damage.

Current scientific knowledge does not substantiate Ban Ki-Moon assertions on weather and climate, say 125-plus scientists (Financial Post) + EANTH list reactions

OPEN CLIMATE LETTER TO UN SECRETARY-GENERAL: Current scientific knowledge does not substantiate Ban Ki-Moon assertions on weather and climate, say 125-plus scientists.

Special to Financial Post | Nov 29, 2012 8:36 PM ET | Last Updated:Nov 30, 2012 12:11 PM ET

Getty – UN Secretary-General Ban Ki-Moon

Policy actions that aim to reduce CO2 emissions are unlikely to influence future climate. Policies need to focus on preparation for, and adaptation to, all dangerous climatic events, however caused

Open Letter to the Secretary-General of the United Nations

H.E. Ban Ki-Moon, Secretary-General, United Nations. First Avenue and East 44th Street, New York, New York, U.S.A.

November 29, 2012

Mr. Secretary-General:

On November 9 this year you told the General Assembly: “Extreme weather due to climate change is the new normal … Our challenge remains, clear and urgent: to reduce greenhouse gas emissions, to strengthen adaptation to … even larger climate shocks … and to reach a legally binding climate agreement by 2015 … This should be one of the main lessons of Hurricane Sandy.”

On November 13 you said at Yale: “The science is clear; we should waste no more time on that debate.”

The following day, in Al Gore’s “Dirty Weather” Webcast, you spoke of “more severe storms, harsher droughts, greater floods”, concluding: “Two weeks ago, Hurricane Sandy struck the eastern seaboard of the United States. A nation saw the reality of climate change. The recovery will cost tens of billions of dollars. The cost of inaction will be even higher. We must reduce our dependence on carbon emissions.”

We the undersigned, qualified in climate-related matters, wish to state that current scientific knowledge does not substantiate your assertions.

The U.K. Met Office recently released data showing that there has been no statistically significant global warming for almost 16 years. During this period, according to the U.S. National Oceanic and Atmospheric Administration (NOAA), carbon dioxide (CO2) concentrations rose by nearly 9% to now constitute 0.039% of the atmosphere. Global warming that has not occurred cannot have caused the extreme weather of the past few years. Whether, when and how atmospheric warming will resume is unknown. The science is unclear. Some scientists point out that near-term natural cooling, linked to variations in solar output, is also a distinct possibility.
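
The letter’s two figures can be cross-checked with a line of arithmetic. A minimal sketch; the implied starting concentration is back-derived from the letter’s own numbers, not an independent measurement.

```python
# Back-of-envelope check of the letter's figures: a concentration of
# 0.039% of the atmosphere is 390 parts per million (ppm), and a rise
# of "nearly 9%" implies a starting level of roughly 390 / 1.09 ppm.
# The starting value is inferred from the letter, not a measured datum.
final_fraction = 0.00039          # 0.039% of the atmosphere
final_ppm = final_fraction * 1e6  # 390 ppm
start_ppm = final_ppm / 1.09      # implied level ~16 years earlier

print(f"final: {final_ppm:.0f} ppm, implied start: {start_ppm:.0f} ppm")
```

390 / 1.09 ≈ 358 ppm, roughly the level observed in the mid-1990s, so the letter’s two percentages are at least internally consistent with each other.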

The “even larger climate shocks” you have mentioned would be worse if the world cooled than if it warmed. Climate changes naturally all the time, sometimes dramatically. The hypothesis that our emissions of CO2 have caused, or will cause, dangerous warming is not supported by the evidence.

The incidence and severity of extreme weather has not increased. There is little evidence that dangerous weather-related events will occur more often in the future. The U.N.’s own Intergovernmental Panel on Climate Change says in its Special Report on Extreme Weather (2012) that there is “an absence of an attributable climate change signal” in trends in extreme weather losses to date. The funds currently dedicated to trying to stop extreme weather should therefore be diverted to strengthening our infrastructure so as to be able to withstand these inevitable, natural events, and to helping communities rebuild after natural catastrophes such as tropical storm Sandy.

There is no sound reason for the costly, restrictive public policy decisions proposed at the U.N. climate conference in Qatar. Rigorous analysis of unbiased observational data does not support the projections of future global warming predicted by computer models now proven to exaggerate warming and its effects.

The NOAA “State of the Climate in 2008” report asserted that 15 years or more without any statistically-significant warming would indicate a discrepancy between observation and prediction. Sixteen years without warming have therefore now proven that the models are wrong by their creators’ own criterion.

Based upon these considerations, we ask that you desist from exploiting the misery of the families of those who lost their lives or properties in tropical storm Sandy by making unsupportable claims that human influences caused that storm. They did not. We also ask that you acknowledge that policy actions by the U.N., or by the signatory nations to the UNFCCC, that aim to reduce CO2 emissions are unlikely to exercise any significant influence on future climate. Climate policies therefore need to focus on preparation for, and adaptation to, all dangerous climatic events however caused.

Signed by:

  1. Habibullo I. Abdussamatov, Dr. Sci., mathematician and astrophysicist, Head of the Selenometria project on the Russian segment of the ISS, Head of Space Research of the Sun Sector at the Pulkovo Observatory of the Russian Academy of Sciences, St. Petersburg, Russia
  2. Syun-Ichi Akasofu, PhD, Professor of Physics, Emeritus and Founding Director, International Arctic Research Center of the University of Alaska, Fairbanks, Alaska, U.S.A.
  3. Bjarne Andresen, Dr. Scient., physicist, published and presents on the impossibility of a “global temperature”, Professor, Niels Bohr Institute (physics (thermodynamics) and chemistry), University of Copenhagen, Copenhagen, Denmark
  4. J. Scott Armstrong, PhD, Professor of Marketing, The Wharton School, University of Pennsylvania, Founder of the International Journal of Forecasting, focus on analyzing climate forecasts, Philadelphia, Pennsylvania, U.S.A.
  5. Timothy F. Ball, PhD, environmental consultant and former climatology professor, University of Winnipeg, Winnipeg, Manitoba, Canada
  6. James R. Barrante, Ph.D. (chemistry, Harvard University), Emeritus Professor of Physical Chemistry, Southern Connecticut State University, focus on studying the greenhouse gas behavior of CO2, Cheshire, Connecticut, U.S.A.
  7. Colin Barton, B.Sc., PhD (Earth Science, Birmingham, U.K.), FInstEng Aus, Principal research scientist (ret.), Commonwealth Scientific and Industrial Research Organisation (CSIRO), Melbourne, Victoria, Australia
  8. Joe Bastardi, BSc, (Meteorology, Pennsylvania State), meteorologist, State College, Pennsylvania, U.S.A.
  9. Franco Battaglia, PhD (Chemical Physics), Professor of Physics and Environmental Chemistry, University of Modena, Italy
  10. Richard Becherer, BS (Physics, Boston College), MS (Physics, University of Illinois), PhD (Optics, University of Rochester), former Member of the Technical Staff – MIT Lincoln Laboratory, former Adjunct Professor – University of Connecticut, Areas of Specialization: optical radiation physics, coauthor – standard reference book Optical Radiation Measurements: Radiometry, Millis, MA, U.S.A.
  11. Edwin X. Berry, PhD (Atmospheric Physics, Nevada), MA (Physics, Dartmouth), BS (Engineering, Caltech), Certified Consulting Meteorologist, President, Climate Physics LLC, Bigfork, MT, U.S.A.
  12. Ian Bock, BSc, PhD, DSc, Biological sciences (retired), Ringkobing, Denmark
  13. Ahmed Boucenna, PhD, Professor of Physics (strong climate focus), Physics Department, Faculty of Science, Ferhat Abbas University, Setif, Algéria
  14. Antonio Brambati, PhD, Emeritus Professor (sedimentology), Department of Geological, Environmental and Marine Sciences (DiSGAM), University of Trieste (specialization: climate change as determined by Antarctic marine sediments), Trieste, Italy
  15. Stephen C. Brown, PhD (Environmental Science, State University of New York), District Agriculture Agent, Assistant Professor, University of Alaska Fairbanks, Ground Penetrating Radar Glacier research, Palmer, Alaska, U.S.A.
  16. Mark Lawrence Campbell, PhD (chemical physics; gas-phase kinetic research involving greenhouse gases (nitrous oxide, carbon dioxide)), Professor, United States Naval Academy, Annapolis, Maryland, U.S.A.
  17. Rudy Candler, PhD (Soil Chemistry, University of Alaska Fairbanks (UAF)), former agricultural laboratory manager, School of Agriculture and Land Resources Management, UAF, co-authored papers regarding humic substances and potential CO2 production in the Arctic due to decomposition, Union, Oregon, U.S.A.
  18. Alan Carlin, B.S. (California Institute of Technology), PhD (economics, Massachusetts Institute of Technology), retired senior analyst and manager, U.S. Environmental Protection Agency, Washington, DC, former Chairman of the Angeles Chapter of the Sierra Club (recipient of the Chapter’s Weldon Heald award for conservation work), U.S.A.
  19. Dan Carruthers, M.Sc., Arctic Animal Behavioural Ecologist, wildlife biology consultant specializing in animal ecology in Arctic and Subarctic regions, Turner Valley, Alberta, Canada
  20. Robert M. Carter, PhD, Professor, Marine Geophysical Laboratory, James Cook University, Townsville, Australia
  21. Uberto Crescenti, PhD, Full Professor of Applied Geology, Università G. d’Annunzio, Past President Società Geologica Italiana, Chieti, Italy
  22. Arthur Chadwick, PhD (Molecular Biology), Research Professor of Geology, Department of Biology and Geology, Southwestern Adventist University, Climate Specialties: dendrochronology (determination of past climate states by tree ring analysis), palynology (same but using pollen as a climate proxy), paleobotany and botany; Keene, Texas, U.S.A.
  23. George V. Chilingar, PhD, Professor, Department of Civil and Environmental Engineering (CO2/temp. focused research), University of Southern California, Los Angeles, California, U.S.A.
  24. Ian D. Clark, PhD, Professor (isotope hydrogeology and paleoclimatology), Dept. of Earth Sciences, University of Ottawa, Ottawa, Ontario, Canada
  25. Cornelia Codreanova, Diploma in Geography, Researcher (Areas of Specialization: formation of glacial lakes) at Liberec University, Czech Republic, Zwenkau, Germany
  26. Michael Coffman, PhD (Ecosystems Analysis and Climate Influences, University of Idaho), CEO of Sovereignty International, President of Environmental Perspectives, Inc., Bangor, Maine, U.S.A.
  27. Piers Corbyn, ARCS, MSc (Physics, Imperial College London), FRAS, FRMetS, astrophysicist (Queen Mary College, London), consultant, founder WeatherAction long range weather and climate forecasters, American Thinker Climate Forecaster of The Year 2010, London, United Kingdom
  28. Richard S. Courtney, PhD, energy and environmental consultant, IPCC expert reviewer, Falmouth, Cornwall, United Kingdom
  29. Roger W. Cohen, B.S., M.S., PhD Physics, MIT and Rutgers University, Fellow, American Physical Society, initiated and managed for more than twenty years the only industrial basic research program in climate, Washington Crossing, Pennsylvania, U.S.A.
  30. Susan Crockford, PhD (Zoology/Evolutionary Biology/Archaeozoology), Adjunct Professor (Anthropology/Faculty of Graduate Studies), University of Victoria, Victoria, British Columbia, Canada
  31. Walter Cunningham, B.S., M.S. (Physics – Institute of Geophysics And Planetary Sciences, UCLA), AMP – Harvard Graduate School of Business, Colonel (retired) U.S. Marine Corps, Apollo 7 Astronaut, Fellow – AAS, AIAA; Member AGU, Houston, Texas, U.S.A.
  32. Joseph D’Aleo, BS, MS (Meteorology, University of Wisconsin),  Doctoral Studies (NYU), CMM, AMS Fellow, Executive Director – ICECAP (International Climate and Environmental Change Assessment Project), College Professor Climatology/Meteorology, First Director of Meteorology The Weather Channel, Hudson, New Hampshire, U.S.A.
  33. David Deming, PhD (Geophysics), Professor of Arts and Sciences, University of Oklahoma, Norman, Oklahoma, U.S.A.
  34. James E. Dent; B.Sc., FCIWEM, C.Met, FRMetS, C.Env., Independent Consultant (hydrology & meteorology), Member of WMO OPACHE Group on Flood Warning, Hadleigh, Suffolk, England, United Kingdom
  35. Willem de Lange, MSc (Hons), DPhil (Computer and Earth Sciences), Senior Lecturer in Earth and Ocean Sciences, The University of Waikato, Hamilton, New Zealand
  36. Silvia Duhau, Ph.D. (physics), Solar Terrestrial Physics, Buenos Aires University, Buenos Aires, Argentina
  37. Geoff Duffy, DEng (Dr of Engineering), PhD (Chemical Engineering), BSc, ASTCDip. (first chemical engineer to be a Fellow of the Royal Society in NZ), FIChemE, wide experience in radiant heat transfer and drying, chemical equilibria, etc. Has reviewed, analysed, and written brief reports and papers on climate change, Auckland, New Zealand
  38. Don J. Easterbrook, PhD, Emeritus Professor of Geology, Western Washington University, Bellingham, Washington, U.S.A.
  39. Ole Henrik Ellestad, former Research Director, applied chemistry SINTEF, Professor in physical chemistry, University of Oslo, Managing director Norsk Regnesentral and Director for Science and Technology, Norwegian Research Council, widely published in infrared spectroscopy, Oslo, Norway
  40. Per Engene, MSc, Biologist, Co-author – The Climate, Science and Politics (2009), Bø i Telemark, Norway
  41. Gordon Fulks, B.S., M.S., PhD (Physics, University of Chicago), cosmic radiation, solar wind, electromagnetic and geophysical phenomena, Portland, Oregon, U.S.A.
  42. Katya Georgieva, MSc (meteorology), PhD (solar-terrestrial climate physics), Professor, Space Research and Technologies Institute, Bulgarian Academy of Sciences, Sofia, Bulgaria
  43. Lee C. Gerhard, PhD, Senior Scientist Emeritus, University of Kansas, past director and state geologist, Kansas Geological Survey, U.S.A.
  44. Ivar Giaever PhD, Nobel Laureate in Physics 1973, professor emeritus at the Rensselaer Polytechnic Institute, a professor-at-large at the University of Oslo, Applied BioPhysics, Troy, New York, U.S.A.
  45. Albrecht Glatzle, PhD, ScAgr, Agro-Biologist and Gerente ejecutivo, Tropical pasture research and land use management, Director científico de INTTAS, Loma Plata, Paraguay
  46. Fred Goldberg, PhD, Adj Professor, Royal Institute of Technology (Mech, Eng.), Secretary General KTH International Climate Seminar 2006 and Climate analyst (NIPCC), Lidingö, Sweden
  47. Laurence I. Gould, PhD, Professor of Physics, University of Hartford, Past Chair (2004), New England Section of the American Physical Society, West Hartford, Connecticut, U.S.A.
  48. Vincent Gray, PhD, New Zealand Climate Coalition, expert reviewer for the IPCC, author of The Greenhouse Delusion: A Critique of Climate Change 2001, Wellington, New Zealand
  49. William M. Gray, PhD, Professor Emeritus, Dept. of Atmospheric Science, Colorado State University, Head of the Tropical Meteorology Project, Fort Collins, Colorado, U.S.A.
  50. Charles B. Hammons, PhD (Applied Mathematics), climate-related specialties: applied mathematics, modeling & simulation, software & systems engineering, Associate Professor, Graduate School of Management, University of Dallas; Assistant Professor, North Texas State University (Dr. Hammons found many serious flaws during a detailed study of the software, associated control files plus related email traffic of the Climate Research Unit temperature and other records and “adjustments” carried out in support of IPCC conclusions), Coyle, OK, U.S.A.
  51. William Happer, PhD, Professor, Department of Physics, Princeton University, Princeton, NJ, U.S.A.
  52. Hermann Harde, PhD, Professur f. Lasertechnik & Werkstoffkunde (specialized in molecular spectroscopy, development of gas sensors and CO2-climate sensitivity), Helmut-Schmidt-Universität, Universität der Bundeswehr Fakultät für Elektrotechnik, Hamburg, Germany
  53. Howard Hayden, PhD, Emeritus Professor (Physics), University of Connecticut, The Energy Advocate, Pueblo West, Colorado, U.S.A.
  54. Ross Hays, Meteorologist, atmospheric scientist, NASA Columbia Scientific Balloon Facility (currently working at McMurdo Station, Antarctica), Palestine, Texas, U.S.A.
  55. Martin Hovland, M.Sc. (meteorology, University of Bergen), PhD (Dr Philos, University of Tromsø), FGS, Emeritus Professor, Geophysics, Centre for Geobiology, University of Bergen, member of the expert panel: Environmental Protection and Safety Panel (EPSP) for the Ocean Drilling Program (ODP) and the Integrated ODP, Stavanger, Norway
  56. Ole Humlum, PhD, Professor of Physical Geography, Department of Physical Geography, Institute of Geosciences, University of Oslo, Oslo, Norway
  57. Craig D. Idso, PhD, Chairman of the Board of Directors of the Center for the Study of Carbon Dioxide and Global Change, Tempe, Arizona, U.S.A.
  58. Sherwood B. Idso, PhD, President, Center for the Study of Carbon Dioxide and Global Change, Tempe, Arizona, U.S.A.
  59. Larry Irons, BS (Geology), MS (Geology), Sr. Geophysicist at Fairfield Nodal (specialization: paleoclimate), Lakewood, Colorado, U.S.A.
  60. Terri Jackson, MSc (plasma physics), MPhil (energy economics), Director, Independent Climate Research Group, Northern Ireland and London (Founder of the energy/climate group at the Institute of Physics, London), United Kingdom
  61. Albert F. Jacobs, Geol.Drs., P. Geol., Calgary, Alberta, Canada
  62. Hans Jelbring, PhD Climatology, Stockholm University, MSc Electronic engineering, Royal Institute of Technology, BSc  Meteorology, Stockholm University, Sweden
  63. Bill Kappel, B.S. (Physical Science-Geology), B.S. (Meteorology), Storm Analysis, Climatology, Operation Forecasting, Vice President/Senior Meteorologist, Applied Weather Associates, LLC, University of Colorado, Colorado Springs, U.S.A.
  64. Olavi Kärner, Ph.D., Extraordinary Research Associate; Dept. of Atmospheric Physics, Tartu Observatory, Toravere, Estonia
  65. Leonid F. Khilyuk, PhD, Science Secretary, Russian Academy of Natural Sciences, Professor of Engineering (CO2/temp. focused research), University of Southern California, Los Angeles, California, U.S.A.
  66. William Kininmonth, MSc, MAdmin, former head of Australia’s National Climate Centre and a consultant to the World Meteorological Organization’s Commission for Climatology, Kew, Victoria, Australia
  67. Gerhard Kramm, Dr. rer. nat. (Theoretical Meteorology), Research Associate Professor, Geophysical Institute, Associate Faculty, College of Natural Science and Mathematics, University of Alaska Fairbanks, (climate specialties: Atmospheric energetics, physics of the atmospheric boundary layer, physical climatology – see interesting paper by Kramm et al), Fairbanks, Alaska, U.S.A.
  68. Leif Kullman, PhD (Physical geography, plant ecology, landscape ecology), Professor, Physical geography, Department of Ecology and Environmental science, Umeå University, Areas of Specialization: Paleoclimate (Holocene to the present), glaciology, vegetation history, impact of modern climate on the living landscape, Umeå, Sweden
  69. Hans H.J. Labohm, PhD, Independent economist, author specialised in climate issues, IPCC expert reviewer, author of Man-Made Global Warming: Unravelling a Dogma and climate science-related Blog, The Netherlands
  70. Rune Berg-Edland Larsen, PhD (Geology, Geochemistry), Professor, Dep. Geology and Geoengineering, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
  71. C. (Kees) le Pair, PhD (Physics Leiden, Low Temperature Physics), former director of the Netherlands Research Organization FOM (fundamental physics) and subsequently founder and director of The Netherlands Technology Foundation STW.  Served the Dutch Government many years as member of its General Energy Council and of the National Defense Research Council. Royal Academy of Arts and Sciences Honorary Medal and honorary doctorate in all technical sciences of the Delft University of technology, Nieuwegein, The Netherlands
  72. Douglas Leahey, PhD, meteorologist and air-quality consultant, past President – Friends of Science, Calgary, Alberta, Canada
  73. Jay Lehr, B.Eng. (Princeton), PhD (environmental science and ground water hydrology), Science Director, The Heartland Institute, Chicago, Illinois, U.S.A.
  74. Bryan Leyland, M.Sc., FIEE, FIMechE, FIPENZ, MRSNZ, consulting engineer (power), Energy Issues Advisor – International Climate Science Coalition, Auckland, New Zealand
  75. Edward Liebsch, B.A. (Earth Science, St. Cloud State University); M.S. (Meteorology, The Pennsylvania State University), former Associate Scientist, Oak Ridge National Laboratory; former Adjunct Professor of Meteorology, St. Cloud State University, Environmental Consultant/Air Quality Scientist (Areas of Specialization: micrometeorology, greenhouse gas emissions), Maple Grove, Minnesota, U.S.A.
  76. William Lindqvist, PhD (Applied Geology), Independent Geologic Consultant, Areas of Specialization: Climate Variation in the recent geologic past, Tiburon, California, U.S.A.
  77. Horst-Joachim Lüdecke, Prof. Dr., PhD (Physics), retired from University of Applied Sciences HTW, Saarbrücken (Germany), atmospheric temperature research, speaker of the European Institute for Climate and Energy (EIKE), Heidelberg, Germany
  78. Anthony R. Lupo, Ph.D., Professor of Atmospheric Science, Department of Soil, Environmental, and Atmospheric Science, University of Missouri, Columbia, Missouri, U.S.A.
  79. Oliver Manuel, BS, MS, PhD, Post-Doc (Space Physics), Associate – Climate & Solar Science Institute, Emeritus Professor, College of Arts & Sciences University of Missouri-Rolla, previously Research Scientist (US Geological Survey) and NASA Principal Investigator for Apollo, Cape Girardeau, Missouri, U.S.A.
  80. Francis Massen, professeur-docteur en physique (PhD equivalent, Universities of Nancy (France) and Liège (Belgium), Manager of the Meteorological Station of the Lycée Classique de Diekirch, specialising in the measurement of solar radiation and atmospheric gases. Collaborator to the WOUDC (World Ozone and UV Radiation Data Center), Diekirch, Luxembourg
  81. Henri Masson, Prof. dr. ir., Emeritus Professor University of Antwerp (Energy & Environment Technology Management), Visiting professor Maastricht School of Management, specialist in dynamical (chaotic) complex system analysis, Antwerp, Belgium.
  82. Ferenc Mark Miskolczi, PhD, atmospheric physicist, formerly of NASA’s Langley Research Center, Hampton, Virginia, U.S.A.
  83. Viscount Monckton of Brenchley, Expert reviewer, IPCC Fifth Assessment Report, Quantification of Climate Sensitivity, Carie, Rannoch, Scotland
  84. Nils-Axel Mörner, PhD (Sea Level Changes and Climate), Emeritus Professor of Paleogeophysics & Geodynamics, Stockholm University, Stockholm, Sweden
  85. John Nicol, PhD (Physics, James Cook University), Chairman – Australian Climate Science Coalition, Brisbane, Australia
  86. Ingemar Nordin, PhD, professor in philosophy of science (including a focus on “Climate research, philosophical and sociological aspects of a politicised research area”), Linköpings University, Sweden.
  87. David Nowell, M.Sc., Fellow of the Royal Meteorological Society, former chairman of the NATO Meteorological Group, Ottawa, Ontario, Canada
  88. Cliff Ollier, D.Sc., Professor Emeritus (School of Earth and Environment – see his Copenhagen Climate Challenge sea level article here), Research Fellow, University of Western Australia, Nedlands, W.A., Australia
  89. Oleg M. Pokrovsky, BS, MS, PhD (mathematics and atmospheric physics – St. Petersburg State University, 1970), Dr. in Phys. and Math Sciences (1985), Professor in Geophysics (1995), principal scientist, Main Geophysical Observatory (RosHydroMet), Note: Dr. Pokrovsky analyzed long climate records and concludes that anthropogenic CO2 impact is not the main contributor in climate change, St. Petersburg, Russia
  90. Daniel Joseph Pounder, BS (Meteorology, University of Oklahoma), MS (Atmospheric Sciences, University of Illinois, Urbana-Champaign); Meteorological/Oceanographic Data Analyst for the National Data Buoy Center, formerly Meteorologist, WILL AM/FM/TV, Urbana, U.S.A.
  91. Brian Pratt, PhD, Professor of Geology (Sedimentology), University of Saskatchewan (see Professor Pratt’s article for a summary of his views), Saskatoon, Saskatchewan, Canada
  92. Harry N.A. Priem, PhD, Professore-emeritus isotope-geophysics and planetary geology, Utrecht University, past director ZWO/NOW Institute of Isotope Geophysical Research, Past-President Royal Netherlands Society of Geology and Mining, Amsterdam, The Netherlands
  93. Oleg Raspopov, Doctor of Science and Honored Scientist of the Russian Federation, Professor – Geophysics, Senior Scientist, St. Petersburg Filial (Branch) of N.V.Pushkov Institute of Terrestrial Magnetism, Ionosphere and Radiowaves Propagation of RAS (climate specialty: climate in the past, particularly the influence of solar variability), Editor-in-Chief of journal “Geomagnetism and Aeronomy” (published by Russian Academy of Sciences), St. Petersburg, Russia
  94. Curt G. Rose, BA, MA (University of Western Ontario), MA, PhD (Clark University), Professor Emeritus, Department of Environmental Studies and Geography, Bishop’s University, Sherbrooke, Quebec, Canada
  95. S. Jeevananda Reddy, M.Sc. (Geophysics), Post Graduate Diploma (Applied Statistics, Andhra University), PhD (Agricultural Meteorology, Australian University, Canberra), Formerly Chief Technical Advisor—United Nations World Meteorological Organization (WMO) & Expert-Food and Agriculture Organization (UN), Convener – Forum for a Sustainable Environment, author of 500 scientific articles and several books – here is one: “Climate Change – Myths & Realities“, Hyderabad, India
  96. Arthur Rorsch, PhD, Emeritus Professor, Molecular Genetics, Leiden University, former member of the board of management of the Netherlands Organization Applied Research TNO, Leiden, The Netherlands
  97. Rob Scagel, MSc (forest microclimate specialist), Principal Consultant – Pacific Phytometric Consultants, Surrey, British Columbia, Canada
  98. Chris Schoneveld, MSc (Structural Geology), PhD (Geology), retired exploration geologist and geophysicist, Australia and France
  99. Tom V. Segalstad, PhD (Geology/Geochemistry), Associate Professor of Resource and Environmental Geology, University of Oslo, former IPCC expert reviewer, former Head of the Geological Museum, and former head of the Natural History Museum and Botanical Garden (UO), Oslo, Norway
  100. John Shade, BS (Physics), MS (Atmospheric Physics), MS (Applied Statistics), Industrial Statistics Consultant, GDP, Dunfermline, Scotland, United Kingdom
  101. Thomas P. Sheahen, B.S., PhD (Physics, Massachusetts Institute of Technology), specialist in renewable energy, research and publication (applied optics) in modeling and measurement of absorption of infrared radiation by atmospheric CO2,  National Renewable Energy Laboratory (2005-2009); Argonne National Laboratory (1988-1992); Bell Telephone labs (1966-73), National Bureau of Standards (1975-83), Oakland, Maryland, U.S.A.
  102. S. Fred Singer, PhD, Professor Emeritus (Environmental Sciences), University of Virginia, former director, U.S. Weather Satellite Service, Science and Environmental Policy Project, Charlottesville, Virginia, U.S.A.
  103. Frans W. Sluijter, Prof. dr ir, Emeritus Professor of theoretical physics, Technical University Eindhoven, Chairman—Skepsis Foundation, former vice-president of the International Union of Pure and Applied Physics, former President of the Division on Plasma Physics of the European Physical Society and former bureau member of the Scientific Committee on Sun-Terrestrial Physics, Euvelwegen, the Netherlands
  104. Jan-Erik Solheim, MSc (Astrophysics), Professor, Institute of Physics, University of Tromsø, Norway (1971-2002), Professor (emeritus), Institute of Theoretical Astrophysics, University of Oslo, Norway (1965-1970, 2002- present), climate specialties: sun and periodic climate variations, scientific paper by Professor Solheim “Solen varsler et kaldere tiår“, Baerum, Norway
  105. H. Leighton Steward, Master of Science (Geology), Areas of Specialization: paleoclimates and empirical evidence that indicates CO2 is not a significant driver of climate change, Chairman, PlantsNeedCO2.org and CO2IsGreen.org, Chairman of the Institute for the Study of Earth and Man (geology, archeology & anthropology) at SMU in Dallas, Texas, Boerne, TX, U.S.A.
  106. Arlin B. Super, PhD (Meteorology – University of Wisconsin at Madison), former Professor of Meteorology at Montana State University, retired Research Meteorologist, U.S. Bureau of Reclamation, Saint Cloud, Minnesota, U.S.A.
  107. Edward (Ted) R. Swart, D.Sc. (physical chemistry, University of Pretoria), M.Sc. and Ph.D. (math/computer science, University of Witwatersrand). Formerly Director of the Gulbenkian Centre, Dean of the Faculty of Science, Professor and Head of the Department of Computer Science, University of Rhodesia and past President of the Rhodesia Scientific Association. Set up the first radiocarbon dating laboratory in Africa. Most recently, Professor in the Department of Combinatorics and Optimization at the University of Waterloo and Chair of Computing and Information Science and Acting Dean at the University of Guelph, Ontario, Canada, now retired in Kelowna British Columbia, Canada
  108. George H. Taylor, B.A. (Mathematics, U.C. Santa Barbara), M.S. (Meteorology, University of Utah), Certified Consulting Meteorologist, Applied Climate Services, LLC, Former State Climatologist (Oregon), President, American Association of State Climatologists (1998-2000), Corvallis, Oregon, U.S.A.
  109. J. E. Tilsley, P.Eng., BA Geol, Acadia University, 53 years of climate and paleoclimate studies related to development of economic mineral deposits, Aurora, Ontario, Canada
  110. Göran Tullberg, Civilingenjör i Kemi (equivalent to Masters of Chemical Engineering), Co-author – The Climate, Science and Politics (2009) (see here for a review), formerly instructor of Organic Chemistry (specialization in “Climate chemistry”), Environmental Control and Environmental Protection Engineering at University in Växjö; Falsterbo, Sweden
  111. Brian Gregory Valentine, PhD, Adjunct professor of engineering (aero and fluid dynamics specialization) at the University of Maryland, Technical manager at US Department of Energy, for large-scale modeling of atmospheric pollution, Technical referee for the US Department of Energy’s Office of Science programs in climate and atmospheric modeling conducted at American Universities and National Labs, Washington, DC, U.S.A.
  112. Bas van Geel, PhD, paleo-climatologist, Institute for Biodiversity and Ecosystem Dynamics, Research Group Paleoecology and Landscape Ecology, Faculty of Science, Universiteit van Amsterdam, Amsterdam, The Netherlands
  113. Gerrit J. van der Lingen, PhD (Utrecht University), geologist and paleoclimatologist, climate change consultant, Geoscience Research and Investigations, Nelson, New Zealand
  114. A.J. (Tom) van Loon, PhD, Professor of Geology (Quaternary Geology, specialism: Glacial Geology), Adam Mickiewicz University, former President of the European Association of Science Editors, Poznan, Poland
  115. Fritz Vahrenholt, B.S. (chemistry), PhD (chemistry), Prof. Dr., Professor of Chemistry, University of Hamburg, Former Senator for environmental affairs of the State of Hamburg, former CEO of REpower Systems AG (wind turbines), Author of the book Die kalte Sonne: warum die Klimakatastrophe nicht stattfindet (The Cold Sun: Why the Climate Crisis Isn’t Happening), Hamburg, Germany
  116. Michael G. Vershovsky, Ph.D. in meteorology (macrometeorology, long-term forecasts, climatology), Senior Researcher, Russian State Hydrometeorological University, works with, as he writes, “Atmospheric Centers of Action (cyclones and anticyclones, such as Icelandic depression, the South Pacific subtropical anticyclone, etc.). Changes in key parameters of these centers strongly indicate that the global temperature is influenced by these natural factors (not exclusively but nevertheless)”, St. Petersburg, Russia
  117. Gösta Walin, PhD and Docent (theoretical Physics, University of Stockholm), Professor Emeritus in oceanografi, Earth Science Center, Göteborg University, Göteborg,  Sweden
  118. Anthony Watts, ItWorks/IntelliWeather, Founder, surfacestations.org and Watts Up With That, Chico, California, U.S.A.
  119. Carl Otto Weiss, Direktor und Professor at Physikalisch-Technische Bundesanstalt,  Visiting Professor at University of Copenhagen, Tokyo Institute of Technology, Coauthor of ”Multiperiodic Climate Dynamics: Spectral Analysis of…“, Braunschweig, Germany
  120. Forese-Carlo Wezel, PhD, Emeritus Professor of Stratigraphy (global and Mediterranean geology, mass biotic extinctions and paleoclimatology), University of Urbino, Urbino, Italy
  121. Boris Winterhalter, PhD, senior marine researcher (retired), Geological Survey of Finland, former professor in marine geology, University of Helsinki, Helsinki, Finland
  122. David E. Wojick, PhD,  PE, energy and environmental consultant, Technical Advisory Board member – Climate Science Coalition of America, Star Tannery, Virginia, U.S.A.
  123. George T. Wolff, Ph.D., Principal Atmospheric Scientist, Air Improvement Resource, Inc., Novi, Michigan, U.S.A.
  124. Thomas (Tom) Wysmuller, NASA (Ret.) ARC, GSFC, Hdq., Meteorologist, Ogunquit, ME, U.S.A.
  125. Bob Zybach, PhD (Environmental Sciences, Oregon State University), climate-related carbon sequestration research, MAIS, B.S., Director, Environmental Sciences Institute Peer review Institute, Cottage Grove, Oregon, U.S.A.
  126. Milap Chand Sharma, PhD, Associate Professor of Glacial Geomorphology, Centre for the Study of Regional Development, Jawaharlal Nehru University, New Delhi, India
  127. Valentin A. Dergachev, PhD, Professor and Head of the Cosmic Ray Laboratory at Ioffe Physical-Technical Institute of Russian Academy of Sciences, St. Petersburg, Russia
  128. Vijay Kumar Raina, Ex-Deputy Director General, Geological Survey of India, Ex-Chairman Project Advisory and Monitoring Committee on Himalayan glacier, DST, Govt. of India and currently Member Expert Committee on Climate Change Programme, Dept. of Science & Technology, Govt. of India, author of 2010 MoEF Discussion Paper, “Himalayan Glaciers – State-of-Art Review of Glacial Studies, Glacial Retreat and Climate Change”, the first comprehensive study on the region.  Winner of the Indian Antarctica Award, Chandigarh, India
  129. Scott Chesner, B.S. (Meteorology, Penn State University), KETK Chief Meteorologist, KETK TV, previously Meteorologist with AccuWeather, Tyler, Texas, U.S.A.

*   *   *

Reactions (I will not mention names here; all are from emails in the EANTH list)

1) “Hmm, I clicked on a few links, googled a few names. Found that when one is listed as “author of x book”, said book doesn’t appear on Amazon, etc.

Many non-PhDs.

Many “consultants”. Lots of “adjuncts”, lots of professors emeriti. Someone listed as an “Extraordinary Research Associate”.

Little actual data. Few peer-reviewed research reports.

Didn’t recognize most of the names. Did recognize some “suspicious” ones (e.g., Tim Ball, a lovely [sic] Canadian).

Misrepresentation of the SREX report (quotation is a minor comment on a single point of many – page 280 of 594).

Link is to a letter published in the Financial Post, the business section of the National Post, the more rightward-leaning of Canada’s two national papers. To give an indication, on the day in 2007 when the Nobel was awarded to the IPCC and Gore, the headline on the front page was “A Coup for Junk Science: Gaffe riddled work undeserving”.

Conclusion: don’t bother to click the link.”


2) “A few of the names on the list are also contained in table 3 of the 2012 Heartland Institute Proposed Budget (pages 7-8). Namely:

Craig D. Idso

Anthony Lupo

Susan Crockford

Joseph D’Aleo

Fred Singer

Robert Carter”

3) “Is it that bad guys without phd and associations with the wrong institutions nullify the legitimacy of the good guys with proper credentials? Suppose you could not look them up. Would you be unable to judge the contents (with links to data) of the letter? You seem to require a certain kind of authority (defined by political means especially) to allow you to decide whether ideas are valuable. How sad. If every scientist were that intellectually timid there would be no learning. Thank goodness for the Feynmans of the world.”


4) “Short of being able to read, review and test all the science, a person has to make judgements based on additional criteria. My criteria include, but are not limited to, some things such as peer-review, credentials, reputation, availability of cited sources/affiliation/expertise, guilt-by-association, and so on. They are only part of the judgement of credibility. I looked up a book listed in the credentials of one “expert” and could not find it; I followed links, and so on.

The endpoint was when I looked for the quotation in the cited source (SREX) and evaluated it as misrepresentation. Since the IPCC was called on as an expert source by the so-called experts, yet it claimed other than what they claimed it claimed, the credibility of the letters and listed experts is to be disparaged.

That’s the character of science, and process of knowledge. Yup, that is how one really, really does judge which ideas are valuable.”


5) “Thank you for drawing attention to this open letter. I suspect quite a number of the people named in this letter are members of naysayer groups. From an Australian perspective, Prof Bob Carter is a member of the secretive Lavoissier group. I have inside knowledge of this group as I was approached with my husband to write a film script about climate change many years ago and we pulled out eventually after being told what they wanted to say about the science of climate change, which required a distortion of the facts. We had the impression that the money for the film was coming from America and I wouldn’t mind betting that it was oil and mining interest finance ($6 million). The person who set out to recruit us was a glaciologist who was also a member of the Lavoissier group. For more information see the following:

Pearse, Guy, “High and Dry”, Viking/Penguin, Camberwell, Victoria, 2007.
Hamilton, Clive, “Scorcher: The Dirty Politics of Climate Change”, Black Inc. Agenda, Melbourne, Victoria, 2007.”


6) “I read a short and entertaining book that laid out a good process for deciding what to believe about climate change (or any other complex issue with lots of scientific research swirling around). It’s by Greg Craven and it’s called What’s the Worst That Could Happen?

Besides providing a way to cut through all the chatter, the book offers sound fundamentals for people interested in how scientific information comes to be accepted. I think it’s a great book for students, especially because the author (a physics teacher) tackles tough subjects with humor.

Here’s a link to it: http://www.amazon.com/Whats-Worst-That-Could-Happen/dp/0399535012/

Knowledge is not a determining factor in forming opinions about science (FAPESP)

Moral and political codes may influence attitudes toward scientific and technological issues far more strongly, researcher says (FAPESP)


By Elton Alisson

Agência FAPESP – Surveys on the public perception of science and technology, conducted in several countries including Brazil to gauge citizens’ opinions on scientific and technological topics, face the challenge of explaining which factors shape attitudes, interest, and engagement with these subjects.

That is because, of the set of indicators these surveys use to analyze which factors matter most in forming citizens’ interests and attitudes toward science and technology, such as income, education, age, and years of schooling, none can even minimally explain the variability of the responses.

“There is some other variable that we are not measuring that determines people’s attitudes toward science and technology in general,” said Juri Castelfranchi, professor in the Department of Sociology and Anthropology of the Faculty of Philosophy and Human Sciences (Fafich) at the Federal University of Minas Gerais (UFMG), during a lecture on the interpretive and methodological challenges of studying the public perception of science and technology, delivered on October 27 at the 2nd Empírika International Seminar.

Held on October 26 and 27 at the Institute of Language Studies (IEL) of the University of Campinas (Unicamp), the event was part of the program of the Ibero-American Science, Technology and Innovation Fair (Empírika).

According to Castelfranchi, one reason these surveys have difficulty determining which processes shape public opinion on the subject is that they rest on the poorly founded hypothesis that people’s attitudes toward scientific and technological issues are modulated by how much they know about those topics.

Traditionally, Castelfranchi said, most studies of why people accept or reject a line of scientific research or a new technology have focused on respondents’ interest, knowledge, and attitudes toward science and technology, on the assumption that these three aspects are linked.

On this view, uninterested people would have a low level of information and would tend, in general, to hold more negative attitudes toward science and technology. Conversely, stimulating such people’s interest in scientific and technological topics would improve their level of knowledge and, as a result, their attitudes toward science and technology would become more positive.

Field research, however, has shown that these premises are false and that the real situation is far more complex than this model, now discredited, allowed.

In general, according to the results of recent studies in the field, a large share of the population is highly interested in science and technology, but that interest does not translate into a search for information.

“There are groups with little schooling, especially in developing countries, who do not know about and do not seek information on science, yet who hold quite positive attitudes toward science and technology,” Castelfranchi said.

“Conversely, some studies have found that it is not true that people’s attitudes become more positive as their knowledge increases. In some cases the opposite occurs: they tend to become more cautious and critical,” he said.

The knowledge-versus-attitude paradox

According to Castelfranchi, one example that illustrates this apparent contradiction, dubbed the “knowledge-versus-attitude paradox,” is the question of transgenic crops in Europe.

The continent, one of the world’s biggest investors in science and technology, declared a moratorium on transgenic foods in the early 2000s after intense debates between segments of society for and against the technology, debates based on emotional appeal and on arguments more economic and political than scientific.

A survey carried out in 1998 and repeated in 2010 across the European Community on Europeans’ knowledge of and attitudes toward biotechnology applications, including transgenic foods and vaccines, found that risk was not the determining factor in the population’s rejection or acceptance of the new technology.

In many cases, respondents said that certain biotechnology applications were dangerous but useful and morally acceptable, and should be encouraged. In other cases, they judged certain applications, such as transgenics, to be not especially dangerous but politically and morally questionable, which led the technology to be rejected.

“It was not risk that was the most relevant factor in the rejection of transgenics in Europe, but political considerations, among them the fact that the technology is controlled by multinationals and patented, and that European countries opposed monocultures,” Castelfranchi said.

The survey also found that European citizens with the least knowledge did not reject transgenics; rather, they had no settled opinion about them. Respondents with more schooling, by contrast, held more clearly defined opinions, whether favorable or unfavorable.

“Knowledge did not change European citizens’ attitudes toward transgenics; what it did was give them a more defined attitude toward the technology, much as can be observed in Brazil and in other Ibero-American countries where similar surveys have been conducted,” Castelfranchi said.

In the most recent Public Perception of Science and Technology survey, conducted in late 2010 by the Ministry of Science, Technology and Innovation (MCTI) with more than 2,000 people across Brazil, no specific group, at any social or educational level, answered that technology brings more harm than benefit when asked.

Yet the respondents who were most familiar with scientists and research institutions were precisely the ones who most often declared that scientists can be dangerous because of the knowledge they possess.

“There is no association between low schooling and thinking that science is dangerous. On the contrary: highly educated people tend to take a more cautious stance on both the benefits and the harms presented by science and technology,” Castelfranchi said.

Moral and political values

In Brazil, one relevant factor influencing attitudes toward science and technology, identified by Castelfranchi and other researchers who analyzed the data from the MCTI survey, is the size of the city where respondents live.

The researchers found that respondents living in large Brazilian cities tend to weigh the pros and cons of technoscientific development when answering whether science and technology bring only benefits or harms, while people living in small towns are slightly more likely to say that science brings only benefits.

Yet neither this variable nor any other, such as respondents’ sex, can by itself explain the variability of answers on whether science and technology bring more benefit or harm.

“None of the factors analyzed so far determines whether people take a more optimistic or pessimistic position on science and technology. There are other factors, which we still need to identify, that influence this answer,” Castelfranchi said.

One hypothesis the researcher raises is that people’s moral and political codes, such as religion, may do more than their level of knowledge to determine their opinions on specific aspects of science and technology.

Among participants in the MCTI public-perception survey, those who identified as Catholic agreed more than evangelicals with one of the statements presented in the study: that, because of their knowledge, scientists hold powers that make them dangerous, and that science must be socially controlled.

“People’s life trajectories and orientations and their moral values probably exert a much greater influence in modulating their attitudes toward science and technology in general, and toward specific aspects of research, than their level of knowledge does,” Castelfranchi estimates.

Testing this hypothesis, according to the researcher, will require developing new qualitative and quantitative methodologies and large amounts of ethnographic observation to see how people position themselves in relation to science and technology, abandoning the idea that this depends only on level of knowledge.

“We need to renew our research methodologies and the way we look at and interpret the data from public-perception surveys in order to understand how people make sense of and form opinions about scientific and technological questions, so that we have a dynamic view of how their attitudes take shape,” Castelfranchi said.

How To Think About Science, Part 1 – 24 (CBC)

Friday, January 2, 2009

If science is neither cookery, nor angelic virtuosity, then what is it?
Modern societies have tended to take science for granted as a way of knowing, ordering and controlling the world. Everything was subject to science, but science itself largely escaped scrutiny. This situation has changed dramatically in recent years. Historians, sociologists, philosophers and sometimes scientists themselves have begun to ask fundamental questions about how the institution of science is structured and how it knows what it knows. David Cayley talks to some of the leading lights of this new field of study.

Episode Guide

Episode 1 – Steven Shapin and Simon Schaffer
Episode 2 – Lorraine Daston
Episode 3 – Margaret Lock
Episode 4 – Ian Hacking and Andrew Pickering
Episode 5 – Ulrich Beck and Bruno Latour
Episode 6 – James Lovelock
Episode 7 – Arthur Zajonc
Episode 8 – Wendell Berry
Episode 9 – Rupert Sheldrake
Episode 10 – Brian Wynne
Episode 11 – Sajay Samuel
Episode 12 – David Abram
Episode 13 – Dean Bavington
Episode 14 – Evelyn Fox Keller
Episode 15 – Barbara Duden and Silja Samerski
Episode 16 – Steven Shapin
Episode 17 – Peter Galison
Episode 18 – Richard Lewontin
Episode 19 – Ruth Hubbard
Episode 20 – Michael Gibbons, Peter Scott, and Janet Atkinson-Grosjean
Episode 21 – Christopher Norris and Mary Midgley
Episode 22 – Allan Young
Episode 23 – Lee Smolin
Episode 24 – Nicholas Maxwell

Earthquake Hazards Map Study Finds Deadly Flaws (Science Daily)

ScienceDaily (Aug. 31, 2012) — Three of the largest and deadliest earthquakes in recent history occurred where earthquake hazard maps didn’t predict massive quakes. A University of Missouri scientist and his colleagues recently studied the reasons for the maps’ failure to forecast these quakes. They also explored ways to improve the maps. Developing better hazard maps and alerting people to their limitations could potentially save lives and money in areas such as the New Madrid, Missouri fault zone.

“Forecasting earthquakes involves many uncertainties, so we should inform the public of these uncertainties,” said Mian Liu, of MU’s department of geological sciences. “The public is accustomed to the uncertainties of weather forecasting, but foreseeing where and when earthquakes may strike is far more difficult. Too much reliance on earthquake hazard maps can have serious consequences. Two suggestions may improve this situation. First, we recommend a better communication of the uncertainties, which would allow citizens to make more informed decisions about how to best use their resources. Second, seismic hazard maps must be empirically tested to find out how reliable they are and thus improve them.”

Liu and his colleagues suggest testing maps against what is called a null hypothesis, the possibility that the likelihood of an earthquake in a given area — like Japan — is uniform. Testing would show which mapping approaches were better at forecasting earthquakes and subsequently improve the maps.
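The kind of test Liu describes can be sketched as a toy calculation. The example below (all numbers invented for illustration; the authors' actual testing procedures are more involved) scores a hazard map's regional forecasts against the uniform-hazard null model with a simple Bernoulli log-likelihood:

```python
import numpy as np

# Hypothetical forecast probabilities of a large quake in each of five
# regions over some test interval, as read off an imagined hazard map.
map_forecast = np.array([0.30, 0.10, 0.05, 0.05, 0.50])

# Null hypothesis: every region is equally likely to host a quake.
uniform_forecast = np.full_like(map_forecast, map_forecast.mean())

# Observed outcome over the interval: 1 = large quake occurred, 0 = not.
observed = np.array([0, 1, 0, 0, 1])

def log_likelihood(p, obs):
    """Bernoulli log-likelihood of the observations under forecast p."""
    return np.sum(obs * np.log(p) + (1 - obs) * np.log(1 - p))

ll_map = log_likelihood(map_forecast, observed)
ll_null = log_likelihood(uniform_forecast, observed)

# A positive gain means the map forecast the observed quakes better than
# the uniform null; a negative gain means it did worse.
print(f"map vs. null log-likelihood gain: {ll_map - ll_null:.3f}")
```

Repeated over many regions and intervals, a score of this general kind is one way a mapping approach could be compared empirically against the null and against rival approaches.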

Liu and his colleagues at Northwestern University and the University of Tokyo detailed how hazard maps had failed in three major quakes that struck within a decade of each other. The researchers interpreted the shortcomings of hazard maps as the result of bad assumptions, bad data, bad physics and bad luck.

Wenchuan, China — In 2008, a quake struck China’s Sichuan Province and cost more than 69,000 lives. Locals blamed the government and contractors for not making buildings in the area earthquake-proof, according to Liu, who says that hazard maps bear some of the blame as well since the maps, based on bad assumptions, had designated the zone as an area of relatively low earthquake hazard.

Léogâne, Haiti — The 2010 earthquake that devastated Port-au-Prince and killed an estimated 316,000 people occurred along a fault that had not caused a major quake in hundreds of years. Hazard maps based only on the short history of earthquakes recorded since seismometers were invented, roughly one hundred years ago, did not indicate the danger there.

Tōhoku, Japan — Scientists previously thought the faults off the northeast coast of Japan weren’t capable of causing massive quakes and thus giant tsunamis like the one that destroyed the Fukushima nuclear reactor. This bad understanding of particular faults’ capabilities led to a lack of adequate preparation. The area had been prepared for smaller quakes and the resulting tsunamis, but the Tōhoku quake overwhelmed the defenses.

“If we limit our attention to the earthquake records in the past, we will be unprepared for the future,” Liu said. “Hazard maps tend to underestimate the likelihood of quakes in areas where they haven’t occurred previously. In most places, including the central and eastern U.S., seismologists don’t have a long enough record of earthquake history to make predictions based on historical patterns. Although bad luck can mean that quakes occur in places with a genuinely low probability, what we see are too many ‘black swans,’ or too many exceptions to the presumed patterns.”

“We’re playing a complicated game against nature,” said the study’s first author, Seth Stein of Northwestern University. “It’s a very high stakes game. We don’t really understand all the rules very well. As a result, our ability to assess earthquake hazards often isn’t very good, and the policies that we make to mitigate earthquake hazards sometimes aren’t well thought out. For example, the billions of dollars the Japanese spent on tsunami defenses were largely wasted.

“We need to very carefully try to formulate the best strategies we can, given the limits of our knowledge,” Stein said. “Understanding the uncertainties in earthquake hazard maps, testing them, and improving them is important if we want to do better than we’ve done so far.”

The study, “Why earthquake hazard maps often fail and what to do about it,” was published by the journal Tectonophysics. First author of the study was Seth Stein of Northwestern University. Robert Geller of the University of Tokyo was co-author. Mian Liu is William H. Byler Distinguished Chair in Geological Sciences in the College of Arts and Science at the University of Missouri.

Skeptical Uses of ‘Religion’ in Debate on Climate Change (The Yale Forum on Climate Change & The Media)

Michael Svoboda   August 27, 2012

‘Religion’ and religion-inspired terms — savior, prophet, priests, heretic, dogma, crusade — are regularly used in efforts to influence public attitudes about climate change. But how does this language work, and on whom?

Over the past several months The Yale Forum has published a series of articles describing how major religious groups across America address climate change. Within the broader societal debate on this issue, however, the voices heard in these pieces may be outnumbered by those of a group with a very different take on the connections between religion and the environment: climate skeptics.

Since 2005, in op-eds published in newspapers (The Wall Street Journal, The Washington Examiner, The Washington Post, and The Washington Times), in magazines (Forbes, National Review, The Weekly Standard), and online (Fox News and Townhall and also climate-specific websites like Watts Up with That), conservative commentators have repeatedly described global warming as a religion.

So how does this use of religious language affect the public understanding of climate change? To answer this question, the Forum analyzed more than 250 op-eds, blog posts, and books published between 2005 and the present. The results suggest that this religious language may be most effective in fortifying the opinions of those using it: Calling global warming a “religion” effectively neutralizes appeals to “the scientific consensus.”

Taking the Measure of the Meme

To take your own quick measure of the global-warming-as-religion (hereafter GWAR) meme, try two related searches at Google: first search for “climate change” and “religion,” then for “global warming” and “religion.” The top ten items from the Forum’s two most recent searches (20 items in all) broke down as follows:

  • 10% were by religious groups calling for action on climate change,
  • 25% were about religious groups calling for action on climate change,
  • 10% were against religious groups opposed to action on climate,
  • 50% described concern for global warming as a religion, and
  • 5% rebutted those who described concern for global warming as a religion.

Based on this sample, one is more likely to encounter an article or op-ed about global warming as a religion than an article or op-ed explaining how or whether a particular religious group addresses climate change.

The dominance of the GWAR meme is even greater when one looks specifically at conservative venues. Over the past year, approximately 100 op-ed pieces that touched on global warming were published in nationally recognized conservative newspapers and/or by nationally syndicated columnists whose work is aggregated by Townhall. Ten of these pieces equated accepting the science on global warming with religious belief; none offered a religious argument for action on climate change.

During the peak years of Al Gore’s An Inconvenient Truth (2006–2008), the ratio was far higher. Roughly 40% of the more than 150 conservative op-eds penned in response to the documentary, to its Academy Award, or to Al Gore’s Nobel Peace Prize included language (prophet, priests, savior, crusade, faith, dogma, heresy, etc.) that framed concern for climate change as a religious belief. Some drew that analogy explicitly. (See, for example, Richard Lindzen’s March 8, 2007, op-ed piece for The Wall Street Journal and The Daily Mail (UK) — “Global Warming: The Bogus Religion of Our Age.”)

And since then several climate skeptics — Christopher Horner (2007), Iain Murray (2008), Roy Spencer (2008), Christopher Booker (2009), Ian Wishart (2009), Steve Goreham (2010), Larry Bell (2011), Brian Sussman (2012), and Robert Zubrin (2012) — have included the GWAR meme in their books.

A Brief History of the Global-Warming-as-Religion Meme

The global-warming-as-religion meme is an offshoot of the environmentalism-as-religion meme, which, according to New American Foundation fellow and Arizona State University Law Professor Joel Garreau, can be traced back to religious critiques of Lynn White’s 1967 essay in Science, “The Historical Roots of Our Ecologic Crisis.” By pinning the ecological blame on the Judaeo-Christian tradition’s instrumental view of nature, these authors argued, White seemed to call for the revival of nature worship.

Elements of these early critiques were reworked in what is perhaps the most well-known instance of the environmentalism-as-religion argument, Michael Crichton’s speech to the Commonwealth Club of San Francisco in 2003.

The first* example of the more specific global-warming-as-religion claim appears to be the aside in Republican Senator James Inhofe’s January 4, 2005, “update” to his “greatest hoax” speech: “Put simply, man-induced global warming is an article of religious faith.” Using slightly different language, Inhofe repeated this charge a few months later in his “Four Pillars of Climate Alarmism” speech.**

In between these two speeches, in a February 16, 2005, editorial for Capitalism Magazine by American Policy Center President Tom DeWeese, the GWAR meme gained titular status: “The New Religion Is Global Warming.”

But the most fully developed version of the global-warming-as-religion analogy is the nearly 5,000-word essay published on the Web in 2007 by retired British mathematician John Brignell — who cites Crichton’s 2003 speech in his opening paragraph.

The more generic environmentalism-as-religion meme now seems confined to Earth Day, which Emory University economics professor Paul Rubin described in an April 22, 2010, WSJ op-ed piece as environmentalism’s “holy day.” Two recent examples, from this past April, were provided by former business consultant W.A. Beatty and by Dale Hurd, a “news veteran” for the Christian Broadcasting Network.

The GWAR meme appears as opportunities — cool summers; early, late, or heavy snowstorms; or scandals — arise. And its meaning can vary accordingly.

Nature/Climate as Sacred

Some of the first American “environmentalists” — Henry David Thoreau, Ralph Waldo Emerson, John Muir — often used religious language. Nature was where they most vividly experienced the presence of God. But when contemporary environmentalists use quasi-religious language without explicitly avowing a particular faith, their opponents may suspect that nature itself has become the object of their worship. When James Lovelock named his homeostatic model of the planet and its atmosphere after the ancient Greek earth goddess, Gaia, he provided a new ground for this suspicion.

For conservatives, there are strong and weak versions of this charge.

The strong charge is “paganism”: that environmentalists or climate activists/scientists worship nature in ways akin to the practices of the Egyptian, Mesopotamian, Greek, and Roman empires in which the ancient Jews and early Christians lived. This strong charge is typically leveled by evangelicals who publicly profess their own faith. Physicist James Wanliss and his colleagues — whose book and DVD, Resisting the Green Dragon, offer “A Biblical Response to One of the Greatest Deceptions of the Day” — provide perhaps the most vivid example.

The weak version reduces the charge of paganism to misplaced values. Very arch religious language may still be used, but the meaning is now metaphorical. In these more frequent instances of the GWAR meme, conservatives accuse climate activists/scientists of essentializing climate, of being too willing to slow or even disable our economic engine because they believe Earth has an “optimal climate.”

Climate Science as Cult

“Cult” implies that a given set of beliefs or practices is arcane, outside the mainstream, and insular. Someone embedded in a cult will not acknowledge conflicting evidence. So whenever new facts or dramatic events challenge the validity of climate science, at least in the minds of conservative skeptics, “cult of global warming” op-eds appear. Major snowstorms, cold snaps, and years that fail to surpass 1998’s average annual temperature provide these new “facts.”

Odd religious news can also prompt “cult of global warming” op-eds. The third no-show of Harold Camping’s apocalypse provided the prompt, last fall, for op-eds by Michael Barone and Derek Hunter. (The “cult” in the title of Michael Barone’s piece, however, may be the work of the Post’s editor; the same piece appeared under a different title in The Washington Examiner.)

Climate Science as Corrupt Orthodoxy

But it’s hard to depict a thoroughly institutionalized effort like climate science as a cult. The international undertaking that is science is more plausibly compared with the Roman Catholic Church. And for climate skeptics, the best of the many possible instances of that church is the Roman Catholic Church of the late Renaissance, the church that condemned both Luther and Galileo.

The very Nobel public profiles of Al Gore and the IPCC, from 2006 to 2008, prompted many comparisons with priests and popes, cardinals and curia. Add in carbon offsets and the Reformation riffs practically wrote themselves. Conservative columnist Charles Krauthammer’s March 16, 2007, column in Time exemplifies this subgenre:

In other words, the rich reduce their carbon output by not one ounce. But drawing on the hundreds of millions of net worth in the Kodak theatre [for the “carbon-neutral” 2007 Academy Awards], they pull out lunch money to buy ecological indulgences. The last time the selling of pardons was prevalent — in a predecessor religion to environmentalism called Christianity — Martin Luther lost his temper and launched the Reformation.

(It should be noted, however, that climate activists and environmental journalists have themselves sometimes written about their ecological “sins.”)

While green hypocrisy was the primary target of Krauthammer’s 2007 column, orthodoxy and dogma are always at least secondary targets in this use of the GWAR meme. And shots were taken at them in a February 9, 2007, National Review column by Rich Lowry; a May 30, 2008, Washington Post column by Charles Krauthammer; a March 9, 2009, Townhall piece by Robert Knight; a January 13, 2010, Townhall column by Walter E. Williams; a November 29, 2011, Wall Street Journal column by Bret Stephens; and, most recently, an April 26, 2012, post by David Solway. This is the most common use of the GWAR meme.

Dissenting Religions and the Scientific Consensus

But one might argue that by depicting climate scientists and activists as members of an aloof and self-serving (and possibly self-deluding) priesthood, conservatives are themselves engaged in religious posturing, for self-righteous dissent is part of the DNA of the western religious tradition.

Ancient Israel was a small country surrounded by much more powerful empires. Some heroes of the Bible — e.g., Daniel, Shadrach, Meshach, and Abednego — worked as trusted bureaucrats within state-ecclesiastical systems based on cosmologies they did not believe in. When ordered to consent to the beliefs of their rulers, they refused.

During the Protestant Reformation religious dissent often became political dissent. Today’s evangelicals are dissenters from mainstream denominations that dissented first from the Church of England and then from King George. Now they dissent from Washington.

But in the U.S., Roman Catholics too can view themselves as a dissenting minority, as, for example, when the Catholic Bishops objected to parts of the new healthcare law.

In fact, Americans are so primed for dissensus that both sides in the climate debate find it plausible to claim the mantle of Galileo.

In the run-up to the December 2009 conference in Copenhagen, cartoonists Michael Ramirez and David Horsey published cartoons that drew exactly opposite conclusions from the history of science, including Galileo’s conflict with the Roman Catholic Church regarding Copernicus’s heliocentric model of the solar system.

Within this charged religious history, a steadfast minority (of Jews, early Christians, Protestants, or Puritans) has been correct more often than the majority, than the broader cultural consensus (of Egyptians/Assyrians/Babylonians/Persians, Greeks/Romans, Roman Catholics, or Anglicans). Thus the GWAR meme not only legitimizes dissent (because everyone is entitled to his or her own religious views), it also provides emotional reinforcement for it (because the “official” religion is almost always “false”). The Protestant vs. Catholic variant of the meme also reinforces climate skeptics’ narratives about greedy and scheming scientists and/or self-serving elites. For those who use it, the GWAR meme effectively inoculates them against “the scientific consensus.”

Managing the Meme

Much has been said and published by religious leaders trying to promote action on climate change. But these messages must compete against the global-warming-as-religion meme reinforced regularly in op-eds sent out by The Wall Street Journal to its two million plus subscribers and, more frequently, in columns posted by Townhall for its two million unique monthly visitors.

Are there counter-measures for this meme?

In his summer 2010 article in The New Atlantis, Joel Garreau, New American Foundation fellow and Lincoln Professor of Law, Culture and Values (Sandra Day O’Connor College of Law, Arizona State University), traced the emergence of environmentalism as a secular religion. In that piece, Garreau speculated that “the two faces of religious environmentalism — the greening of mainstream religion and the rise of carbon Calvinism — may each transform the political and policy debate over climate change.” In response to an e-mailed query from The Yale Forum, after stressing that he did not “conflate faith-based environmentalism with the scientific study of climate,” Garreau explained his “pragmat[ic]” outlook: “I just lay out the facts (as startling as they may be to some), observe that faith-based systems are ubiquitous in history, and then ask, in public policy terms, how you deal with this situation.”

Garreau said he is not surprised that “climate change deniers [might] wish to point out the ironies of faith-based environmentalism rising up in parallel with scientific environmentalism.” But he said he does not think that would have much effect. He suggested no countermeasures but did anticipate a possible line of attack: “It would hardly be surprising if there were a few under-examined pieties in their own world view.”

From the title of University of Maryland School of Public Policy professor Robert H. Nelson’s 2010 book, The New Holy Wars: Economic Religion vs. Environmental Religion in Contemporary America, one might infer that the playing field for climate policy might be leveled by calling attention to the equally religious faith in economics, in economic growth in particular. But what would be gained from a “religious” standoff between economics and environmentalism? In response to an e-mail question, Nelson listed three benefits:

First, … it helps us to understand … the … intensity of the disagreements about climate policy. Second, it offers a note of caution to all participants, given [that past] religious disagreements have too often escalated beyond all reason …. Third, … [s]eeing economics and environmentalism as religions, and discussing them as such, [would bring their] core value assumptions to the surface.

In other words, pushing back with the same religious language might be an effective countermeasure, at least initially. Then, Nelson added, “a secular religious ‘ecumenical movement’” could perhaps resolve the tensions between economics and environmentalism.

One clearly should proceed with caution in pursuing any “religious” countermeasures. The cultural and historical associations evoked by religious language do not necessarily favor “consensus,” especially a consensus presented in authoritative terms. In American history, religious groups have splintered far more often than they have united.

Bottom line: Climate communicators should expect and prepare for religious language. But they should weigh the subtle cultural messages religious language carries before deciding whether or how to use or respond to it.

*If readers know of an earlier example, please send the reference and/or the link to the author.
**Brian McCammack’s September 2007 American Quarterly article, “Hot Damned America: Evangelicalism and the Climate Policy Debate,” pointed the way to these two speeches by Senator Inhofe.

Michael Svoboda

Michael Svoboda (PhD, Hermeneutics) is an Asst. Prof. of Writing at The George Washington University. Previously the owner of an academic bookstore, he now tracks and analyzes efforts to communicate climate change, including the stream of research and policy published by NGOs. E-mail: msvoboda@yaleclimatemediaforum.org

Scientists Point to Problems in Press Coverage of Climate Change (Fapesp)

Experts gathered in São Paulo to discuss risk management for climate extremes express concern over the difficulties journalists face in dealing with the complexity of the subject (Wikimedia)

By Fábio de Castro

Agência FAPESP – In the assessment of experts gathered in São Paulo to discuss the management of risks from climate extremes and disasters, adequately managing the impacts of these events requires informing society – including public policymakers – about the findings of climate science.

Researchers are concerned, however, about the difficulties encountered in communicating with society. The complexity of climate studies tends to generate distortions in press coverage, and the result can be a threat to public trust in science.

The assessment came from participants in the workshop “Managing the Risks of Climate Extremes and Disasters in Central and South America – What Can We Learn from the IPCC Special Report on Extremes?”, held last week in the São Paulo state capital.

The event’s purpose was to debate the conclusions of the Special Report on Managing the Risks of Climate Extremes and Disasters (SREX) – prepared and recently published by the Intergovernmental Panel on Climate Change (IPCC) – and to discuss options for managing the impacts of climate extremes, especially in South and Central America.

The workshop was organized by FAPESP and the National Institute for Space Research (Inpe), in partnership with the IPCC, the Overseas Development Institute (ODI) and the Climate and Development Knowledge Network (CDKN), both of the United Kingdom, with support from the Climate and Pollution Agency of Norway’s Ministry of Foreign Affairs.

During the event, the topic of communication was debated by IPCC-SREX authors, experts on climate extremes, and managers and leaders of disaster-prevention institutions.

According to Vicente Barros, of the Center for Sea and Atmosphere Research at the University of Buenos Aires and a member of the IPCC, the panel began a restructuring process three years ago that includes a change in its communication strategy.

“From 2009 on, the IPCC came under violent attack, and we were not prepared for that, because our role was to disseminate the knowledge acquired, not to translate it for the press. We now have a group of journalists who provide that mediation, but we cannot dilute the information too much, and the final word on any communication always rests with the executive committee, because the political weight of what the panel says is very great,” Barros said.

Language is a major problem, according to Barros. If it is too complex, it does not reach the public; if it is oversimplified, it tends to distort the conclusions and spread views that do not correspond to reality.

“The IPCC deals with very complex problems, and we admit that we cannot produce outreach that reaches everyone. That is a problem. I believe communication should remain in the hands of journalists, but it may be necessary to invest in training for these professionals,” he said.

Fábio Feldman, of the São Paulo Forum on Climate Change, expressed concern that scientists’ difficulties in communicating with the public allow “skeptic” researchers – those who deny human influence on climate change – to gain ever more space in the media and in public debate.

“I am concerned by the growing space given to denialists in public debate. The press believes it must always apply the principle of balance, giving equal space and weight to the different positions,” he said.

According to Feldman, scientists – especially those linked to the IPCC – should take a more proactive stance in countering the “skeptics” in public debate.

Different positions

For Reynaldo Luiz Victoria, of the coordination of the FAPESP Research Program on Global Climate Change, it is important that the press treat the different positions more even-handedly.

“There are specific cases in which the press treats issues in a less than even-handed – and occasionally sensationalist – way, but I think that we, as researchers, are under no obligation to react. The press should seek us out to provide a counterpoint and inform the public,” Victoria told Agência FAPESP.

Victoria nonetheless stressed the importance of the “skeptics” also being heard. “Some are serious scientists and deserve even-handed treatment. They certainly cannot be ignored, but when they make contestable claims, the press should find someone who can offer a counterpoint. Journalists need to come to us, not the other way around,” he said.

Overall, press coverage of climate change is satisfactory, according to Victoria. “The good newspapers publish accurate articles, and there are very serious journalists producing high-quality material,” he noted.

For Luci Hidalgo Nunes, a professor in the Department of Geography at the State University of Campinas (Unicamp), denialists gain space because a polemical message often has more media appeal than the complexity of scientific knowledge.

“A scientist may present a well-founded argument that the public finds tedious. Meanwhile, a researcher with poorly structured arguments can deliver a message that is simplified – and therefore attractive to the public – and polemical, which makes headlines,” she told Agência FAPESP.

Although good science carries an inherent disadvantage in public debate because of its complexity, Nunes believes it is important that the press remain pluralistic. She published a study analyzing one year of climate change coverage in the newspaper O Estado de S. Paulo; one of the main strengths she observed was that different positions were given a voice.

“I am in favor of the press doing its job and laying out all the parameters so that a democratic debate can take place. I think this is being done well, and the press itself is open to giving us more space. But we need to speak up to create those opportunities,” she said.

Nunes also considers press coverage of climate change to have been generally satisfactory, if uneven. “The subject gains prominence at certain moments but does not stay permanently on the news agenda,” she said.

According to her, the subject stood out especially in 2007, with the publication of the IPCC’s Fourth Assessment Report, and in 2012 during Rio+20.

“In 2007 the coverage was intense, but the popularization of the subject also opened the door to distortion and exaggeration. Sensationalism is bad for science: it puts the subject in the headlines quickly for a while, but in the medium term the effect is the reverse – people notice the exaggerations and come to view scientific results in general with distrust,” she said.

Science and Culture: What Do They Have in Common? (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The question was the theme of the round table “Communicating Science and Culture,” held at the 64th Annual Meeting of the Brazilian Society for the Advancement of Science (SBPC), which ends today (the 27th) in São Luís.

For Ildeu de Castro Moreira, director of Popularization and Diffusion of Science and Technology at the Ministry of Science, Technology and Innovation (MCTI) and an SBPC board member, the debate over the relationship between science and art matters because they are two fundamental facets of human culture. “What science, art and culture have in common is the creativity inherent to human beings,” he said, explaining that art and science are human, social activities grounded in creativity and curiosity.

A physicist and science communicator, Ildeu spoke of the “scientific imagination present in the minds of artists,” explaining that science too has aesthetic concerns and bears similarities to art. For him, there is beauty in scientific theories. “Mathematical equations and physical formulas are beautiful. They can seem dull in the classroom, but with the help of an artist’s eye it is possible to show that beauty. We need to learn to see the beauty of science, just as we have to learn to see much of contemporary art,” he said.

For Ildeu, the connections between science and art are important for making science communication reach the public more easily. In his talk, he showed artistic expressions that speak of science, with examples from poetry, music, samba-school themes, popular sayings and cordel literature.

Children’s audiences – In her round-table presentation, Luisa Medeiros Massarani, a journalist and head of Fiocruz’s Museu da Vida in Rio de Janeiro, spoke about science communication initiatives aimed at children. “Experience has shown great receptiveness among children – greater than among adults and adolescents – mainly because of children’s curiosity; they are regarded as ‘natural scientists,’” she explained.

Luisa noted the growth of science museums in Brazil, which now number around 200, although they remain concentrated in a few regions. “Museums hold incredible appeal for children, and they are also important for the communicator, who sees the child’s reaction on the spot,” she said. Although children make up much of the museum audience, Luisa said spaces designed specifically for them are still needed, from smaller furniture to suitable interactive activities.

She argues that children should be treated as important social actors in the science communication process. “Communicating science to children is not about talking at them one-way; the child must be an important actor and protagonist in the process,” she explained, adding that the experience of a science fair or a museum visit stays in a child’s memory, can shape his or her education, and can spark an interest in science.

The head of the Museu da Vida cited exhibitions, books and publications aimed at children, and stressed the importance of evaluating these experiences with the children afterward, to know which way to go next.

Ildeu suggested that artists take part more actively in SBPC meetings – not only in parallel events such as SBPC Cultural, but as members of panels and debates alongside scientists. The idea is to take advantage of the meeting’s audience, which reaches 15,000 to 20,000 people, to talk about this relationship.

(Jornal da Ciência)

Scientific particles collide with social media to benefit of all (Irish Times)

The Irish Times – Thursday, July 12, 2012

Large Hadron Collider at Cern: the research body now has 590,000 followers on Twitter


IN 2008 CERN switched on the Large Hadron Collider (LHC) in Geneva – around the same time it sent out its first tweet. Although the first outing of the LHC didn’t go according to plan, the Twitter account gained 10,000 followers within the first day, according to James Gillies, head of communications at Cern.

Speaking at the Euroscience Open Forum in Dublin this week, Gillies explained the role social media plays in engaging the public with the particle physics research its laboratory does. The Twitter account now has 590,000 followers and Cern broke important news via it in March 2010 by joyously declaring: “Experiment have seen collisions.”

“Why do we communicate at Cern? If you talk to the scientists who work there they will tell you it’s a good thing to do and they all want to do it,” Gillies said, adding that Cern is publicly funded so engaging with the people who pay the bills is important.

When the existence of the Higgs particle was announced last week, it wasn’t an exclusive press event. Live video was streamed across the web, questions were taken not only from journalists but also from Twitter followers, and Cern used this as a chance to announce jobs via Facebook.

While Cern appears to be the social media darling of the science world, other research institutes and scientists are still weighing up the pros and cons of platforms like Facebook, Twitter or YouTube.

There is a certain stigma attached to social networking sites, not just because much of the content is perceived as banal, but also because too much tweeting could be damaging to your image as a scientist.

Bora Zivkovic is blogs editor at Scientific American, organiser of the fast-growing science conference ScienceOnline and speaker at the social media panel this Saturday at the Euroscience Open Forum. He says the adoption of social media by scientists is slow but growing.

“Academics are quite risk-averse and are shy about trying new things that have a perceived potential to remove the edge they may have in the academic hierarchy, either through lost time or lost reputation.”

Zivkovic talks about fear of the “Sagan effect”, named after the late Carl Sagan. A talented astronomer and astrophysicist, he was loved by the public but snubbed by the science community.

“Many still see social media as self-promotion, which is still in some scientific circles viewed as a negative thing to do. The situation is reminiscent of the very slow adoption of email by researchers back in the early 1990s.

“Once the scientists figure out how to include social media in their daily workflow, realise it does not take away from their time but actually makes them more effective in reaching their academic goals, and realise that the ‘Sagan effect’ on reputation is a thing of the past, they will readily incorporate social media into their normal work.”

Many researchers still rely heavily on specialist mailing lists. The broadcast capability of social media is far greater and more tailored, claims Dr Matthew Rowe, research associate at the Knowledge Media Institute with the Open University.

“If I was to email people about some recent work I would presume that it would be marked as spam. However, if I was to announce the release of some work through social media, then a debate and conversation could evolve surrounding the topic; I have seen this happen many times on Facebook.”

Conversations on social media sites are often seen as trivial – for scientists, the end goal is “publish or perish”. Results must be published in a reputable academic journal and preferably cited by those in their area.

Twitter, it seems, can help. A 2011 paper from researcher Gunther Eysenbach found a correlation between Twitter activity and highly cited articles. The microblogging site may help citation rate or serve as a measure of how “citable” your paper may be.

In addition, a 2010 survey on Twitter found one-third of academics said they use it for sharing information with peers, communicating with students or as a real-time news source.

For some the argument for social media is the potential for connecting with volunteers and providing valuable data from the citizen scientist. Yolanda Melero Cavero’s MinkApp has connected locals with an effort to control the mink population in Scotland.

“The most interesting thing about MinkApp, for me, was the fact that the scientist was able to get 600 volunteers for her ecological study. Social media has the grassroots potential to engage with willing volunteers,” says Nancy Salmon, researcher at the department of occupational therapy at the University of Limerick.

Rowe offers some sage social media advice for academics: stay on topic and keep your language jargon-free.

But there’s always room for humour, as demonstrated by the Higgs boson jokes on Twitter and Facebook last week. As astronomer Phil Plait tweeted: “I’ve got 99.9999% problems, but a Higgs ain’t one.”

A New Boson in Sight (FAPESP)

Physicists at Cern have discovered a new particle that appears to be the Higgs boson

MARCOS PIVETTA | Online edition, 7:46 p.m., July 4, 2012

Proton collisions in which four high-energy electrons are observed (green lines and red towers). The event shows characteristics expected from the decay of a Higgs boson but is also consistent with standard-model background processes

From Lindau (Germany)*

The world’s largest laboratory may have found the particle that gives mass to all other particles: the long-sought Higgs boson. It was the missing piece of the scientific puzzle known as the standard model, the theoretical framework developed over recent decades to explain the particles and forces present in the visible matter of the Universe. After analyzing trillions of proton collisions produced in 2011 and part of this year at the Large Hadron Collider (LHC), physicists from the two largest experiments run independently at the European Organization for Nuclear Research (Cern) announced on Wednesday (July 4), near Geneva, Switzerland, the discovery of a new particle with nearly all the characteristics of the Higgs boson – though they cannot yet say for certain whether it is specifically that boson or some other kind.

“We observe in our data clear signs of a new particle, in the mass region around 126 GeV (giga-electron-volts),” said physicist Fabiola Gianotti, spokesperson for the Atlas experiment. “But we need a little more time to prepare the results for publication.” The findings from Cern’s other experiment, CMS, are virtually identical. “The results are preliminary, but the signals we see around the 125 GeV mass region are dramatic. It really is a new particle. We know it must be a boson, and it is the heaviest boson we have found,” said CMS spokesperson, physicist Joe Incandela. If its mass is indeed 125 or 126 GeV, the new particle is about as heavy as an atom of iodine.

In both experiments, the statistical confidence of the analyses reached the level scientists call 5 sigma, at which the chance that the signal is a mere statistical fluctuation is roughly one in 3.5 million. At that level of certainty, one can speak of a discovery; what remains unknown is the detailed nature of the particle found. “It is incredible that this discovery has happened in my lifetime,” commented Peter Higgs, the British theoretical physicist who, 50 years ago, alongside other scientists, predicted the existence of this type of boson. A paper with the LHC data is to be submitted to a scientific journal later this month. More data should be produced by the two experiments by the end of the year, when the accelerator will be shut down for at least a year and a half of maintenance.
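The “5 sigma” confidence level corresponds to an ordinary tail probability of the normal distribution. As a minimal sketch (in Python, using the one-sided convention common in particle physics; the helper name is ours), the odds of such a fluctuation can be computed directly:

```python
import math

def sigma_to_p(n_sigma: float) -> float:
    """One-sided tail probability of a standard normal variable beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

p = sigma_to_p(5.0)
print(f"p-value at 5 sigma: {p:.3g}")        # about 2.87e-07
print(f"odds of a fluke: ~1 in {1.0 / p:,.0f}")
```

This is where the figure of roughly one chance in 3.5 million comes from; a two-sided convention would double the p-value.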

“I’ve been laughing all day”
In Lindau, a small town in southern Germany on the shore of Lake Constance, at the border with Austria and Switzerland, where the 62nd Nobel Laureate Meeting is taking place this week, researchers celebrated the news from the Cern experiments. Since this year’s meeting was dedicated to physics, there was no shortage of winners of science’s highest honor to comment on the feat. “We don’t know if it is the (Higgs) boson, but it is a boson,” said theoretical physicist David J. Gross of the University of California, winner of the 2004 Nobel for the discovery of asymptotic freedom. “I’ve been laughing all day.” Experimental physicist Carlo Rubbia, former director-general of Cern and winner of the 1984 Nobel for work leading to the identification of two types of bosons (the W and the Z), took the same line: “We are looking at a milestone,” he said.

Perhaps with somewhat less enthusiasm, but still acknowledging the enormous importance of the finding at Cern, two other Nobel laureates gave their views on the news of the day. “It is something we have been awaiting for years,” said Dutch theoretical physicist Martinus Veltman, who received the prize in 1999. “The standard model has gained a further degree of validity.” For American cosmologist George Smoot, winner of the 2006 Nobel for the discovery of the anisotropy of the cosmic microwave background (a relic of the Big Bang, the primordial explosion that created the Universe), it should still take two or three years for scientists to know what kind of new particle has actually been discovered. If it is not the Higgs boson, Smoot said, it would be “wonderful if it were something related to dark matter,” a mysterious component that, alongside visible matter and the even less understood dark energy, is thought to be one of the pillars of the Universe.

Particles with the properties of the Higgs boson cannot be measured directly, but their existence, however fleeting, leaves traces that can be detected in a particle accelerator as powerful as the LHC. Unstable and short-lived, Higgs bosons survive a tiny fraction of a second before decaying into lighter particles, which in turn decay into lighter particles still. The standard model predicts that, depending on its mass, the Higgs boson should decay through different channels – that is, into distinct combinations of lighter particles, such as two photons or four leptons. In the Cern experiments, which involved some 6,000 physicists, nearly unequivocal evidence was found of the decay modes that would be the typical signature of the Higgs boson.

*Journalist Marcos Pivetta traveled to Lindau at the invitation of the DAAD (German Academic Exchange Service)

Rio+20 Without Science (Mundo Sustentável, G1)

Sat, June 16, 2012

by André Trigueiro

After five days gathered at the Pontifical Catholic University of Rio de Janeiro (PUC-RJ), 500 scientists from 75 countries – six of them Nobel laureates – produced a forceful report summarizing the state of the planet. Among other things, they say that “there is convincing scientific evidence that the current development model is undermining the planet’s capacity to respond to human assaults.” They express concern that “current levels of production and consumption could cause irreversible and catastrophic changes for humanity.” But they assert that “we have the knowledge and creativity to build a new path. We must, however, race against time.”

Chemistry Nobel laureate Yuan Tseh Lee, of Taiwan, was chosen by his colleagues for a nearly impossible mission: to summarize for the heads of state at Riocentro, in just two minutes, the most important points of the report. Will a mere 120 seconds be enough to instill in the world’s main leaders the necessary sense of urgency? Well, that was the time set by UN protocol. I asked Dr. Lee what the report’s most important message was.

“We do not have much more time to transform society, to make it sustainable. If we continue at this pace, things will only get worse. We will be in serious trouble,” he said, before closing with a flash of confidence in the future: “We have no right to be pessimistic. I am happy to see so many young people in Rio.”

Also at the meeting was climatologist Carlos Nobre, who this week had the honor of writing the editorial of the prestigious scientific journal Science, under the suggestive title “UNsustainable?” (with the UN initials capitalized at the start of the word), in which he stated that the world “has left the safety zone.” I asked him whether the political class is listening to the scientists’ warnings.

“We are having difficulty conveying the sense of urgency to all decision makers. Time may be the scarcest resource in sustainable development.” Asked what was at stake if the scientists’ recommendations were ignored by decision makers, the current secretary for Research and Development Policies and Programs at the Ministry of Science, Technology and Innovation answered with unconcealed concern: “The risk of exceeding certain planetary boundaries exists. Resources are not infinite, nor is the Earth’s capacity to absorb shocks. In the case of climate, for example, we are probably already operating outside the safety margin.”

For Carlos Nobre, “the urgency of the planetary situation demands equally urgent decisions and immediate action. The distance between what scientists perceive as urgent and the responses given by the political system is the mismatch.”

I left PUC intrigued not only by the forcefulness of yet another warning from the scientific community, but also by the absence of journalists interested in covering the largest parallel science event of Rio+20. Are we in the press also out of step with the important findings the scientific community is unveiling? Is this a subject reserved for specialized media, or should all journalists and communicators open more space – especially in times of crisis – for what scientists are saying? It is worth reflecting on. And, above all, acting on.

USP Scientists Remain Faithful to the IPCC (Jornal do Campus-USP) + letter from Ricardo Felício

Experts defend the credibility of the world climate panel and weigh in on global warming

The IPCC has been going through a credibility crisis since November of last year, when errors were found in the report that earned the panel the 2007 Nobel Peace Prize. The most serious was a prediction that the Himalayan glaciers would melt by 2035. Around the same time, e-mail exchanges among its scientists came to light suggesting that research denying global warming would not be evaluated by the IPCC. The affair became known as Climategate.

The events served as ammunition for the skeptics – those who maintain that global warming is a natural phenomenon with precedents throughout history and has no relation to human activity on the planet.

The controversy returned to the media spotlight in February, when the panel’s executive secretary resigned. The institution’s chairman, Rajendra Pachauri, immediately announced that he would spare no effort to propose a set of measures ensuring greater scientific rigor in the reports and tighter oversight of the experts who produce them.

No crisis

Tércio Ambrizzi believes that greater care can prevent incorrect data. Even so, “viewed as a whole, the IPCC report is very solid,” he says. More than a thousand pages are reviewed, he notes, and that volume leaves room for the occasional error to slip through. As for the published e-mails, Ambrizzi says the invasion of privacy is extremely dangerous. “In science, that is not how you prove a scientific result is wrong. You prove it with science.”

Paulo Artaxo shares this view, also emphasizing the effort the scientists put into the report: “Having two citations that deserve correction does not invalidate the intense work of thousands of scientists over many years.”

Professor Ilana Wainer, chair of the Research Committee of the Oceanographic Institute (IO), goes further in defending the IPCC’s credibility, saying she sees no crisis “as far as the science is concerned, which is irrefutable and deterministic.”

The IO scientist is emphatic that Climategate was based on an outright theft of e-mails from researchers at the climate research center at East Anglia (United Kingdom). “There were more than 1,000 e-mails; the ‘skeptics’ tried to discredit climate change science on that basis, and all they managed to find was the odd personal e-mail. There is a solid scientific foundation supporting the claim that the global warming of the last four or five decades comes from human activity.”

(illustration: Hugo Neto)

Global warming

For Artaxo, humans are significantly altering several aspects of the planet. He cites as examples the burning of fossil fuels over the last 150 years and changes in land use, such as the intensive replacement of forests with plantations: “The additional accumulation of greenhouse gases in the atmosphere has raised our planet’s average temperature by 0.7 degrees Celsius over the last 150 years.”

Ambrizzi and Ilana are likewise categorical in affirming global warming and human interference. “Global warming is happening, and human participation in it is unequivocal,” says the IO professor.

Aretha Sanchez, a lawyer and author of research on climate and environmental change conducted at the Institute for Energy and Nuclear Research (Ipen), says that climate changes are documented in records through time. “These changes occur due to factors external or internal to the Earth – and among the internal factors is the human presence,” says Aretha.

Giant iceberg

Asked whether the breaking off of the giant iceberg in Antarctica is related to global warming, Ilana explains: “On the continental side there is accumulation of snow and ice; on the ocean side a process known as calving occurs – the sudden release and breaking off of a mass of ice from a glacier. The ice that breaks off can be classified as an iceberg. The detachment of this large iceberg may occur normally as part of the glacier’s mass balance. Global warming does favor more intense calving and a greater frequency of icebergs, but it is not necessarily associated with their size.”

*   *   *

Carta aberta de Ricardo Augusto Felicio, professor de climatologia do Departamento de Geografia da FFLCH, endereçada ao Jornal do Campus (JC) da USP

Lamentável e repugnante a matéria deste jornal da primeira quinzena de março de 2010, informando que os cientistas da USP permanecem fiéis ao
IPCC <http://www.jornaldocampus.usp.br/index.php/2010/03/cientistas-da-usp-continuam-fieis-ao-ipcc/>.

Vocês deveriam se retratar em público por tamanho absurdo. Somos muitos os pesquisadores desta instituição que negam as imbecilidades pregadas, em
forma de dogma, da patifaria imposta por ONGs, ONU e interesses de governos internacionais.

Cientista não pode ser fiel, muito menos a um órgão político da ONU que nada tem de científico. O jornal ainda peca ao falar dos 2000 cientistas. Eles não devem passar atualmente de 100 ou 200. Só em 2008, mais de 600 caíram fora, alegando que não mais participariam deste conluio. O número real expressa um avolumando contingente de membros de ONGs, políticos e burocratas que nada tem a ver com ciência. Esta é a realidade que custa a ser demonstrada aqui no Brasil.

While the fight abroad is fierce, fueled by the almost weekly scandals found in the dealings of the IPCC and its henchmen, our
press stays silent, failing to carry the great daily discussions on the subject that we see in other countries.

Only pseudoscientists, engaged in economic interests, bow to the IPCC. And from what we can see, we have many of them in here.

So we issue the challenge, exactly as is done abroad: *show the evidence!* We state in advance that we will not accept "I think" or "I believe," the output
of computer models, or dogma.

The great proof that they have nothing is their flight from debate and their ridiculous plans, tied to the use of the "precautionary principle, because in the absence of full scientific certainty, immediate mitigation measures must be taken."

What is the purpose of serious, dedicated scientific research if, in the end, the answer is given in advance? If global warming were real, we should take mitigation measures; but if it is not proven (as it is not), we should take *exactly the same measures*, merely as a precaution?

What future remains for climate science if it is no longer heard, since all decisions in its name have already been made? Not to mention the idea of consensus, since everyone has already admitted that man causes "global warming," which is also conflated with "climate change." In these statements alone we
can see how totally contradictory they are.

Not to mention that they also say the debates are over. How can the discussions be closed if they never took place?

They want to overhaul the entire daily routine of human activity on the basis of lies?! This is completely absurd! The racket has taken on a life of its own. It is high time it was duly neutralized.

Spending public funds on the Brazilian Panel on Climate Change (PBMC) will be a fabulous way to make public money disappear, money that could be put to very good use tackling a real problem: basic sanitation in Brazil!

As for the newspaper's impartiality, it left much to be desired.

Ricardo Augusto Felicio holds a degree in Atmospheric Sciences (Meteorology) from USP, a master's in Meteorology from the Instituto Nacional de Pesquisas
Espaciais, and a doctorate in Geography (Physical Geography) from USP.

Perspective: Troubled by Interdisciplinarity? (Science)

Career Advice

By Stephanie Pfirman and Melissa Begg

April 06, 2012

Do program managers and senior faculty tell you “that idea is not really in my bailiwick, and I’m not sure where else to send you”? Do you spend more time choosing a publication venue than writing your paper? Are you asked to be on committees and panels to provide a “fresh perspective” — and then told you spend too much time on service? Is your e-mail full of correspondence about how to handle overhead, subawards, and subcontracts on collaborative proposals?

If any of these descriptions apply to you, you may be suffering from the pain and inconvenience of interdisciplinarity, one of the fastest-growing problems among researchers today. It’s not a problem that goes away on its own. Rather, it festers if it’s not addressed, diminishing creativity and productivity.

Despite the pain and inconvenience, increasing numbers of scientists are pursuing interdisciplinary career paths, and a growing proportion of research funding opportunities from federal granting agencies is interdisciplinary. In May 2011, 30% to 40% of all requests for proposals from the National Science Foundation and the National Institutes of Health explicitly required an interdisciplinary approach.

Interdisciplinarity can be wonderfully rich and rewarding, but there are dangers attendant to choosing this non-traditional route. Interdisciplinary scholars go “out on a limb” and “often must fight for identity, recognition, roles, legitimacy, and standing.” This takes a personal — as well as a professional — toll: While the status of their peers grows with accomplishments within the disciplinary community, interdisciplinary scholars have to “live without the comfort of expertise” and often without the comfort of community. Scholars report that they no longer fit in as well after they leave their disciplinary base.

This connection between research direction and community fit is supported by the 2003 Faculty Worklife Survey conducted by the Women in Science and Engineering Leadership Institute at the University of Wisconsin, Madison. Respondents who believed their colleagues did not perceive their research as "mainstream" felt more negative about how colleagues valued their research, about their respect in the workplace, about departmental decision-making and informal departmental interactions, and about their overall isolation and "fit."

The messages from a number of recent publications can be distilled to this: Interdisciplinary research doesn’t fit into traditional academic structures. Therefore, if you choose this route, the onus is on you to take additional steps to become aware of the pitfalls and prepare yourself to succeed in this arena.

What kinds of steps are we talking about? Our recommendations include building skills for interdisciplinary collaboration, extending your mentorship team, bolstering your interdisciplinary CV for disciplinary review, and preparing for the complications of writing and submitting interdisciplinary grant proposals.

Recommendations for interdisciplinary scholars

Prepare yourself for new ways of working, thinking, and interacting.

• Specialize within your interdisciplinary research area. Avoid the tendency of many interdisciplinary scholars to branch out too quickly and in too many directions, which can diffuse your impact.

• Focus on your disciplinary strength and skills. It may sound counterintuitive, but in many situations your value as an interdisciplinary colleague is directly proportional to your skills in your own discipline. Keep up with the latest literature and theoretical developments in your disciplinary field so that you will be prepared to apply new knowledge and skills in diverse areas.

• Build core competencies that sustain interdisciplinary research by taking courses or learning on your own. For example, you could take courses that use the case study method to enhance interdisciplinary skills or include practice reviewing interdisciplinary papers and proposals.

• Attend seminars and workshops in other disciplines. Participating in research seminars outside your own department is a great way to expand your thinking, add a new batch of colleagues to your network, and develop expertise in new research areas.

• Seek new mentorship. The old model of one scholar, one mentor is fast becoming a distant memory. Find a mentor or two from beyond your field to help broaden your mindset and approaches.

When preparing manuscripts and grant applications, enhance your credibility as a successful researcher whose work crosses traditional disciplinary boundaries.

• Include a cover letter with your paper or proposal that highlights its interdisciplinary nature and suggests reviewers with complementary expertise so that all of your research aims receive appropriate review.

• Frame research aims to satisfy the needs of both disciplinary-leaning reviewers and interdisciplinary-eager granting agencies. Incorporating conceptual models and grounding your ideas within the disciplines establishes common ground with diverse reviewers.

• Involve respected colleagues with expertise in the techniques you plan to use.

• Try to have at least one publication in each field in which you propose to work. If the work requires an area you haven’t published in, get a letter of support from a well-known investigator in that field offering assistance.

• Start early on budget preparation for collaborative proposals. Most interdisciplinary endeavors are collaborative — and collaborative grant activities have financial implications, with potential revenue losses to departments due to diversion of overhead costs to other units. It may sound like a minor issue, but the most aggravating problem identified in the 2004 report of the National Academies Committee on Facilitating Interdisciplinary Research (CFIR) was the logistics of interdisciplinary research: budget control, institutional cost recovery, space, unit reporting, and award agreements. More than 40% of scholars and provosts picked one of these as the top impediment to interdisciplinary projects. A recent study found that faculty and administrators at universities with overhead-sharing policies reported satisfaction with their policies, and most felt that they indeed helped to foster interdisciplinary science.

• Use the Kulage study to support budget negotiations. Your colleagues and administrators may be resistant at first to innovations like overhead sharing, so showing them evidence of the effectiveness of overhead sharing may help you close the deal and reach an agreement that recognizes and rewards the contributions of the interdisciplinary collaborators involved in your proposal.

It’s never too early to start thinking about tenure and promotion. You need to plan for a portfolio that withstands the scrutiny of discipline-oriented review committees while also allowing you to pursue interdisciplinary interests. You can take steps to prepare yourself for rigorous evaluation by disciplinary and interdisciplinary reviewers.

• Annotate your CV to explain your contributions to collaborative publications and grants. While this task may seem onerous, if you don’t do it, people have to guess, and they often guess wrong. Increasingly, journals require people to clarify their roles in publications, and some institutions now require that CVs articulate not only specific roles but also the percentage of effort devoted to various activities. Use such policies to your advantage.

• Ground your research statement. As with proposals, incorporating conceptual models and explaining connections to key disciplinary theories and approaches helps to contextualize your work for reviewers with diverse backgrounds.

• Seek a spectrum of reviewers. If asked to suggest reviewers to evaluate your work and advise your tenure or promotion review panel, be sure to include experts from multiple departments or from outside of the institution. Choose experts who can address the particular research areas you work in. For example, you might propose one letter writer who could attest to your disciplinary strength. Another might emphasize how another field is using your research. This could broaden the perspective of the review panel and permit consideration of less traditional CVs.

If you’re on the job market, look for institutions and departments that really value interdisciplinarity. In 2004, more than 10% of scholars identified “strategic plans” as the top impediment to interdisciplinary research. Seven years later, some institutions are finally tackling this: Take a look at the case studies of Ohio University and Macalester College in the National Council for Science and the Environment report. Fostering interdisciplinarity is a strategic decision at the institutional level, but integration of interdisciplinarity into departmental missions is key. Check to see if these pieces are in place at the institution you’re thinking of working for. You can use the NIH template for interdisciplinary offer letters as a mental checklist as you discuss expectations with the chair of the search. You don’t want to come across as too demanding, but having this model letter in mind will help you think of questions to ask about the position.

When push comes to shove, department chairs and supervisors often look askance at activities they perceive to be “extra-departmental.” As noted in a 2011 article in the Journal of Environmental Studies and Sciences,

There is a significant and growing need for interdisciplinary … scholars to develop, teach, and apply successful problem-solving approaches and to educate the next generation of scholars and professionals. Yet such professionals often work in departments where most of their colleagues are disciplinarians and the reward and incentive system is based on disciplines or is at best multidisciplinary. They need diverse strategies and support to overcome the many difficulties that they face day to day in research, teaching, and administration, as well as over the course of their careers.

Increasingly, institutions are addressing what is perhaps the single most vexing problem identified by the 2004 CFIR report: promotion criteria, which 15% of provosts and faculty members identified as the top impediment. Some institutions have turned to using the Boyer criteria of discovery, integration, application, and teaching, rather than focusing mainly on discovery (often with passing reference to teaching). Beyond these traditional criteria, Boyer’s “integration” criterion, in particular, is important in the evaluation of interdisciplinary research. “Application” can also be important. These are all positive signs that smoother sailing may be ahead.

Interdisciplinary research is laudable and undeniably enriching. But until academia’s reward system catches up to its desire for interdisciplinary collaboration, researchers — especially early-career investigators — must take additional steps to prepare for and protect themselves from choppy waters ahead.


Boyer, E.L. (1990). Scholarship Reconsidered: Priorities of the Professoriate. Jossey-Bass, New York.

Clark, S.G., M.M. Steen-Adams, S. Pfirman, and R.L. Wallace (2011). Professional Development of Interdisciplinary Environmental Scholars. Journal of Environmental Studies and Sciences.

Collins, J.P. (2002). May you live in interesting times: Using multidisciplinary and interdisciplinary programs to cope with change in the life sciences. BioScience 52: 75-83.

Committee on Facilitating Interdisciplinary Research (2004). Facilitating Interdisciplinary Research. National Academy of Sciences, National Academy of Engineering, Institute of Medicine.

Heemskerk, M., K. Wilson, and M. Pavao-Zuckerman (2003). Conceptual models as tools for communication across disciplines. Conservation Ecology 7(3): 8. [online]

Kulage, K.M., E.L. Larson, and M.D. Begg (2011). Sharing facilities and administrative cost recovery to facilitate interdisciplinary research. Academic Medicine 86: 394-401.

Larson, E.L., T.F. Landers, and M.D. Begg (2011). Building Interdisciplinary Research Models: A Didactic Course to Prepare Interdisciplinary Scholars and Faculty. Clinical and Translational Science 4(1): 38–41.

Lattuca, L.R. (2001). Creating Interdisciplinarity: Interdisciplinary Research and Teaching among College and University Faculty. Nashville, TN: Vanderbilt University Press.

Pfirman, S. and P. Martin (2010). Fostering Interdisciplinary Scholars. In R. Frodeman, J. Thompson Klein, and C. Mitcham (eds.), The Oxford Handbook of Interdisciplinarity. Oxford University Press, 624 pp.

Pfirman, S., P. Martin, A. Danielson, R.M. Goodman, M. Steen-Adams, C. Waggett, J. Mutter, T. Rikakis, M. Fletcher, L. Berry, D. Hornbach, M. Hempel, B. Morehouse, and R. Southard (2011). Interdisciplinary Hiring and Career Development: Guidance for Individuals and Institutions. National Council for Science and the Environment.

Porter, A.L., A.S. Cohen, J.D. Roessner, and M. Perreault (2007). Measuring Researcher Interdisciplinarity. Scientometrics 72(1): 117-147.

WISELI (2003). Study of Faculty Worklife at the University of Wisconsin-Madison.

Stephanie Pfirman is Hirschorn Professor and co-chair of the environmental science department at Barnard College and a member of Columbia University’s Earth Institute faculty, both in New York City. Melissa Begg is Professor and Vice Dean for Education at the Mailman School of Public Health and Co-Director of the Irving Institute for Clinical and Translational Research at Columbia University in New York.

Open letter to President Dilma Rousseff – Climate change: time to recover common sense

Open letter to President Dilma Rousseff
Climate change: time to recover common sense
São Paulo, May 14, 2012