Category archive: technological mediation

>The pain of rejection (FAPESP)

>
Science Communication
29/3/2011

Study indicates that the feeling of rejection after the end of a romantic relationship and the physical pain of an injury activate the same regions of the brain (reproduction)

Agência FAPESP – The pain of rejection is not just a figure of speech: it is as real as physical pain. According to new research, intense experiences of social rejection activate the same brain areas that respond to painful sensory experiences.

“The results give new meaning to the idea that social rejection ‘hurts’,” said Ethan Kross of the University of Michigan, who led the study.

The findings will be published this week on the website of Proceedings of the National Academy of Sciences and soon in the journal’s print edition.

“At first glance, spilling a cup of hot coffee on yourself and thinking about a person with whom you recently went through an unexpected breakup seem to cause different kinds of pain, but our study shows they are more alike than was thought,” said Kross.

Previous studies had indicated that the same brain regions support the emotionally distressing feelings that accompany the experience of both physical pain and social rejection.

The new research shows a neural interrelation between these two types of experience: a shared set of brain areas that becomes active when a person has painful sensations, physical or otherwise. Kross and colleagues identified these regions: the somatosensory cortex and the dorsal posterior insula.

The study included 40 volunteers who had gone through the unexpected end of a romantic relationship in the previous six months and who said they felt rejected because of it.

Each participant completed two tasks, one related to the feeling of rejection and the other to responses to physical pain, while their brains were scanned by functional magnetic resonance imaging.

“We found that strongly induced feelings of social rejection activate the same brain regions involved in the sensation of physical pain, areas that are rarely activated in neuroimaging studies of emotion,” said Kross.

The article Social rejection shares somatosensory representations with physical pain (doi:10.1073/pnas.1102693108), by Ethan Kross and others, will soon be available to PNAS subscribers at http://www.pnas.org/cgi/doi/10.1073/pnas.1102693108.

>Can a group of scientists in California end the war on climate change? (Guardian)

>
The Berkeley Earth project say they are about to reveal the definitive truth about global warming

Ian Sample
guardian.co.uk
Sunday 27 February 2011 20.29 GMT

Richard Muller of the Berkeley Earth project is convinced his approach will lead to a better assessment of how much the world is warming. Photograph: Dan Tuffs for the Guardian

In 1964, Richard Muller, a 20-year-old graduate student with neat-cropped hair, walked into Sproul Hall at the University of California, Berkeley, and joined a mass protest of unprecedented scale. The activists, a few thousand strong, demanded that the university lift a ban on free speech and ease restrictions on academic freedom, while outside on the steps a young folk-singer called Joan Baez led supporters in a chorus of We Shall Overcome. The sit-in ended two days later when police stormed the building in the early hours and arrested hundreds of students. Muller was thrown into Oakland jail. The heavy-handedness sparked further unrest and, a month later, the university administration backed down. The protest was a pivotal moment for the civil liberties movement and marked Berkeley as a haven of free thinking and fierce independence.

Today, Muller is still on the Berkeley campus, probably the only member of the free speech movement arrested that night to end up with a faculty position there – as a professor of physics. His list of publications is testament to the free rein of tenure: he worked on the first light from the big bang, proposed a new theory of ice ages, and found evidence for an upturn in impact craters on the moon. His expertise is highly sought after. For more than 30 years, he was a member of the independent Jason group that advises the US government on defence; his college lecture series, Physics for Future Presidents, was voted best class on campus, went stratospheric on YouTube and, in 2009, was turned into a bestseller.

For the past year, Muller has kept a low profile, working quietly on a new project with a team of academics hand-picked for their skills. They meet on campus regularly, to check progress, thrash out problems and hunt for oversights that might undermine their work. And for good reason. When Muller and his team go public with their findings in a few weeks, they will be muscling in on the ugliest and most hard-fought debate of modern times.

Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking is easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet’s temperature.

Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller’s dream seem so ambitious, or perhaps, so naive.

“We are bringing the spirit of science back to a subject that has become too argumentative and too contentious,” Muller says, over a cup of tea. “We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find.” Why does Muller feel compelled to shake up the world of climate change? “We are doing this because it is the most important project in the world today. Nothing else comes close,” he says.

Muller is moving into crowded territory with sharp elbows. There are already three heavyweight groups that could be considered the official keepers of the world’s climate data. Each publishes its own figures that feed into the UN’s Intergovernmental Panel on Climate Change. Nasa’s Goddard Institute for Space Studies in New York City produces a rolling estimate of the world’s warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth’s mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75C.

You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.

Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia’s Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller’s nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. “With CRU’s credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics,” says Muller.

This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. “Scientists will jump to the defence of alarmists because they don’t recognise that the alarmists are exaggerating,” Muller says.

The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. “You can think of statisticians as the keepers of the scientific method,” Brillinger told me. “Can scientists and doctors reasonably draw the conclusions they are setting down? That’s what we’re here for.”

For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious “dark energy” that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.

Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde’s achievement to Hercules’s enormous task of cleaning the Augean stables.
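
The pipeline described here (trawl, compile, de-duplicate, merge) can be pictured with a minimal sketch. This is an illustration in Python under invented formats and station IDs, not the Berkeley Earth code:

```python
from collections import defaultdict

def normalize(record, source):
    """Map one raw record from a source-specific format to a common schema."""
    if source == "dataset_a":            # hypothetical format: (id, year, month, temp in C)
        sid, year, month, temp = record
    else:                                # hypothetical format: dict with Fahrenheit temps
        sid = record["station"]
        year, month = record["date"]
        temp = (record["temp_f"] - 32) * 5.0 / 9.0
    return sid, (year, month), float(temp)

def merge(sources):
    """Pool records under a common key; duplicates collapse to their mean."""
    readings = defaultdict(list)         # (station, (year, month)) -> temps
    for source, records in sources.items():
        for rec in records:
            sid, ym, temp = normalize(rec, source)
            readings[(sid, ym)].append(temp)
    return {key: sum(vals) / len(vals) for key, vals in readings.items()}

merged = merge({
    "dataset_a": [("ST001", 1910, 6, 14.2)],
    "dataset_b": [{"station": "ST001", "date": (1910, 6), "temp_f": 57.6}],
})
print(merged)  # the two overlapping records collapse to one station-month value
```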

The wealth of data Rohde has collected so far – some of it dating back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, of itself, Muller claims, be a priceless resource for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.

Publishing an extensive set of temperature records is the first goal of Muller’s project. The second is to turn this vast haul of data into an assessment on global warming. Here, the Berkeley team is going its own way again. The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperature records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller’s team will take temperature records from individual stations and weight them according to how reliable they are.
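
The difference between the two estimators can be sketched in a few lines; the anomalies, weights and two-cell "grid" below are invented, and both real methods are far more elaborate:

```python
def grid_average(stations):
    """Big-three style: average stations within each grid cell, then average the cells."""
    cells = {}
    for s in stations:
        cells.setdefault(s["cell"], []).append(s["anomaly"])
    cell_means = [sum(v) / len(v) for v in cells.values()]
    return sum(cell_means) / len(cell_means)

def weighted_average(stations):
    """Berkeley-style idea: weight each station by an estimated reliability."""
    total_w = sum(s["weight"] for s in stations)
    return sum(s["anomaly"] * s["weight"] for s in stations) / total_w

stations = [
    {"cell": "England", "anomaly": 0.8, "weight": 0.9},
    {"cell": "England", "anomaly": 1.4, "weight": 0.2},  # noisy station, downweighted
    {"cell": "Wales",   "anomaly": 0.6, "weight": 0.8},
]
print(grid_average(stations), weighted_average(stations))  # 0.85 vs ~0.78
```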

This is where the Berkeley group faces its toughest task by far and it will be judged on how well it deals with it. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.

Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in celsius instead of fahrenheit. The times at which measurements are taken vary, from say 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station more from wind and sun from one year to the next. Each of these interferes with a station’s temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.
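
Two of those biases are easy to picture in code. Below is a hedged sketch with an invented four-value series and known change points; real homogenization algorithms have to detect the change points statistically and are far more careful:

```python
def correct_units(series, swap_index):
    """Readings before swap_index were recorded in Fahrenheit by mistake."""
    return [(t - 32) * 5.0 / 9.0 if i < swap_index else t
            for i, t in enumerate(series)]

def correct_step(series, move_index):
    """Remove the mean offset introduced when the station was relocated."""
    before, after = series[:move_index], series[move_index:]
    offset = sum(after) / len(after) - sum(before) / len(before)
    return before + [t - offset for t in after]

raw = [50.0, 51.8, 10.5, 10.9]      # first two values accidentally in Fahrenheit
in_celsius = correct_units(raw, 2)  # -> [10.0, 11.0, 10.5, 10.9]
print(correct_step(in_celsius, 2))  # -> [10.0, 11.0, 10.3, 10.7]
```

Note the catch: subtracting the mean offset also erases any genuine warming that happened across the move, which is exactly why this correction step attracts so much scrutiny.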

This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn’t rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.

Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. “I’ve told the team I don’t know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else,” says Muller. “Science has its weaknesses and it doesn’t have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have.”

He will find out soon enough if his hopes to forge a true consensus on climate change are misplaced. It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn’t followed the project either, but said “anything that [Muller] does will be well done”. Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn’t comment.

Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley’s global warming assessment and those from the other groups. “We have enough trouble communicating with the public already,” Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.

Peter Thorne, who left the Met Office’s Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller’s claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. “Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn’t give you much more bang for your buck,” he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.

Despite his reservations, Thorne says climate science stands to benefit from Muller’s project. “We need groups like Berkeley stepping up to the plate and taking this challenge on, because it’s the only way we’re going to move forwards. I wish there were 10 other groups doing this,” he says.

For the time being, Muller’s project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them “without advocacy or agenda”. Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy’s Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a “kingpin of climate science denial”. On this point, Muller says the project has taken money from right and left alike.

No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. “As new kids on the block, I think they will be given a favourable view by people, but I don’t think it will fundamentally change people’s minds,” says Thorne. Brillinger has reservations too. “There are people you are never going to change. They have their beliefs and they’re not going to back away from them.”

Walking across the Berkeley campus, Muller stops outside Sproul Hall, where he was arrested more than 40 years ago. Today, the adjoining plaza is a designated protest spot, where student activists gather to wave banners, set up tables and make speeches on any cause they choose. Does Muller think his latest project will make any difference? “Maybe we’ll find out that what the other groups do is absolutely right, but we’re doing this in a new way. If the only thing we do is allow a consensus to be reached as to what is going on with global warming, a true consensus, not one based on politics, then it will be an enormously valuable achievement.”

>Can Geoengineering Save the World from Global Warming? (Scientific American)

>
Ask the Experts | Energy & Sustainability
Scientific American

Is manipulating Earth’s environment to combat climate change a good idea – and where, exactly, did the idea come from?

By David Biello | February 25, 2011

STARFISH PRIME: This nighttime atmospheric nuclear weapons test generated an aurora (pictured) in Earth’s magnetic field, along with an electromagnetic pulse that blew out streetlights in Honolulu. It is seen as an early instance of geoengineering by science historian James Fleming. Image: Courtesy of US Govt. Defense Threat Reduction Agency

As efforts to combat climate change falter despite ever-rising concentrations of heat-trapping gases in the atmosphere, some scientists and other experts have begun to consider the possibility of using so-called geoengineering to fix the problem. Such “deliberate, large-scale manipulation of the planetary environment”, as the Royal Society of London puts it, is fraught with peril, of course.

For example, one of the first scientists to predict global warming as a result of increasing concentrations of greenhouse gases in the atmosphere—Swedish chemist Svante Arrhenius—thought this might be a good way to ameliorate the winters of his native land and increase its growing season. Whereas that may come true for the human inhabitants of Scandinavia, polar plants and animals are suffering as sea ice dwindles and temperatures warm even faster than climatologists predicted.

Scientific American corresponded with science historian James Fleming of Colby College in Maine, author of Fixing the Sky: The Checkered History of Weather and Climate Control, about the history of geoengineering—ranging from filling the air with the artificial aftermath of a volcanic eruption to seeding the oceans with iron in order to promote plankton growth—and whether it might save humanity from the ill effects of climate change.

[An edited transcript of the interview follows.]

What is geoengineering in your view?
Geoengineering is planetary-scale intervention [in]—or tinkering with—planetary processes. Period.

As I write in my book, Fixing the Sky: The Checkered History of Weather and Climate Control, “the term ‘geoengineering’ remains largely undefined,” but is loosely, “the intentional large-scale manipulation of the global environment; planetary tinkering; a subset of terraforming or planetary engineering.”

As of June 2010 the term has a draft entry in the Oxford English Dictionary—the modification of the global environment or the climate in order to counter or ameliorate climate change. A 2009 report issued by the Royal Society of London defines geoengineering as “the deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change.”

But there are significant problems with both definitions. First of all, an engineering practice defined by its scale (geo) need not be constrained by its stated purpose (environmental improvement), by any of its currently proposed techniques (stratospheric aerosols, space mirrors, etcetera) or by one of perhaps many stated goals (to ameliorate or counteract climate change). Nuclear engineers, for example, are capable of building both power plants and bombs; mechanical engineers can design components for both ambulances and tanks. So to constrain the essence of something by its stated purpose, techniques or goals is misleading at best.

Between 1958 and 1962, both the U.S. and the Soviet Union conducted geo-scale engineering projects that had nothing to do with countering or ameliorating climate change. Starting with the [U.S.’s] 1958 Argus A-bomb explosions in space and ending with the 1962 Starfish Prime H-bomb test, the militaries of both nations sought to modify the global environment for military purposes.

Project Argus was a top-secret military test aimed at detonating atomic bombs in space to generate an artificial radiation belt, disrupt the near-space environment, and possibly intercept enemy missiles. It, and the later tests conducted by both the U.S. and the Soviet Union, peaked with H-bomb detonations in space in 1962 that created an artificial [electro]magnetic [radiation] belt that persisted for 10 years. This is geoengineering.

This idea of detonating bombs in near-space was proposed in 1957 by Nicholas Christofilos, a physicist at Lawrence Berkeley National Laboratory. His hypothesis, which was pursued by the [U.S.] Department of Defense’s Advanced Research Projects Agency [subsequently known as DARPA] and tested in Project Argus and other nuclear shots, held that the debris from a nuclear explosion, mainly highly energetic electrons, would be contained within lines of force in Earth’s magnetic field and would travel almost instantly as a giant current spanning up to half a hemisphere. Thus, if a detonation occurred above a point in the South Atlantic, immense currents would flow along the magnetic lines to a point far to the north, such as Greenland, where they would severely disrupt radio communications. A shot in the Indian Ocean might, then, generate a huge electromagnetic pulse over Moscow. In addition to providing a planetary “energy ray,” Christofilos thought nuclear shots in space might also disrupt military communications, destroy satellites and the electronic guidance systems of enemy [intercontinental ballistic missiles], and possibly kill any military cosmonauts participating in an attack launched from space. He proposed thousands of them to make a space shield.

So nuclear explosions in space by the U.S. and the Soviet Union constituted some of the earliest attempts at geoengineering, or intentional human intervention in planetary-scale processes.

The neologism “geoengineer” refers to one who contrives, designs or invents at the largest planetary scale possible for either military or civilian purposes. Today, geoengineering, as an unpracticed art, may be considered “geoscientific speculation”. Geoengineering is a subset of terraformation, which also does not exist outside of the fantasies of some engineers.

I have recently written to the Oxford English Dictionary asking them to correct their draft definition.

Can geoengineering save the world from climate change?
In short, I think it may be infinitely more dangerous than climate change, largely due to the suspicion and social disruption it would trigger by changing humanity’s relationship to nature.

To take just one example from my book, on page 194: “Sarnoff Predicts Weather Control” read the headline on the front page of The New York Times on October 1, 1946. The previous evening, at his testimonial dinner at the Waldorf Astoria, RCA president Brig. Gen. David Sarnoff had speculated on worthy peaceful projects for the postwar era. Among them were “transformations of deserts into gardens through diversion of ocean currents,” a technique that could also be reversed in time of war to turn fertile lands into deserts, and ordering “rain or sunshine by pressing radio buttons,” an accomplishment that, Sarnoff declared, would require a “World Weather Bureau” in charge of global forecasting and control (much like the “Weather Distributing Administration” proposed in 1938). A commentator in The New Yorker intuited the problems with such control: “Who” in this civil service outfit, he asked, “would decide whether a day was to be sunny, rainy, overcast…or enriched by a stimulating blizzard?” It would be “some befuddled functionary,” probably bedeviled by special interests such as the raincoat and galoshes manufacturers, the beachwear and sunburn lotion industries, and resort owners and farmers. Or if a storm was to be diverted—”Detour it where? Out to sea, to hit some ship with no influence in Washington?”

How old is the idea of geoengineering? What other names has it had?
I can trace geoengineering’s direct modern legacy to 1945, and have prepared a table of such proposals and efforts for the [Government Accountability Office]. Nuclear weapons, digital computers and satellites seem to be the modern technologies of choice. Geoengineering has also been called terraformation and, more restrictively, climate engineering, climate intervention or climate modification. Many have proposed abandoning the term geoengineering in favor of solar radiation management and carbon (or carbon dioxide) capture and storage. Of course, the idea of control of nature is ancient—for example, Phaeton or Archimedes.

Phaeton, the son of Helios, received permission from his father [the Greek sun god] to drive the sun chariot, but failed to control it, putting the Earth in danger of burning up. He was killed by a thunderbolt from Zeus to prevent further disaster. Recently, a prominent meteorologist has written about climate control and urged us to “take up Phaeton’s reins,” which is not a good idea.

Archimedes is known as an engineer who said: “Give me a lever long enough and a place to stand, and I will move the Earth.” Some geoengineers think that this is now possible and that science and technology have given us an Archimedean set of levers with which to move the planet. But I ask: “Where will it roll if you tip it?”

How are weather control and climate control related?
Weather and climate are intimately related: Weather is the state of the atmosphere at a given place and time, while climate is the aggregate of weather conditions over time. A vast body of scientific literature addresses these interactions. In addition, historians are revisiting the ancient but elusive term klima, seeking to recover its multiple social connotations. Weather, climate and the climate of opinion matter in complex ways that invite—some might say require or demand—the attention of both scientists and historians. Yet some may wonder how weather and climate are interrelated rather than distinct. Both, for example, are at the center of the debate over greenhouse warming and hurricane intensity. A few may claim that rainmaking, for example, has nothing to do with climate engineering, but any intervention in the Earth’s radiation or heat budget (such as managing solar radiation) would affect the general circulation and thus the location of upper-level patterns, including the jet stream and storm tracks. Thus, the weather itself would be changed by such manipulation. Conversely, intervening in severe storms by changing their intensity or their tracks or modifying weather on a scale as large as a region, a continent or the Pacific Basin would obviously affect cloudiness, temperature and precipitation patterns with major consequences for monsoonal flows, and ultimately the general circulation. If repeated systematically, such interventions would influence the overall heat budget and the climate.

Both weather and climate control have long and checkered histories: My book explains [meteorologist] James Espy’s proposal in the 1830s to set fire to the crest of the Appalachian Mountains every Sunday evening to generate heated updrafts that would stimulate rain and clear the air for cities of the east coast. It also examines efforts to fire cannons at the clouds in the arid Southwest in the hope of generating rain by concussion.

In the 1920s airplanes loaded with electrified sand were piloted by military aviators who “attacked” the clouds in futile attempts to both make rain and clear fog. Many others have proposed either a world weather control agency or creating a global thermostat, either by burning vast quantities of fossil fuels if an ice age threatened or sucking the CO2 out of the air if the world overheated.

After 1945 three technologies—nuclear weapons, digital computers and satellites—dominated discussions about ultimate weather and climate control, but with very little acknowledgement that unintended consequences and social disruption may be more damaging than any presumed benefit.

What would be the ideal role for geoengineering in addressing climate change?
That it generates interest in and awareness of the impossibility of heavy-handed intervention in the climate system, since there could be no predictable outcome of such intervention, physically, politically or socially.

Why do scientists continue to pursue this then, after 200 or so years of failure?
Science fantasy is informed by science fiction and driven by hubris. One of the dictionary definitions of hubris cites Edward Teller (the godfather of modern geoengineering).

Teller’s hubris knew no bounds. He was the [self-proclaimed] father of the H-bomb and promoted all things atomic, even talking about using nuclear weapons to create canals and harbors. He was also an advocate of urban sprawl to survive nuclear attack, the Star Wars [missile] defense system, and a planetary sunscreen to reduce global warming. He wanted to control nature and improve it using technology.

Throughout history rainmakers and climate engineers have typically fallen into two categories: commercial charlatans using technical language and proprietary techniques to cash in on a gullible public, and sincere but deluded scientific practitioners exhibiting a modicum of chemical and physical knowledge, a bare minimum of atmospheric insight, and an abundance of hubris. We should base our decision-making not on what we think we can do “now” and in the near future. Rather, our knowledge is shaped by what we have and have not done in the past. Such are the grounds for making informed decisions and avoiding the pitfalls of rushing forward, claiming we know how to “fix the sky.”

>What we have and haven’t learned from ‘Climategate’

>
DON’T KNOW MUCH AGNOTOLOGY

Grist.org
BY David Roberts
28 FEB 2011 1:29 PM

I wrote about the “Climategate” controversy (over emails stolen from the University of East Anglia’s Climatic Research Unit) once, which is about what it warranted.

My silent protest had no effect whatsoever, of course, and the story followed a depressingly familiar trajectory: hyped relentlessly by right-wing media, bullied into the mainstream press as he-said she-said, and later, long after the damage was done, revealed as utterly bereft of substance. It’s a familiar script for climate faux controversies, though this one played out on a slightly grander scale.

Investigations galore

Consider that there have now been five, count ‘em five, inquiries into the matter. Penn State established an independent inquiry into the accusations against scientist Michael Mann and found “no credible evidence” [PDF] of improper research conduct. A British government investigation run by the House of Commons’ Science and Technology Committee found that while the CRU scientists could have been more transparent and responsive to freedom-of-information requests, there was no evidence of scientific misconduct. The U.K.’s Royal Society (its equivalent of the National Academies) ran an investigation that found “no evidence of any deliberate scientific malpractice.” The University of East Anglia appointed respected civil servant Sir Muir Russell to run an exhaustive, six-month independent inquiry; he concluded that “the honesty and rigour of CRU as scientists are not in doubt … We have not found any evidence of behaviour that might undermine the conclusions of the IPCC assessments.”

All those results are suggestive, but let’s face it, they’re mostly … British. Sen. James Inhofe (R-Okla.) wanted an American investigation of all the American scientists involved in these purported dirty deeds. So he asked the Department of Commerce’s inspector general to get to the bottom of it. On Feb. 18, the results of that investigation were released. “In our review of the CRU emails,” the IG’s office said in its letter to Inhofe [PDF], “we did not find any evidence that NOAA inappropriately manipulated data … or failed to adhere to appropriate peer review procedures.” (Oddly, you’ll find no mention of this central result in Inhofe’s tortured public response.)

Whatever legitimate issues there may be about the responsiveness or transparency of this particular group of scientists, there was nothing in this controversy — nothing — that cast even the slightest doubt on the basic findings of climate science. Yet it became a kind of stain on the public image of climate scientists. How did that happen?

Smooth criminals

You don’t hear about it much in the news coverage, but recall, the story began with a crime. Hackers broke into the East Anglia email system and stole emails and documents, an illegal invasion of privacy. Yet according to The Wall Street Journal’s Kim Strassel, the emails “found their way to the internet.” In ABC science correspondent Ned Potter’s telling, the emails “became public.” The New York Times’ Andy Revkin says they were “extracted from computers.”

None of those phrasings are wrong, per se, but all pass rather lightly over the fact that some actual person or persons put them on the internet, made them public, extracted them from the computers. Someone hacked in, collected emails, sifted through and selected those that could be most damning, organized them, and timed the release for maximum impact, just before the Copenhagen climate talks. Said person or persons remain uncaught, uncharged, and unprosecuted. There have since been attempted break-ins at other climate research institutions.

If step one was crime, step two was character assassination. When the emails were released, they were combed over by skeptic blogs and right-wing media, who collected sentences, phrases, even individual terms that, when stripped of all context, create the worst possible impression. Altogether the whole thing was as carefully staged as any modern-day political attack ad.

Yet when the “scandal” broke, rather than being about criminal theft and character assassination, it was instantly “Climategate.” It was instantly about climate scientists, not the illegal and dishonest tactics of their attackers. The scientists, not the ideologues and ratf*ckers, had to defend themselves.

Burden of proof

It’s a numbingly familiar pattern in media coverage. The conservative movement that’s been attacking climate science for 20 years has a storied history of demonstrable fabrications, distortions, personal attacks, and nothingburger faux-scandals — not only on climate science, but going back to asbestos, ozone, leaded gasoline, tobacco, you name it. They don’t follow the rigorous standards of professional science; they follow no intellectual or ethical standards whatsoever. Yet no matter how long their record of viciousness and farce, every time the skeptic blogosphere coughs up a new “ZOMG!” it’s as though we start from zero again, like no one has a memory longer than five minutes.

Here’s the basic question: At this point, given their respective accomplishments and standards, wouldn’t it make sense to give scientists the strong benefit of the doubt when they are attacked by ideologues with a history of dishonesty and error? Shouldn’t the threshold for what counts as a “scandal” have been nudged a bit higher?

Agnotological inquiry

The lesson we’ve learned from climategate is simple. It’s the same lesson taught by death panels, socialist government takeover, Sharia law, and Obama’s birth certificate. To understand it we must turn to agnotology, the study of culturally induced ignorance or doubt. (Hat tip to an excellent recent post on this by John Quiggin.)

Beck, Palin, and the rest of Fox News and talk radio operate on the pretense that they are giving consumers access to a hidden “universe of reality,” to use Limbaugh’s term. It’s a reality being actively obscured by the “lamestream media,” academics, scientists, and government officials. Affirming the tenets of that secret reality has become an act of tribal reinforcement, the equivalent of a secret handshake.

The modern right has created a closed epistemic loop containing millions of people. Within that loop, the implausibility or extremity of a claim itself counts as evidence. The more liberal elites reject it, the more it entrenches itself. Standards of evidence have nothing to do with it.

The notion that there is a global conspiracy by professional scientists to falsify results in order to get more research money is, to borrow Quiggin’s words about birtherism, “a shibboleth, that is, an affirmation that marks the speaker as a member of their community or tribe.” Once you have accepted that shibboleth, anything offered to you as evidence of its truth, no matter how ludicrous, will serve as affirmation. (Even a few context-free lines cherry-picked from thousands of private emails.)

Living with the loop

There’s one thing we haven’t learned from climategate (or death panels or birtherism). U.S. politics now contains a large, well-funded, tightly networked, and highly amplified tribe that defines itself through rejection of “lamestream” truth claims and standards of evidence. How should our political culture relate to that tribe?

We haven’t figured it out. Politicians and the political press have tried to accommodate the shibboleths of the right as legitimate positions for debate. The press in particular has practically sworn off plain judgments of accuracy or fact. But all that’s done is confuse and mislead the broader public, while the tribe pushes ever further into extremity. The tribe does not want to be accommodated. It is fueled by elite rejection.

At this point mainstream institutions like the press are in a bind: either accept the tribe’s assertions as legitimate or be deemed “biased.” Until there is a way out of that trap, there will be more and more Climategates.

>ICTs and the Climate Change ‘Unknowns’: Tackling Uncertainty

>
January 4, 2011
By Angelica Valeria Ospina
From http://niccd.wordpress.com/

Determining the repercussions of the changing climate is a field of great unknowns. While the impacts of climatic variations and seasonal changes on the most vulnerable populations are expected to increase and be manifest in more vulnerable ecosystems and natural habitats, the exact magnitude and impact of climate change effects remain, for the most part, open questions.

Such uncertainty is a key contributor to climate change vulnerability, particularly among developing country populations that lack the resources, including access to information and knowledge, to properly prepare for and cope with its impacts.

But, how can vulnerable contexts prepare for the ‘unknowns’ posed by climate change? And should the quest for ‘certainty’ be the focus of our attention?

The rapid diffusion of Information and Communication Technologies (ICTs) within developing country environments, the hardest hit by climate change-related manifestations, is starting to shed new light on these issues.

A recent article by Reuters identified 10 climate change adaptation technologies that will become crucial for coping with and adapting to the effects of the changing climate over the next century.

The bullet points below link these technologies with the potential of ICTs in the climate change field, highlighting some of the ways in which they can help vulnerable populations better prepare for and cope with the effects of climatic uncertainty.

  • Innovations around Infectious Diseases: Extreme weather events and changing climatic patterns associated with climate change have been linked to the spread of vector-borne (e.g. malaria and dengue) and water-borne diseases. Within this context, ICTs such as mobile phones, community radio and the Internet have the potential to enable information sharing, awareness raising and capacity building on key health threats, enabling effective prevention and response.
  • Flood Safeguards: Climatic changes such as increased and erratic patterns of precipitation negatively affect the capacity of flood and drainage systems, the built environment, energy and transportation, among others. ICT applications such as Geographic Information Systems (GIS) can facilitate the monitoring and provision of environmental information to relevant stakeholders, supporting decision-making processes for the adaptation of human habitats.
  • Weather Forecasting Technologies: ICTs play a key role in the implementation of innovative weather forecasting technologies, including the integration of community monitoring. The use of mobile phones and SMS for reporting on locally-relevant indicators (e.g. likelihood of floods) can contribute to greater accuracy and more precise flood warnings to communities (a toy sketch of such community reporting follows this list). Based on this information, authorities could design and put in action more appropriate strategies, and farmers could better prepare for evacuations, protect their livestock and better plan local irrigation systems, among others.
  • Insurance Tools: Access to new and more diversified sources of information and knowledge through tools such as the Internet or the mobile phone can facilitate the access to insurance mechanisms, and to information about national programs/assistance available to support vulnerable populations.
  • More Resilient Crops: In the face of higher temperatures, more variable crop seasons and decreasing productivity, ICTs have the potential to enhance food security by strengthening agricultural production systems through information about pest and disease control, planting dates, seed varieties, irrigation applications, and early warning systems, as well as improving market access, among others.
  • Supercomputing: According to the International Telecommunication Union (ITU), the use of ICT-equipped sensors (telemetry), aerial photography, satellite imagery, grid technology, global positioning by satellite (GPS) (e.g. for tracking the slow, long-term movement of glaciers) and computer modeling of the earth’s atmosphere, among others, plays a key role in climate change monitoring. New technologies continue to be developed, holding great potential for the real-time, more accurate information that is key to strengthening decision-making processes.
  • Water Purification, Water Recycling and Efficient Irrigation Systems: ICTs can contribute to the improvement of water resource management techniques, monitoring of water resources, capacity building and awareness raising. Broadly diffused applications such as mobile phones can serve as tools to disseminate information on low-cost methods for desalination, using gray water and harvesting rainwater for everyday uses, as well as for capacity building on new irrigation mechanisms, among others.
  • Sensors: In addition to the role that sensors play in monitoring climate change by helping to capture more accurate data, research indicates that they also constitute promising technologies for improving energy efficiency. Sensors can be used in several environmental applications, such as control of temperature, heating and lighting.
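
As promised in the “Weather Forecasting Technologies” bullet above, here is a toy sketch of community flood reporting by SMS; the message format, the 80 cm threshold and the place names are all invented for illustration:

```python
# Toy aggregator for SMS flood reports of the (invented) form
# "FLOOD <village> <level_cm>"; not any real deployment's protocol.
def parse_report(sms):
    tag, village, level = sms.strip().split()
    assert tag == "FLOOD"
    return village, int(level)

def warnings(messages, threshold_cm=80, min_reports=2):
    """Warn for villages with enough independent reports above the threshold."""
    counts = {}
    for sms in messages:
        village, level = parse_report(sms)
        if level >= threshold_cm:
            counts[village] = counts.get(village, 0) + 1
    return [v for v, n in counts.items() if n >= min_reports]

inbox = ["FLOOD riverside 95", "FLOOD riverside 102", "FLOOD hilltop 20"]
print(warnings(inbox))  # ['riverside']
```

Requiring more than one report before warning is a crude guard against a single mistaken or malicious message; a real system would weight reporters and cross-check against gauge data.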

This short survey of areas of potential does not suggest that ICTs can eliminate climatic uncertainty, but it does suggest their potential to help vulnerable populations strengthen their capacity to withstand and recover from shocks and changing climatic trends.

By contributing to building resilience and strengthening adaptive capacity, ICTs have the potential to tackle climate change uncertainty not only by providing access to information and knowledge, but also by fostering networking, personal empowerment and participation, facilitating self-organisation, access to diverse resources and learning, among others, which ultimately contribute to better preparedness and response, including the possibility of transformation in the face of the unknown.

The need to reduce uncertainty should not displace efforts to foster creativity and flexibility, which lie at the core of resilient responses to the ongoing challenges posed by climate change.

—————————————————–

*Further examples on the linkages between ICTs, climate change and vulnerability dimensions can be found at: http://www.niccd.org/ScopingStudy.pdf

>Robots doing science (FAPESP)

>
Specials

25/2/2011
By Mônica Pileggi

Automated systems developed by researchers in the United Kingdom could be key to creating more effective drugs at lower cost (publicity image).

Agência FAPESP – Building machines capable of making new discoveries is something that is leaving the realm of science fiction. One of today’s leading examples is in the United Kingdom, where the team of Professor Ross King, of the Department of Computer Science at the University of Wales, has been working for more than a decade on the development of Adam and Eve.

The goal of the automated duo is to shorten laboratory assay times in the development of new drugs. In addition, Eve, the second-generation model, makes it possible to find drugs whose chemical compounds are more effective in treating a disease, and to do so faster and more cheaply.

This feat is possible thanks to the robot’s ability to select, from the thousands of compounds stored in its library, those that will be most effective in the assays against a given disease. And Eve can test more than one at a time. “Afterwards, the human researcher analyzes the results obtained,” King told Agência FAPESP.

“But even with all these capabilities, it is important to stress that Eve does not yet have artificial intelligence,” added the professor, who took part this Thursday (24/2) in the Workshop on Synthetic Biology and Robotics in São Paulo. The event, organized by FAPESP and the British Consulate in São Paulo, is part of the UK-Brazil Partnership in Science and Innovation.

“Today the robot tests the chemical compounds available in its library, but it does not identify patterns. Starting next week, we will be working to make it understand the work it performs,” he revealed.

In this final phase of development, the goal is to make Eve capable of identifying new patterns – combinations of molecules – that could help in the development of more effective drugs, and then of testing them.

Large-scale testing

Though incomplete, the robot scientist has already shown what it can do. By running large-scale experiments, Eve dramatically narrowed the range of drugs that agronomist Elizabeth Bilsland, of the University of Cambridge, would need to test in her research on the parasites Schistosoma, Plasmodium vivax and P. falciparum, and Trypanosoma cruzi and T. brucei, as well as Leishmania.

“Each parasite develops under different conditions. And to create new drugs, new methods must be tested. Eve screened more than 15,000 chemical compounds from its library to find those capable of inhibiting the parasites’ enzymes without damaging human genes,” said Bilsland.
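
The screening criterion Bilsland describes (inhibit the parasite’s enzymes, spare the human counterparts) can be illustrated with a purely hypothetical sketch; the compound records, field names and cut-offs are invented, and this is not Eve’s actual software:

```python
def candidate_drugs(compounds, parasite, min_inhibition=0.5, max_human_hit=0.1):
    """Keep compounds that inhibit the parasite target but barely touch the human one."""
    hits = []
    for c in compounds:
        if (c["inhibition"].get(parasite, 0.0) >= min_inhibition
                and c["inhibition"].get("human", 0.0) <= max_human_hit):
            hits.append(c["name"])
    return hits

library = [
    {"name": "cmpd-0001", "inhibition": {"T. cruzi": 0.8, "human": 0.05}},
    {"name": "cmpd-0002", "inhibition": {"T. cruzi": 0.9, "human": 0.60}},  # too toxic
]
print(candidate_drugs(library, "T. cruzi"))  # ['cmpd-0001']
```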

According to the researcher, based on the assays for the diseases caused by the parasites listed, the robot wove a network of hypotheses until it arrived at a drug with the potential to treat all of them at once, except leishmaniasis. “It is what we might call a miracle drug,” she noted.

But the drug is still a long way from the market, since the hypothesis generated by the robot must first be validated. That phase of the work will involve collaboration with scientists at Unicamp and Unesp.

Given how long a new drug takes to reach the market, Bilsland highlighted the research she has been doing with medicines that are already available and approved by the United States Food and Drug Administration.

“Some of them are approved and indicated for certain diseases but also have potential for treating others. We tested these drugs in the system we created and found about five that also attack Trypanosoma enzymes, and others that hit the enzymes of Plasmodium vivax,” she explained.

The purpose of this study is to repurpose existing drugs, already approved for human use, that are also effective against other diseases.

“During a visit to a hospital in Campinas, I saw a case in which a medicine prescribed for heart problems was used to treat Chagas disease, with good results,” said Bilsland.

>Climate through Brazilian eyes (FAPESP)

>

Specials

1/2/2011

By Elton Alisson


Agência FAPESP – In the global climate models presented in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Meanwhile, phenomena such as biomass burning, which can intensify the greenhouse effect and change the characteristics of rainfall and clouds in a given region, are simply not represented, because they were not considered relevant by the countries that built the numerical models in use.

To have a model capable of generating climate change scenarios from a Brazilian perspective, researchers from several institutions, members of the FAPESP Research Program on Global Climate Change, the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), are developing the Brazilian Global Climate System Model (MBSCG).

Scheduled for completion in 2013, the MBSCG should allow Brazilian climatologists to study climate change with a model that represents processes important to Brazil but treated as secondary in foreign climate models.

“Most of these international models do not meet our needs. We have many climate problems associated with anthropogenic actions, such as fires and deforestation, that are not represented and that will now be included in the model we are developing in Brazil,” said Gilvan Sampaio de Oliveira, a researcher at the Earth System Science Center (CCST) of the National Institute for Space Research (Inpe) and one of the coordinators of the MBSCG’s construction.

According to him, the Brazilian model will incorporate hydrological, biological and physico-chemical processes and interactions relevant to the regional and global climate system. It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of the regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

“With this model we will have the capacity and autonomy to generate reliable future scenarios, so that the country can prepare to face extreme climate events,” Sampaio told Agência FAPESP.

The first version of the Brazilian model, with indications of what may happen to Brazil’s climate over the next 50 years, should be ready by the end of 2011.

To that end, the researchers are installing, and in February will begin to run, a preliminary version of the model on the Tupã supercomputer at the Center for Weather Forecasting and Climate Studies (CPTEC) in Cachoeira Paulista, São Paulo State, with computational modules that analyze the climate phenomena occurring in the atmosphere, in the ocean and on the land surface.

These computational modules will gradually be integrated with other components of the model, which will assess the impacts on climate of vegetation, the terrestrial carbon cycle, sea ice and atmospheric chemistry. Another component, in turn, will indicate the influence of climate change on crops such as sugarcane, soybeans, corn and coffee.

“In the future we may try to estimate the productivity of sugarcane and soybeans, for example, in the face of rising concentrations of greenhouse gases in the atmosphere,” said Sampaio.


IPCC class

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the next IPCC report, AR5, due out in 2014. But the model the Intergovernmental Panel will use for the AR5 simulations, HadGEM2, will have Brazilian participation.

Through a cooperation between the Hadley Centre, in the United Kingdom, and Inpe, Brazilian researchers have introduced into the international model computational modules that assess the impact on the global climate of smoke plumes from biomass burning and of forest fires, which until then had not been taken into account in climate projections.

With that, the model came to be called HadGEM2-ES/Inpe. “We will run simulations taking into account these components we introduced into the model,” Sampaio said.

In 2013, when the final version of the Brazilian Global Climate System Model is completed, the system will gain a land-use module and a high-spatial-resolution meteorological module. In the same year, the first simulations with high-resolution regional models will also be run, to build a climate model for South America with a resolution of 1 to 10 km.

“Until now it took us months or even years to generate regional scenarios. With the new supercomputing system, regional climate modeling efforts will move to another scale,” Sampaio stated.

Read the article published by the magazine Pesquisa FAPESP about the Brazilian climate model.

>The essence of physical reality (JC, FSP)

>
JC e-mail 4184, January 24, 2011

“We do not see what happens at the essence of physical reality. We have only our experiments, and they give us an incomplete picture of what happens”

Marcelo Gleiser is a professor of theoretical physics at Dartmouth College in Hanover (USA). Article published in “Folha de SP”:

We live in a quantum world. It may not be obvious, but beneath our experience of the real, continuous and orderly, there is another reality, one that obeys very different rules. The question, then, is how to connect the two; that is, how to begin with things that are not even “things”, in the sense that they have no spatial extension, like a chair or a car, and arrive at chairs and cars.

I often use the image of the “beach seen from a distance” to illustrate the transition from quantum reality to our everyday world: from afar, the beach looks continuous. Up close, we see its discontinuity, the granularity of the sand. The image works until we pick up a grain of sand. We do not see its quantum essence, because each grain is made of trillions of billions of atoms. At those numbers, a grain is an “ordinary”, or “classical”, object.

So we do not see what happens at the essence of physical reality. We have only our experiments, and they give us an incomplete picture of what happens.

Quantum mechanics (QM) revolves around the Uncertainty Principle (UP). In practice, the UP imposes a fundamental limit on how much we can know about the particles that make up the world. This does not mean that QM is imprecise; on the contrary, it is the most precise theory there is, explaining the results of experiments at the atomic level and underpinning the digital technology that defines modern society.
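
In symbols, the standard position-momentum form of the principle (not spelled out in the article) says that the product of the two uncertainties cannot fall below a fixed quantum bound:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where \Delta x and \Delta p are the standard deviations of position and momentum, and \hbar is the reduced Planck constant.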

The problem with QM is not what we know about it, but what we do not. And since many quantum phenomena defy our intuition, there is a certain tension among physicists over its interpretation. QM establishes a relationship between the observer and the observed that does not exist in everyday life. A table is a table whether or not we look at it. In the quantum world, we cannot say that an electron exists until a detector interacts with it and determines its energy or position.

Since we define reality by what exists, QM seems to imply that the detecting apparatus is responsible for defining reality. And since the apparatus is built by us, it is the human mind that determines reality.

Two consequences follow. First, the mind comes to occupy a central position in our conception of the real. Second, since what we measure comes to us as acquired information, information becomes the framework of what we call reality. Various scientists, some serious and some less so, see here a kind of teleology: if we exist in a cosmos capable of generating the human mind, perhaps the cosmos has the goal of creating such minds. In other words, the cosmos becomes a kind of god!

We must be very careful with this kind of reasoning. First, because for practically all of its existence (13.7 billion years), there was no mind in the cosmos. And even without minds, things went on perfectly well. Second, because life, especially intelligent life, is rare. Third, because information arises from the use of reason to decode the properties of matter.

To grant information an existence prior to matter makes no sense, in my view. There is no doubt that QM has its mysteries. But it is worth remembering that it is a construction of the human mind.
(Folha de SP, 23/1)

>South Korea: robot replaces teacher in the classroom

>
Terra – December 28, 2010 • 12:22 pm

Named “Engkey,” the South Korean robot costs about R$ 12,000. Photo: AFP.

About 30 robot teachers have been introduced into classrooms at 20 primary schools in South Korea. The machines, created by the country's Institute of Science and Technology, are intended to teach English to South Korean students who have no contact with the language.

The robots, called “Engkey,” are controlled live by English teachers based in the Philippines. They stand just over 1 m tall and have a screen that captures and displays the face of the teacher giving the class remotely. The Engkey units can also read the students' physical books and dance, moving their head and arms.

According to Sagong Seong-Dae, a scientist at the Institute, financial considerations weighed in the decision to replace humans with machines. “Well-trained and experienced Filipino teachers are cheaper labor than local ones,” he told the British website Daily Mail.

Kim Mi-Young, an official with the country's education department, also told the site that the experiment has been welcomed. “The kids seem to love the robots because they're cute. But some adults have also shown special interest, saying they feel less nervous conversing with machines than with real people,” she said.

Mi-Young was careful to stress, however, that the robots will not entirely replace human teachers, despite the government's investment of about US$ 1.5 million (around R$ 2.5 million). Each robot costs approximately R$ 12,000.

>Super weather forecasting? Ask Tupã (Agência FAPESP)

>
Specials

By Elton Alisson, from Cachoeira Paulista (SP)
29/12/2010

One of the world's largest supercomputers for weather and climate-change forecasting is inaugurated in Cachoeira Paulista. The equipment will make it possible to produce more reliable weather forecasts, further in advance and of better quality (photo: Eduardo Cesar/Ag.FAPESP)

Agência FAPESP – The National Institute for Space Research (Inpe) inaugurated the Tupã supercomputer on Tuesday (28/12) at the Center for Weather Forecasting and Climate Studies (CPTEC) in Cachoeira Paulista (SP).

Named after the thunder god of Tupi-Guarani mythology, the computing system is the world's third largest for operational weather and seasonal climate forecasting and the eighth largest for climate-change projection.

And not only that. According to the most recent Top 500 supercomputing list, which ranks the world's fastest systems and was released in November, Tupã sits in 29th place. That is the highest position ever achieved by a machine installed in Brazil.

At a cost of R$ 50 million, of which R$ 15 million was funded by FAPESP and R$ 35 million by the Ministry of Science and Technology (MCT) through the Financiadora de Estudos e Projetos (Finep), the system was built by Cray in Wisconsin, in the United States.

Tupã can perform 205 trillion calculations per second and process in 1 minute a data set that would take a conventional computer more than a week.
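
A back-of-the-envelope check of that comparison, assuming a conventional desktop sustains on the order of 2 × 10^{10} operations per second (an illustrative figure, not one from the article):

    (205 \times 10^{12}\ \text{ops/s}) \times 60\ \text{s} \approx 1.2 \times 10^{16}\ \text{ops}, \qquad \frac{1.2 \times 10^{16}\ \text{ops}}{2 \times 10^{10}\ \text{ops/s}} \approx 6 \times 10^{5}\ \text{s} \approx 7\ \text{days},

which is indeed roughly the “more than a week” quoted.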

With a service life of six years, the machine will allow Inpe to generate more reliable weather forecasts, further in advance and of better quality, refining the level of detail to 5 kilometers over South America and 20 kilometers for the entire globe.
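
The jump in resolution is what absorbs most of that computing power. By the usual scaling argument (a simplification, not a figure from Inpe), an atmospheric model's cost grows with the two horizontal dimensions plus a time step that the CFL stability condition shrinks in proportion to the grid spacing \Delta x:

    \text{cost} \;\propto\; \left(\frac{1}{\Delta x}\right)^{2} \times \frac{1}{\Delta t} \;\propto\; \left(\frac{1}{\Delta x}\right)^{3}, \qquad \left(\frac{20\ \text{km}}{5\ \text{km}}\right)^{3} = 64,

so refining a grid from 20 km to 5 km implies roughly 64 times more computation for the same forecast period.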

The machine will also make it possible to improve environmental and air-quality forecasts, generating higher-resolution predictions (15 kilometers) up to six days ahead, and to predict at least two days in advance extreme weather events such as the intense rains that struck the cities of Angra dos Reis (RJ) and São Luiz do Paraitinga (SP) at the beginning of 2010.

“With the new computer, we will be able to run more sophisticated meteorological models, which will improve the level of detail of climate forecasts in the country,” Marcelo Enrique Seluchi, head of supercomputing at Inpe and deputy coordinator of CPTEC, told Agência FAPESP.

According to the researcher, in early January 2011 the first meteorological models for weather and climate-change forecasting will begin running on the supercomputer on a trial basis. And by the end of 2011 it will be possible to have the first results on the impacts of climate change in Brazil, based on data not taken into account in the international models.

Brazilian climate model

According to Gilberto Câmara, Inpe's director, the supercomputer was the first piece of equipment purchased by the research institution without the need for foreign financing.

“All three of Inpe's other supercomputers relied on foreign financing, which ends up costing Brazil more. The funding from FAPESP and the MCT allowed us to make this investment without having to resort to foreign resources,” he said.

Besides Inpe, the supercomputer will be used by other research groups, institutions and universities that belong to the FAPESP Research Program on Global Climate Change, the Brazilian Research Network on Climate Change (Rede Clima) and the National Institute of Science and Technology (INCT) for Climate Change.

In his speech at the inauguration, Carlos Henrique de Brito Cruz, FAPESP's scientific director, highlighted the importance of the supercomputer for advancing the research carried out under the FAPESP Research Program on Global Climate Change, which was conceived to last at least ten years, and for the creation of the Brazilian Global Climate System Model (MBSCG).

The model will incorporate the elements of the Earth system (atmosphere, oceans, cryosphere, vegetation and biogeochemical cycles, among others), their interactions, and how the system is being disturbed by anthropogenic actions such as greenhouse-gas emissions, changes in vegetation and urbanization.

Building the new model involves a large number of researchers from Brazil and abroad, from a range of institutions. It constitutes an interdisciplinary climate-modeling project without precedent in developing countries.

“Brazil did not have the capacity to create a global climate model from a Brazilian point of view. Today, FAPESP is funding a major research program to develop a Brazilian climate model,” said Brito Cruz.

In his assessment, the supercomputer will represent an advance in Brazilian research on weather forecasting and global climate change, two strategic issues for the country.

Unable to attend the event, the Minister of Science and Technology, Sergio Rezende, recorded a video, shown at the inauguration ceremony, in which he expressed his pride at the installation in Brazil of the largest supercomputer in the Southern Hemisphere.

“With this supercomputer, Brazil takes another step toward meeting the climate-monitoring targets it has assumed internationally, and joins the select group of countries capable of generating future climate scenarios,” he said.

>Science Scorned (Nature)

>
Nature, Volume 467:133 (09 September 2010)

The anti-science strain pervading the right wing in the United States is the last thing the country needs in a time of economic challenge.

“The four corners of deceit: government, academia, science and media. Those institutions are now corrupt and exist by virtue of deceit. That’s how they promulgate themselves; it is how they prosper.” It is tempting to laugh off this and other rhetoric broadcast by Rush Limbaugh, a conservative US radio host, but Limbaugh and similar voices are no laughing matter.

There is a growing anti-science streak on the American right that could have tangible societal and political impacts on many fronts — including regulation of environmental and other issues and stem-cell research. Take the surprise ousting last week of Lisa Murkowski, the incumbent Republican senator for Alaska, by political unknown Joe Miller in the Republican primary for the 2 November midterm congressional elections. Miller, who is backed by the conservative ‘Tea Party movement’, called his opponent’s acknowledgement of the reality of global warming “exhibit ‘A’ for why she needs to go”.

“The country’s future crucially depends on education, science and technology.”

The right-wing populism that is flourishing in the current climate of economic insecurity echoes many traditional conservative themes, such as opposition to taxes, regulation and immigration. But the Tea Party and its cheerleaders, who include Limbaugh, Fox News television host Glenn Beck and Sarah Palin (who famously decried fruitfly research as a waste of public money), are also tapping an age-old US political impulse — a suspicion of elites and expertise.

Denialism over global warming has become a scientific cause célèbre within the movement. Limbaugh, for instance, who has told his listeners that “science has become a home for displaced socialists and communists”, has called climate-change science “the biggest scam in the history of the world”. The Tea Party’s leanings encompass religious opposition to Darwinian evolution and to stem-cell and embryo research — which Beck has equated with eugenics. The movement is also averse to science-based regulation, which it sees as an excuse for intrusive government. Under the administration of George W. Bush, science in policy had already taken knocks from both neglect and ideology. Yet President Barack Obama’s promise to “restore science to its rightful place” seems to have linked science to liberal politics, making it even more of a target of the right.

US citizens face economic problems that are all too real, and the country’s future crucially depends on education, science and technology as it faces increasing competition from China and other emerging science powers. Last month’s recall of hundreds of millions of US eggs because of the risk of salmonella poisoning, and the Deepwater Horizon oil spill, are timely reminders of why the US government needs to serve the people better by developing and enforcing improved science-based regulations. Yet the public often buys into anti-science, anti-regulation agendas that are orchestrated by business interests and their sponsored think tanks and front groups.

In the current poisoned political atmosphere, the defenders of science have few easy remedies. Reassuringly, polls continue to show that the overwhelming majority of the US public sees science as a force for good, and the anti-science rumblings may be ephemeral. As educators, scientists should redouble their efforts to promote rationalism, scholarship and critical thought among the young, and engage with both the media and politicians to help illuminate the pressing science-based issues of our time.

>UN climate experts ‘overstated dangers’: Keep your noses out of politics, scientists told (Mail Online)

>
By Fiona Macrae
Mail Online – 31st August 2010

UN climate change experts have been accused of making ‘imprecise and vague’ statements and over-egging the evidence.

A scathing report into the Intergovernmental Panel on Climate Change called for it to avoid politics and stick instead to predictions based on solid science.

The probe, by representatives of the Royal Society and foreign scientific academies, took a thinly-veiled swipe at Rajendra Pachauri, the panel’s chairman for the past eight years.

Exaggerated? Science academies say the Intergovernmental Panel on Climate Change relied on ‘vague’ predictions in making its reports

It recommended a new leader be appointed to bring a ‘fresh approach’ with the term of office cut from 12 years to six.

The IPCC is important because its reports are used by governments to set environmental policy.

The review, which focused on the day-to-day running of the panel, rather than its science, was commissioned after the UN body was accused of making glaring mistakes.

These included the claim that the Himalayan glaciers would vanish within 25 years – and that 55 per cent of the Netherlands was prone to flooding because it was below sea level.

An email scandal involving experts at the University of East Anglia had already fuelled fears that global warming was being exaggerated.

The report demanded a more rigorous conflict of interest policy and said executives should have formal qualifications.

It said: ‘Because the IPCC chair is both the leader and the face of the organisation, he or she must have strong credentials (including high professional standing in an area covered by IPCC assessments), international stature, a broad vision, strong leadership skills, considerable management experience at a senior level, and experience relevant to the assessment task.’

Dr Pachauri has a background in railway engineering rather than science and in recent months has been forced to deny profiting from his role at the IPCC.

When asked yesterday if he would consider resigning, he said he intended to continue working on the panel’s next report on climate change but would abide by any decision the IPCC made.

‘We’ve listened to and learnt from our critics,’ he said.

‘Now that the review has been carried out I believe I have a responsibility to help to implement the changes.

‘I see this as a mission that I cannot shirk or walk away from. It’s now up to the world’s governments to decide when they want to implement the recommendations and which ones they want to implement.’

Dr Benny Peiser, Director of The Global Warming Policy Foundation, said: ‘I interpret the review as an indirect call for Dr Pachauri to step down. That is what it says between the lines, whether or not he understands it.

‘It is clearly a very, very strong criticism of his management and of him personally.

‘The problem is that many in the international community regard him as damaged goods.’

The investigation said the IPCC’s mandate calls for it to be ‘policy relevant’ without ‘straying into advocacy’ which would hurt its credibility.

The scientists charged with writing the IPCC assessments were criticised for saying they were ‘highly confident’ about statements without having the evidence.

One of the summary documents prepared for government use ‘contains many such statements that are not supported sufficiently by the literature, not put into perspective or not expressed clearly’.

Achim Steiner, head of the UN’s environmental programme, said the review of the IPCC ‘re-affirms the integrity, the importance and validity of the IPCC’s work while recognising areas for improvement in a rapidly evolving field’.

Read more: http://www.dailymail.co.uk/news/worldnews/article-1307446/UN-climate-change-experts-overstated-dangers.html

MAN IN THE HOT SEAT

Arrogant? Dr Rajendra Pachauri

To his admirers, Rajendra Pachauri is a tireless champion of the perils of climate change. To his critics, he is flamboyant and arrogant.

The Indian-born mechanical engineer worked in the railway industry before entering academia.

He taught in the U.S. and then joined a think-tank promoting sustainable development. He became involved with the UN in the 1990s and was elected chairman of its IPCC climate panel in 2002.

The 70-year-old lives in an exclusive district of New Delhi and is said to enjoy a lavish personal lifestyle with a taste for expensive suits.

He has dismissed claims he profited from his links to green energy firms, saying he gave away all the money earned from directorships.

Despite having full use of an eco-friendly vehicle, he uses a chauffeur-driven car to make the one-mile journey to his office.

He raised eyebrows earlier this year with the publication of a raunchy novel about the life and times of an ageing environmentalist and former engineer.

And he made powerful enemies by refusing to apologise for the false claim that Himalayan glaciers would vanish in 25 years.

SCEPTIC CHANGES HIS MIND ON CLIMATE CHANGE

Bjorn Lomborg, author and political scientist

The world’s most high-profile climate change sceptic has changed his mind and now believes that global warming is ‘a challenge humanity must confront’.

The influential economist Bjorn Lomborg (pictured), who has been compared to Adolf Hitler by the UN’s climate chief, is calling for tens of billions of dollars to be invested into tackling climate change.

He described the current rise in temperatures as ‘undoubtedly one of the chief concerns facing the world today.’

Mr Lomborg proposes taxing people on their carbon emissions to pay for the research and improving green energy.

>Are you ready for life in world 3? (New Scientist)

>
Jo Marchant, consultant
New Scientist – 2 August 2010

In the 1970s, Karl Popper came up with a philosophical theory of reality that involved three interacting worlds: the physical world, the mental world, and “world 3”, which comprises all products of the human mind – from ideas, pictures and music to every word ever written.

Something very similar to world 3 is now real and increasingly influencing how we live, says George Djorgovski, co-director of the Center for Advanced Computing Research at Caltech. It’s called the internet.

It’s the first morning of Science Foo camp, and I’ve chosen a session called “virtualisation of science and virtualisation of the world”. In fact – fittingly for a meeting being held at Google headquarters – how we deal with life increasingly lived online turns out to be one of the main themes of the day. Djorgovski reckons that before long, being online will mean (among other things) not staring at a computer screen but being immersed in 3D virtual reality.

He thinks this will be key to how we’ll make scientific discoveries in the future. Forget graphs – two dimensions are totally inadequate for dealing with the vast amounts of data pouring out of everything from high-throughput genome sequencing to atom smashers like the Large Hadron Collider. We’ll need machine intelligence capable of analysing these huge data sets, he says, as well as ways to visualise and interact with the results in three dimensions.

Such technologies will surely revolutionise education too, with virtual learning replacing the traditional lecture. Djorgovski wants scientists and researchers to get more involved with this process now, pointing out that so far, advances in 3D technology are all coming from the entertainment industry: “We can’t let the video game industry drive the future in what’s the most important technology on the planet. There has to be more to it than spilling blood and slaying dragons.”

Sitting round the table are experts in everything from psychology and bioethics to space science. Pat Kuhl, an expert in early child learning from the University of Washington, wonders what learning everything online will do to young brains. The consensus around the table is that good or bad, the move into virtual reality environments is inevitable. “So let’s try and offer something more than games,” says Djorgovski.

In a subsequent session on children’s minds, Kuhl tells us about the importance of social cues in early learning. For example, it’s well-known that babies differ in their ability to distinguish sounds, depending on the language they are exposed to, by the time they are 10-12 months old. But Kuhl and her colleagues have recently shown that simply hearing the sounds is not enough. After a few sessions with a Mandarin speaker, American babies could distinguish certain sounds as well as Taiwanese babies, but those given the same exposure via audio or video learned nothing.

So if we don’t want kids’ brains to atrophy in an increasingly virtual world, we must work out how to incorporate the relevant social cues. Kuhl has already found that making the TV screen interactive, so babies can turn it on and off by slapping it, increases – a little bit – how much they learn. She’s now experimenting with web cams.

In the afternoon, UK journalist and commentator Andrew Marr tackles the question of what will happen to journalism in an online world, particularly as e-readers like the iPad – which Marr calls a “great engine of destruction” – become ubiquitous.

The media we consume will no longer be just words, or just pictures, but a collision of text, video, audio and animated graphics. And people will be able to choose individual items to consume, rather than buying a whole newspaper or watching just one channel.

Like most commentators, Marr thinks this will be the end of newspapers – and perhaps of traditional journalists too. But he thinks this can only be a good thing, arguing that journalism, with its short-term focus and trivial level of debate, has been failing us anyway. In the future he thinks news will come from niche, specialist groups, for example people interested in access to clean water, coming together online. These might include bloggers, campaigners and lobbyists. Above them, authoritative news aggregators will pick out the most important stories of the day and feed them to the rest of us.

Marr says this new model will be good for journalism and for democracy, because the people within each community of interest will be experts, and won’t lose interest in a topic in the way that traditional reporters do.

I’m sure Marr’s right that newspapers as we know them are not going to survive. But I don’t feel so optimistic about his vision. I’m not sure that having aggregators pick from a pool of stories written by specialists with an agenda is necessarily going to give us good journalism. Who is going to write articles in a way that non-specialists can understand? Who will make connections between different fields? Who will have the authority to hold politicians to account? Unfortunately the session ends before we have a chance to get into these questions.

For some historical perspective, I end the day in a session run by Tilly Blyth, curator of computing at the Science Museum in London. Whereas Marr spoke to a packed lecture hall, now just five of us sit cosily around a table. Blyth tells us how the Science Museum is using online technologies to try to bring the history of science and technology into our everyday lives.

One project is an iPhone app that displays stories and pictures from history that are relevant to a user’s location. The other involves asking 200 British scientists to tell their life stories, then linking those oral histories to video clips, searchable transcripts, and perhaps the relevant scientific papers.

Blyth wants to create a “global memory” for science, so that we can learn from changes that have gone before. “We tend to think that we’re living through this amazing period of revolution,” she says.

Then she shows us a satirical illustration from 1880, entitled March of the Intellect, which depicts an array of futuristic contraptions including a steam-powered horse, a flying man, and a pneumatic tube linking London with Bengal. We aren’t the first generation to grapple with the implications of radical technological change. Food for thought as I join the queue for dinner.

>Tensions Grow Between Tornado Scientists and Storm Chasers

>
By Jeffrey R. Young
Boulder, Colo.
The Chronicle of Higher Education
June 17, 2010

A long line of storm chasers gets in the way of scientists studying severe weather. Photo: Carlye Calvin

There is a crowd under the funnel cloud.

Researchers wrapping up one of the largest-ever scientific field studies of tornadoes say that amateur storm chasers hindered their research and created dangerous traffic jams. Storm chasers, for their part, say that they have just as much right to observe storms as Ph.D.’s.

Hundreds of camera-toting amateurs in cars ended up chasing the same storms as a fleet of scientific vehicles during the high-profile research project, called Vortex2, which wrapped up data collection this week. At times the line of traffic caused the Midwestern roads to look like the freeways of Los Angeles, said Roger Wakimoto, director of the National Center for Atmospheric Research, during a briefing for reporters this week.

“I worry about this as a safety hazard,” Mr. Wakimoto said. “These people were blocking our escape routes because of the sheer number of cars.”

Researchers refer to their own fleet as an “armada,” and it was made up of about 40 vehicles, several of them carrying radar gear. The research goal is to understand how tornadoes form, to discover why some big storms generate deadly tornadoes and others don’t, and to improve forecasters’ ability to warn people of the severe weather events.

“It’s embarrassing to say, but we still do not understand what triggers tornado genesis,” Mr. Wakimoto said. “It’s got to be one of the most fundamental things we don’t understand today, but maybe we captured the data [during this study] to answer the question.”

At times amateur storm chasers kept the armada of science trucks from even getting to a budding tornado. One example was on May 19 in Oklahoma, when the number of storm chasers reached about 200 to 300 cars, according to Joshua Wurman, president of the Center for Severe Weather Research, in Boulder.

“The chasers basically made a rolling roadblock,” he said in a phone interview Thursday, while preparing to head out for his last day of data collection. He said that many of the amateur chasers were trying to roll along parallel to the storm to shoot video, but the researchers wanted to get ahead of the storm to set up their radar equipment. Mr. Wurman said that most of the chasers refused to move aside to let the research vehicles pass. While people have no legal obligation to yield to radar trucks, he said that he felt the amateurs should have given way as a courtesy.

“Just like you open the door for a guy with crutches—it’s not required by law, it’s just polite,” he said. “Nobody let us by, and I was really disappointed by that. It basically crippled our science mission that day.”

One veteran storm chaser pointed out on his blog, however, that some of the scientists involved in Vortex2 have been on major television programs that have led to an increase in amateur storm chasers. “Dr. Wurman’s research has benefited financially from his previous affiliation with the Discovery Channel program Storm Chasers—this program implicitly is encouraging viewers to engage in storm-chasing by glamorizing it,” wrote Chuck Doswell. “Dr. Wurman is a well-respected researcher, but he’s not Moses. Nor is he a first responder going about his duties—law enforcement officers are authorized to break laws in the performance of their jobs. I know of no researcher/storm chaser who has that particular blank check.”

Citizen Scientists?

Storm chasers argue that they offer a valuable service because some call in reports and observations to the National Weather Service.

“Storm chasers are out there to save lives—we’re out there to give warnings faster than the early warning systems,” Aaron Estman, who has been chasing storms for a few years and runs a Web site called TexasChaser.com, said in an interview.

But Mr. Wurman said that amateur storm chasers rarely offer useful information because, by the time they call in their reports, officials are already aware of the storms, thanks to radar equipment. And even the few storm chasers who equip their cars with scientific instruments do not properly calibrate their equipment, so their readings cannot contribute to the scientific literature, he said.

“They haven’t done the boring stuff—the tedious stuff of doing good science,” he said.

Some researchers say there is hope that storm chasers can become valuable citizen scientists, as has happened in other fields, such as astronomy.

“Right now there’s no coordination,” said Brian M. Argrow, a professor of aerospace engineering sciences and associate dean for education at the University of Colorado at Boulder. “That’s an important thing that Vortex2 brings to the table—a coordinated effort.”

The Next Step

This is the final year of the two-year Vortex2 project, which cost about $13-million and is supported by the National Science Foundation and the National Oceanic and Atmospheric Administration, and involves about 20 teams of scientists from universities and federal laboratories. The endeavor is a sequel to the original Vortex project, a similar effort in 1994 and 1995. (It became one of the inspirations for the Hollywood film “Twister,” which, like the television shows cited by bloggers, helped increase interest in storm-chasing.)

The scientists will now analyze the terabytes of data—the equivalent of thousands of filled hard drives from typical laptops—including images of the storms they observed. The first papers from the project are expected to be presented at a severe-weather conference in Boulder in October.

>Monsanto and GMOs: a horror story (from Vanity Fair)

>
Investigation

Monsanto’s Harvest of Fear

Monsanto already dominates America’s food chain with its genetically modified seeds. Now it has targeted milk production. Just as frightening as the corporation’s tactics–ruthless legal battles against small farmers–is its decades-long history of toxic contamination.

By Donald L. Barlett and James B. Steele
Vanity Fair
May 2008

Go to the article: http://www.vanityfair.com/politics/features/2008/05/monsanto200805

No thanks: An anti-Monsanto crop circle made by farmers and volunteers in the Philippines. By Melvyn Calderon/Greenpeace HO/A.P. Images.

>Naomi Oreskes on Merchants of Doubt (WNYC Radio)

>
Science and Speech
Wednesday, May 26, 2010


Naomi Oreskes reveals how a small but powerful group of scientists has managed to obscure the truth about issues from the dangers of smoking to the causes of climate change. And we’ll hear about the origins of the New York accent and how the accent is changing.

>The Climategate Chronicle (Spiegel Online)

>
How the Science of Global Warming Was Compromised

By Axel Bojanowski
14 May 2010 – Spiegel Online

To what extent is climate change actually occurring? Late last year, climate researchers were accused of exaggerating study results. SPIEGEL ONLINE has since analyzed the hacked “Climategate” e-mails, which provide insights into one of the most extraordinary spats in recent scientific history.

Is our planet warming up by 1 degree Celsius, 2 degrees, or more? Is climate change entirely man made? And what can be done to counteract it? There are myriad possible answers to these questions, as well as scientific studies, measurements, debates and plans of action. Even most skeptics now concede that mankind — with its factories, heating systems and cars — contributes to the warming up of our atmosphere.

But the consequences of climate change are still hotly contested. It was therefore something of a political bombshell when unknown hackers stole more than 1,000 e-mails written by British climate researchers, and published some of them on the Internet. A scandal of gigantic proportions seemed about to break, and the media dubbed the affair “Climategate” in reference to the Watergate scandal that led to the resignation of US President Richard Nixon. Critics claimed the e-mails would show that climate change predictions were based on unsound calculations.

Although a British parliamentary inquiry soon confirmed that this was definitely not a conspiracy, the leaked correspondence provided in-depth insight into the mechanisms, fronts and battles within the climate-research community. SPIEGEL ONLINE has analyzed the more than 1,000 Climategate e-mails spanning a period of 15 years, e-mails that are freely available over the Internet and which, when printed out, fill five thick files. What emerges is that leading researchers have been subjected to sometimes brutal attacks by outsiders and become bogged down in a bitter and far-reaching trench war that has also sucked in the media, environmental groups and politicians.

SPIEGEL ONLINE reveals how the war between climate researchers and climate skeptics broke out, the tricks the two sides used to outmaneuver each other and how the conflict could be resolved.

Part 2: From Staged Scandal to the Kyoto Triumph

The fronts in the climate debate have long been etched in the sand. On the one side there is a handful of highly influential climate researchers, on the other a powerful lobby of industrial associations determined to trivialize the dangers of global warming. This latter group is supported by the conservative wing of the American political spectrum, conspiracy theorists as well as critical scientists.

But that alone would not suffice to divide the roles so neatly into good and evil. Most climate researchers were somewhere between the two extremes. They often had difficulty drawing clear conclusions from their findings. After all, scientific facts are often ambiguous. Although it is generally accepted that there is good evidence to back forecasts of coming global warming, there is still considerable uncertainty about the consequences it will have.

Both sides — the leading climate researchers on the one hand and their opponents in industry and smaller groups of naysayers on the other — played hardball from the very beginning. It all started in 1986, when German physicists issued a dramatic public appeal, the first of its kind. They warned about what they saw as a “climatic disaster.” However, their avowed goal was to promote nuclear power over carbon dioxide-belching coal-fired power stations.

The First Scandal

At the time, there was certainly clear scientific evidence of a dangerous increase in temperatures, prompting the United Nations to form the Intergovernmental Panel on Climate Change (IPCC) in 1988 to look into the matter. However, the idea didn’t take hold in the United States until the country was hit by an unusually severe drought in the summer of 1988. Politicians in Congress used the dry spell to listen to NASA scientist James Hansen, who had been publishing articles in trade journals for years warning about the threat of man-made climate change.

When Washington instructed Hansen to put more emphasis on the uncertainties in his theory, Senator and later Vice President Al Gore cried foul. Gore notified the media about the government’s alleged attempted cover-up, forcing the government’s hand on the matter.

The oil companies reacted with alarm and forged alliances with companies in other sectors who were worried about a possible rise in the price of fossil fuels. They even managed to rope in a few shrewd climate researchers like Patrick Michaels of the University of Virginia.

The aim of the industrial lobby was to focus as much as possible on the doubts about the scientific findings. According to a strategy paper by the Global Climate Science Team, a crude-oil lobby group, “Victory will be achieved when average citizens recognize uncertainties in climate science.” In the meantime, scientists found themselves on the defensive, having to convince the public time and again that their warnings were indeed well-founded.

Industrial Propaganda for the ‘Less Educated’

A dangerous dynamic had been set in motion: Any climate researcher who expressed doubts about findings risked playing into the hands of the industrial lobby. The leaked e-mails show how leading scientists reacted to the PR barrage by the so-called “skeptics lobby.” Out of fear that their opponents could take advantage of ambiguous findings, many researchers tried to simply hide the weaknesses of their findings from the public.

The lobby spent millions on propaganda campaigns. In 1991, the Information Council on the Environment (ICE) issued a strategy paper aimed at what it called “less-educated people.” This proposed a campaign that would “reposition global warming as a theory (not fact).” However, the skeptics also wanted to address better educated sectors of society. The Global Climate Coalition, for example, an alliance of energy companies, specifically tried to influence UN delegates. The advice of skeptical scientists was also given considerable credence in the US Congress.

Nonetheless, the lobbyists had less success on the international stage. In 1997, the international community agreed on the first-ever climate protection treaty: the Kyoto Protocol. “Scientists had issued a warning, the media amplified it and the politicians reacted,” recalls Peter Weingart, a science sociologist at Bielefeld University in Germany, who researched the climate debate.

But just as numerous industrial firms began to acknowledge the need for climate protection and left the Global Climate Coalition, some scientists began getting too cozy with environmental organizations.

Part 3: How Climate Researchers Plotted with Interest Groups

Even before the UN climate conference in Kyoto in 1997, environmentalist groups and leading climate researchers began joining forces to put pressure on industry and politicians. In August 1997, Greenpeace sent a letter to The Times newspaper in London, appealing on behalf of British researchers. All the climatologists had to do was sign on the dotted line. In October of that year, other climate researchers — ostensibly acting on behalf of the World Wildlife Fund, or WWF — e-mailed hundreds of colleagues calling on them to sign an appeal to the politicians in connection with the Kyoto conference.

The tactic was controversial. Whereas German scientists immediately put their names on the list, others had their doubts. In a leaked e-mail dated Nov. 25, 1997, renowned American paleoclimatologist Tom Wigley told a colleague he was worried that such appeals were almost as “dishonest” as the propaganda employed by the skeptics’ lobby. Personal views, Wigley said, should not be confused with scientific facts.

Researchers ‘Beef Up’ Appeals by Environmental Groups
 
Wigley’s calls fell on deaf ears, and many of his colleagues unthinkingly fell in line with the environmental lobby. Asked to comment by WWF, climate researchers in Australia and Britain, for example, made particularly pessimistic predictions. What’s more, the experts said they had been fully aware that the WWF wanted to have the warnings “beefed up,” as it had stated in an e-mail dated July 1999. One Australian climatologist wrote to colleagues on July 28, 1999, that he would be “very concerned” if environmental protection literature contained data that might suggest “large areas of the world will have negligible climate change.”

Two years later, German climate researchers at the Potsdam Institute for Climate Impact Research (PIK) and from the Hamburg-based Max Planck Institute for Meteorology also drew up a position paper together with WWF. Germany’s Wuppertal Institute for Climate, Environment and Energy was a pioneer in this respect. It was very open about working together with the environmental group BUND, the German chapter of Friends of the Earth, in developing climate protection strategy recommendations in the mid-1990s.

Part 4: Industry and Researchers Fight for Media Supremacy

From then on, the battle was all about dominance of the media. The media are often accused of giving climate-change skeptics too much attention. Indeed, theories casting doubt on global warming, often with little scientific backing, regularly appeared in the press. These included so-called “information brochures” sent to journalists by oil industry lobbyists.

This is partly because the US media, in particular, are extremely keen to ensure what they see as balanced reporting — in other words, giving both sides in a debate a chance to air their views. This has meant that even more outlandish theories by climate-change skeptics have been given just as much airtime as the findings of established experts.

Media researchers believe the phenomenon of newsworthiness is another reason why anti-climate-change theories are reported so widely. The more unambiguous the warnings about an impending disaster, the more interesting critical viewpoints become. The media debate about the issue also focused on the potentially scandalous question of whether climatologists had speculated about nightmare scenarios simply in order to obtain access to research grants.

Renowned climate researcher Klaus Hasselmann of the Max Planck Institute for Meteorology rebuffed these accusations in a much-quoted article in the German newspaper Die Zeit in 1997. Hasselmann pointed out that scientific findings suggest that there is an extremely high likelihood that man was indeed responsible for climate change. “If we wait until the very last doubts have been overcome, it will be too late to do anything about it,” he wrote.

‘Climatologists Tend Not to Mention their More Extreme Suspicions’
 
Hasselmann blamed the media for all the hype. In fact, sociologists have identified “one-up debates” in the media in which darker and darker pictures were painted of the possible consequences of global warming. “Many journalists don’t want to hear about uncertainty in the research findings,” Max Planck Institute researcher Martin Claussen complains. Sociologist Peter Weingart criticizes not just journalists but also scientists. “Climatologists tend not to mention their more extreme suspicions,” he bemoans.

Whereas the debate flared up time and again in the US, “the skeptics in Germany were quickly marginalized again,” recalls sociologist Hans Peter Peters of the Forschungszentrum Jülich research center, who analyzed climate-related reporting in Germany. Peters believes that the communication strategy of leading researchers has proven successful in the long run. “The announced climate problem has been taken seriously by the media,” he says. He even sees signs of a “strong alignment of scientists and journalists in reporting about climate change.”

Nonetheless, scientists have tried to apply pressure on the media if they disagreed with the way stories were reported. Editorial offices have been inundated with protest letters whenever news stories said that the dangers of runaway climate change appeared to be diminishing. E-mails show that climate researchers coordinated their protests, targeting specific journalists to vent their fury on. For instance, when an article entitled “What Happened to Global Warming?” appeared on the BBC website in October 2009, British scientists first discussed the matter among themselves by e-mail before demanding that an apparently balanced editor explain what was going on.

Social scientists are well aware that good press can do wonders for a person’s career. David Philips, a sociologist at the University of San Diego, suggests that the battle for supremacy in the mass media is not only a means to mobilize public support, but also a great way to gain kudos within the scientific community.

Part 5: Scientific Opinion Becomes Entrenched

The leaked e-mails show that some researchers use tactics that are every bit as ruthless as those employed by critics outside the scientific community. Under attack from global-warming skeptics, the climatologists took to the barricades. Indeed, the criticism only seemed to increase the scientists’ resolve. And worried that any uncertainties in their findings might be pounced upon, the scientists desperately tried to conceal such uncertainties.

“Don’t leave anything for the skeptics to cling on to,” wrote renowned British climatologist Phil Jones of the University of East Anglia (UEA) in a leaked e-mail dated Oct. 4, 2000. Jones, who heads UEA’s Climate Research Unit (CRU), is at the heart of the e-mail scandal. But there have always been plenty of studies that critics could quote because the research findings continue to be ambiguous.

At times scientists have been warned by their own colleagues that they may be playing into the enemy’s hands. Kevin Trenberth from the National Center for Atmospheric Research in the US, for example, came under enormous pressure from oil-producing nations while he was drawing up the IPCC’s second report in 1995. In January 2001, he wrote an e-mail to his colleague John Christy at the University of Alabama complaining that representatives from Saudi Arabia had quoted from one of Christy’s studies during the negotiations over the third IPCC climate report. “We are under no gag rule to keep our thoughts to ourselves,” Christy replied.

‘Effective Long-Term Strategies’
 
Paleoclimatologist Michael Mann from Pennsylvania State University also tried to rein in his colleagues. In an e-mail dated Sept. 17, 1998, he urged them to form a “united front” in order to be able to develop “effective long-term strategies.” Paleoclimatologists try to reconstruct the climate of the past. Their primary source of data is found in old tree trunks whose annual rings give clues about the weather in years gone by.

No one knows better than the researchers themselves that tree data can be very unreliable, and an exchange of e-mails shows that they discussed the problems at length. Even so, meaningful climate reconstructions can be made if the data are analyzed carefully. The only problem is that you get different climate change graphs depending on which data you use.

Mann and his colleagues were pioneers in this field. They were the first to draw up a graph of average temperatures in the Northern Hemisphere over the past 1,000 years. That is indisputably an impressive achievement. Because of its shape, his diagram was dubbed the “hockey stick graph.” According to this, the climate changed little for about 850 years, then temperatures rose dramatically (the blade of the stick). However, a few years later, it turned out that the graph was not as accurate as first assumed.

‘I’d Hate to Give It Fodder’
 
In 1999, CRU chief Phil Jones and fellow British researcher Keith Briffa drew up a second climate graph. Perhaps not surprisingly, this led to a row between the two groups about which graph should be published in the summary for politicians at the front of the IPCC report.

The hockey stick graph was appealing on account of its convincing shape. After all, the unique temperature rise of the last 150 years appeared to provide clear proof of man’s influence on our climate. But Briffa cautioned about overestimating the significance of the hockey stick. In an e-mail to his colleagues in September 1999, Briffa said that Mann’s graph “should not be taken as read,” even though it presented “a nice tidy story.”

In contrast to Mann et al’s hockey stick, Briffa’s graph contained a warm period in the High Middle Ages. “I believe that the recent warmth was probably matched about 1,000 years ago,” he wrote. Fortunately for the researchers, the hefty dispute that followed was quickly defused when they realized they were better served by joining forces against the common enemy.

Climate-change skeptics use Briffa’s graph to cast doubt over the assertion that man’s activities have affected our climate. They claim that if our atmosphere is as warm now as it was in the Middle Ages — when there was no man-made pollution — carbon dioxide emissions can’t possibly be responsible for the rise in temperatures.

“I don’t think that doubt is scientifically justified, and I’d hate to be the one to have to give it fodder,” Mann wrote to his colleagues. The tactic proved a successful one. Mann’s hockey stick graph ended up at the front of the UN climate report of 2001. In fact it became the report’s defining element.

An Innocent Phrase Seized by Republicans
 
In order to get unambiguous graphs, the researchers had to tweak their data slightly. In probably the most infamous of the Climategate e-mails, Phil Jones wrote that he had used Mann’s “trick” to “hide the decline” in temperatures. Following the leaking of the e-mails, the expression “hide the decline” was turned into a song about the alleged scandal and seized upon by Republican politicians in the US, who quoted it endlessly in an attempt to discredit the climate experts.

But what appeared at first glance to be fraud was actually merely a face-saving fudge: Tree-ring data indicates no global warming since the mid-20th century, and therefore contradicts the temperature measurements. The clearly erroneous tree data was thus corrected by the so-called “trick” with the temperature graphs.
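
In code, the splice described above amounts to something like the following sketch (a toy illustration with invented numbers, not the actual CRU data or processing):

    import numpy as np

    years = np.arange(1900, 2000)
    # Toy instrumental record: a steady warming trend (illustrative values only).
    instrumental = 0.01 * (years - 1900)
    # Toy tree-ring proxy: tracks the instrumental record until 1960,
    # then diverges downward, as the real post-1960 tree data did.
    tree_rings = instrumental.copy()
    late = years >= 1960
    tree_rings[late] -= 0.015 * (years[late] - 1960)

    # The "trick": truncate the proxy where the divergence sets in and
    # continue the curve with the instrumental measurements instead.
    spliced = np.where(late, instrumental, tree_rings)

Plotted, `spliced` hides the post-1960 decline in the proxy while leaving the earlier reconstruction untouched, which is what made the shorthand “hide the decline” so easy to quote out of context.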

The row grew more and more bitter as the years passed, as the leaked e-mails between researchers show. Since the late 1990s, several climate-change skeptics have repeatedly asked Jones and Mann for their tree-ring data and calculation models, citing the legal right to access scientific data.

‘I Think I’ll Delete the File’
 
In 2003, mineralogist Stephen McIntyre and economist Ross McKitrick published a paper that highlighted systematic errors in the statistics underlying the hockey stick graph. However, Michael Mann rejected the paper, which he saw as part of a “highly orchestrated, heavily funded corporate attack campaign,” as he wrote in September 2009.

More and more, Mann and his colleagues refused to hand out their data to “the contrarians,” as skeptical researchers were referred to in a number of e-mails. On Feb. 2, 2005, Jones went so far as to write, “I think I’ll delete the file rather than send it to anyone.”

Today, Mann defends himself by saying his university has looked into the e-mails and decided that he had not suppressed data at any time. However, an inquiry conducted by the British parliament came to a very different conclusion. “The leaked e-mails appear to show a culture of non-disclosure at CRU and instances where information may have been deleted to avoid disclosure,” the House of Commons’ Science and Technology Committee announced in its findings on March 31.

Sociologist Peter Weingart believes that the damage could be irreparable. “A loss of credibility is the biggest risk inherent in scientific communication,” he said, adding that trust can only be regained through complete transparency.

Part 6: From Deserved Reputations to Illegitimate Power

The two sides became increasingly hostile toward one another. They debated about whom they could trust, who was a part of their “team” — and who among them might secretly be a skeptic. All those who were between the two extremes or even tried to maintain links with both sides soon found themselves under suspicion.

This distrust helped foster a system of favoritism, as the hacked e-mails show. According to these, Jones and Mann had a huge influence over what was published in the trade press. Those who controlled the journals also controlled what entered the public arena — and therefore what was perceived as scientific reality.

All journal articles are checked anonymously by colleagues before publication as part of what is known as the “peer review” process. Behind closed doors, researchers complained for years that Mann, who is a sought-after reviewer, acted as a kind of “gatekeeper” in relation to magazine articles on paleoclimatology. It’s well-known that renowned scientists can gain influence within journals. But it’s a risky business. “The danger that deserved reputations become illegitimate power is the greatest risk that science faces,” Weingart says.

From Peer Review to Connivance
 
In an e-mail to SPIEGEL ONLINE, Mann rejected the claims that he exercised undue influence. He said the editors of scientific journals — not he — chose the reviewers. However, as Weingart points out, in specialist areas like paleoclimatology, which have only a handful of experts, certain scientists can gain considerable power — provided they have a good connection to the publishers of the relevant journals.

The “hockey team,” as the group around Mann and Jones liked to call itself, undoubtedly had good connections to the journals. The colleagues coordinated and discussed their reviews among themselves. “Rejected two papers from people saying CRU has it wrong over Siberia,” CRU head Jones wrote to Mann in March 2004. The articles he was referring to were about tree data from Siberia, a basis of the climate graphs. In fact, it later turned out that Jones’ CRU group probably misinterpreted the Siberian data, and the findings of the study rejected by Jones in March 2004 were actually correct.

However, Jones and Mann had the backing of the majority of the scientific community in another case. A study published in Climate Research in 2003 looked into findings on the current warm period and the medieval one, concluding that the 20th century was “probably not the warmest nor a uniquely extreme climatic period of the last millennium.” Although climate skeptics were thrilled, most experts thought the study was methodologically flawed. But if the pro-climate-change camp controlled the peer review process, then why was it ever published?

Plugging the Leak
 
In an e-mail dated March 11, 2003, Michael Mann said there was only one possibility: Skeptics had taken over the journal. He therefore demanded that the enemy be stopped in its tracks. The “hockey team” launched a powerful counterattack that shook Climate Research magazine to its foundations. Several of its editors resigned. Vociferous as they were, though, the skeptics did not have that much influence. If it turned out that alarmist climate studies were flawed — and this was the case on several occasions — the consequences of the climate catastrophe would not be as dire as had been predicted.

Yet there were also limits to the influence wielded by Mann and Jones, as became apparent in 2005, when relentless hockey stick critics Ross McKitrick and Stephen McIntyre were able to publish studies in the most important geophysical journal, Geophysical Research Letters (GRL). “Apparently, the contrarians now have an ‘in’ with GRL,” Mann wrote to his colleagues in a leaked e-mail. “We can’t afford to lose GRL.”

Mann discovered that one of the editors of GRL had once worked at the same university as the feared climate skeptic Patrick Michaels. He therefore put two and two together: “I think we now know how various papers have gotten published in GRL,” he wrote on January 20, 2005. At the same time, the scientists discussed how to get rid of GRL editor James Saiers, himself a climate researcher. Saiers quit his post a year later — allegedly of his own accord. “The GRL leak may have been plugged up now,” a relieved Mann wrote in an e-mail to the “hockey team.”

Internal Conflict and the External Façade
 
Climategate appears to confirm the criticism that scientific systems always benefit cartels. However, sociologist Hans Peter Peters cautions against over-interpreting the affair. He says alliances are commonplace in every area of the scientific world. “Internal communication within all groups differs from the facade,” Peters says.

Weingart also believes the inner workings of a group should not be judged by the criteria of the outside world. After all, controversy is the very basis of science, and “demarcation and personal conflict are inevitable.” Even so, he says the extent to which camps have built up in climate research is certainly unusual.

Part 7: Conclusive Proof Is Impossible

Weingart says the political ramifications only fuelled the battle between the two sides in the global warming debate. He believes that the more an issue is politicized, the deeper the rifts between opposing stances.

Immense public scrutiny made life extremely difficult for the scientists. On May 2, 2001, paleoclimatologist Edward Cook of the Lamont Doherty Earth Observatory complained in an e-mail: “This global change stuff is so politicized by both sides of the issue that it is difficult to do the science in a dispassionate environment.” The need to summarize complex findings for a UN report appears only to have exacerbated the problem. “I tried hard to balance the needs of the science and the IPCC, which were not always the same,” Keith Briffa wrote in 2007. Max Planck researcher Martin Claussen says too much emphasis was put on consensus in an attempt to satisfy politicians’ demands.

And even scientists are not always interested solely in the actual truth of the matter. Weingart notes that public debate is mostly “only superficially about enlightenment.” Rather, it is more about “deciding on and resolving conflicts through general social agreement.” That’s why it helps to present unambiguous findings.

The Time for Clear Answers Is Over
 
However, it seems all but impossible to provide conclusive proof in climate research. The philosopher of science Silvio Funtowicz foresaw this dilemma as early as 1990. He described climate research as a “post-normal science.” On account of its high complexity, he said, it was subject to great uncertainty while, at the same time, harboring huge risks.

The experts therefore face a dilemma: They have little chance of giving the right advice. If they don’t sound the alarm, they are accused of not fulfilling their moral obligations. However, alarmist predictions are criticized if the predicted changes fail to materialize quickly.

Climatological findings will probably remain ambiguous even if further progress is made. Weingart says it’s now up to scientists and society to learn to come to terms with this. In particular, he warns, politicians must understand that there is no such thing as clear results. “Politicians should stop listening to scientists who promise simple answers,” Weingart says.

Translated from the German by Jan Liebelt

>Climate sceptics rally to expose ‘myth’ (BBC)

>
By Roger Harrabin
Environment analyst, BBC News
21 May 2010

In the Grand Ballroom Of Chicago’s Magnificent Mile Hotel, dinner was over.

Beef, of course. A great pink hunk of it from the American Mid-West.

At the world’s biggest gathering of climate change sceptics, organised by the right-wing Heartland Institute, vegetarians were an endangered species.

Wine flowed and blood coursed during a rousing address from Heartland’s libertarian president Joseph Bast. Climate change is being used by governments to oppress the people, he believes.

After years of opposing government rules on smoking and the environment, Mr Bast now aims to forge a global movement of climate sceptics to end the “myth” that humans are endangering the atmosphere.

He urged the audience to spread the word among their families, friends and work colleagues that climate science is too uncertain to guide government policy, and that plans for climate laws in the US would bankrupt the nation.

In turn, he introduced an all-American hero, Harrison Schmitt, one of the last people to walk on the Moon and still going strong.

Mr Schmitt trained as a geologist and, like some other geologists, believes that climate change is part of a natural fluctuation. He is also a former Republican senator, and he made the case that the American constitution contains no powers for government to legislate on CO2.

The audience, containing some international faces, but mostly American libertarians and Republicans, loved the small-government message.

They cheered when a member of the audience demanded that the “Climategate criminals” – the scientists behind the University of East Anglia (UEA) hacked emails – should be jailed for fraud.

‘Anti-climax’

And the fervour reached a peak when the reluctant hero, Steve McIntyre, shambled on to the stage.

Mr McIntyre is the retired mining engineer who started enquiring into climate statistics as a hobby and whose requests for raw data from the UEA led to a chain of events which have thrown climate science into turmoil.

The crowd rose to applaud him to the stage in recognition of his extraordinary statistical battle to disprove the “Hockey Stick” graph that had become an emblem of man-made global warming.

There was a moment of anticipation as Mr McIntyre stood nervously before the podium – a lugubrious bear of a man resembling a character from Garrison Keillor’s Lake Wobegon.

“I’m not used to speaking in front of such big crowds,” he mumbled. And he winced a little when one emotional admirer blurted that he had travelled 10,000 miles from South Africa for the thrill of hearing him speak.

But then came a sudden and unexpected anti-climax. Mr McIntyre urged the audience to support the battle for open source data on climate change – but then he counselled them to stop clamouring for the blood of the e-mailers. McIntyre does not want them jailed, or even punished. He just wants them to say they are sorry.

The audience disappointment was tangible – like a houndpack denied the kill.

Mr McIntyre then advised sceptics to stop insisting that the Hockey Stick is a fraud. It is understandable for scientists to present their data in a graphic way to “sell” their message, he said. He understood why they had done it. But their motives were irrelevant.

The standard of evidence required to prove fraud over the Hockey Stick was needlessly high, he said. All that was needed was an acknowledgement by the science authorities that the Hockey Stick was wrong.


Political associations

This was clearly not the sort of emollient message the sceptics expected from one of their heavy hitters. And the speech slipped further into climate pacifism when Mr McIntyre confessed that he did not share the libertarian tendencies of many in the ballroom.

As a Canadian, he said, he was brought up to believe that governments should govern on behalf of the people – so if CO2 were reckoned to be dangerous, it would be the duty of politicians to make laws to cut emissions.

The quiet man said he thought that the work of his climate-statistical website was probably done. He sat down to one-handed applause.

Not so much a call to arms as whispered advice to the adversary to lay down his weapons and depart the battlefield.

His message of climate conciliation was reinforced by Tom Harris, founder of the International Climate Science Coalition.

He says he’s not a right-winger, and he told the conference that many scientists sharing his political views had misgivings about establishment climate theory, but would not speak out for fear of being associated with their political opponents or with the fossil fuel industry.

Indeed some moderate climate sceptics told me they have shunned this conference for fear of being publicly associated with a highly-politicised group.

And Sonja Boehmer-Christiansen, the British-based climate agnostic (her term), brought an impassioned anti-government breakfast discussion to a juddering halt with a warning to libertarians that they would never win the policy argument on climate unless they could carry people from the Left with them.

Governments needed taxes, she said, and energy taxes were an efficient way of gathering them.

Cloud effect

Even some right-wingers agreed on the need to review the language of scam and fraud. Professor Roy Spencer, for instance, is a climate sceptic scientist from the University of Alabama in Huntsville.

But when I asked him about the future of Professor Phil Jones, the man at the heart of the UEA e-mail affair, he said he had some sympathy.

“He says he’s not very organised. I’m not very organised myself,” said Professor Spencer. “If you asked me to find original data from 20 years ago I’d have great difficulty too.

“We just didn’t realise in those days how important and controversial this would all become – now it would just all be stored on computer. Phil Jones has been looking at climate records for a very long time. Frankly our data set agrees with his, so unless we are all making the same mistake we’re not likely to find out anything new from the data anyway.”

Professor Spencer admits that he is regarded by orthodox climate scientists as a renegade. But as a very conservative Christian he is at home here, and his views carry weight at this meeting.

Like most climate sceptic scientists, he accepts that CO2 is a warming gas – this is basic physics, he says, and very hard to dispute.

But he says his studies on incoming and outgoing Earth radiation measured by satellites suggest that changes in cloudiness are mitigating warming caused by CO2.

He thinks all the world’s climate modellers are wrong to assume that the Earth’s natural systems will augment warming from CO2, and he hopes that a forthcoming paper will prove his case.

He admits that he has been wrong often enough to know it’s easy to be wrong on a subject as complex as the climate. But he says that means the modellers can all be wrong, too.

The key question for the future, he said, was the one that has been asked for the past 30 years with inconclusive answers – how sensitive will the climate be to a doubling of CO2?

‘Climate resilience’

The godfather of climate scepticism, Richard Lindzen, professor of meteorology at the Massachusetts Institute of Technology (MIT), has been preoccupied with this question for decades.

He is a member of the US National Academy of Sciences and a former lead author for the IPCC. But he is immensely controversial and his views run directly counter to those of his institute, which, he says, is looking forward to his retirement.

He has been accused of ignoring recent developments in science.

He believes CO2 is probably keeping the Earth warmer than it would otherwise be, but says he is more convinced than ever that the climate will prove increasingly resilient to extra CO2.

He thinks that this greenhouse gas will not increase temperature much more than 1C in total because the positive feedbacks predicted by computer models will not occur.
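The arithmetic behind this long-running dispute is worth spelling out. As a rough sketch using standard textbook values (background physics, not figures from Lindzen's talk): doubling CO2 adds a radiative forcing of about 3.7 watts per square metre, and the bare "no-feedback" response of the climate to that forcing is roughly one degree.

```latex
% Forcing from a doubling of CO2, and the no-feedback (Planck) response,
% using the standard values \Delta F = 5.35\,\ln(C/C_0) W/m^2 and
% \lambda_0 \approx 3.2 W m^{-2} K^{-1}:
\Delta F_{2\times} = 5.35 \ln 2 \approx 3.7~\mathrm{W\,m^{-2}},
\qquad
\Delta T_{\text{no feedback}} = \frac{\Delta F_{2\times}}{\lambda_0}
\approx \frac{3.7}{3.2} \approx 1.2~\mathrm{K}
```

The 2-4.5C range produced by climate models comes from amplifying this baseline with net positive feedbacks from water vapour, clouds and ice; Lindzen's figure of about 1C amounts to a bet that those amplifying feedbacks will not materialise.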

The final word of this conference – part counter-orthodox science brainstorm, part political rally – was left to a man who is not a scientist at all, Christopher Monckton, former adviser to Mrs Thatcher, now the darling of climate sceptics worldwide.

In a bravura performance he had the audience roaring at his mocking impersonation of “railway engineer Rajendra Pachauri – the Casey Jones of climate change”; hissing with pantomime fury at the “scandal” of Climategate, then emotionally applauding the American troops who have given their lives for the freedom that their political masters are surrendering to the global socialist tyranny of global warming.

His closing words were delivered in a weeping whisper, a soft prayer of praise to the American constitution and individual liberty.

As the ecstatic crowd filtered out I pointed one delegate to a copy of the Wall Street Journal on the table. A front page paragraph noted that April had been the warmest on record.

“So what?” he shrugged. “So what?”

>Craig Venter and the artificial cell

>
The only DNA present is synthetic – an interview with Craig Venter

Steve Connor, The Independent
O Globo, 21/5/2010 – reprinted in Jornal da Ciência (JC e-mail 4015)

For the scientist, misuse of the technology can be confronted with new legislation

The creation, for the first time in history, of an artificial life form by the group of geneticist Craig Venter – the same man responsible for the presentation of the human genome in 2001 – opens the way to understanding the origins of life and inaugurates a new era of synthetic biology. The group created a synthetic cell from DNA produced artificially and transplanted into a bacterium.

In this interview with the British newspaper The Independent, Venter makes clear that his feat was indeed the creation of the first artificial life form. “The only DNA present (in the created cell) is the synthetic one,” he says.

The next step in this line of research, according to him, “is to understand the basic nature of life, what the minimal sets of genes necessary for it are. We still do not know the functions of all the genes present in a cell. It is, therefore, a fundamental puzzle.”

– What is new about your study?

This was the first time anyone had built an entire chromosome, of 1.08 million base pairs, transplanted it into a recipient cell and had it take control of that cell, converting it into a new species of bacterium.

We have thus established a new paradigm – we have a cell totally controlled by an artificial chromosome.

– Is it, then, artificial life?

We define it that way because it is entirely determined by an artificial chromosome. We started with a living cell, but the chromosome we built transformed it completely. No element of the recipient cell remains. Our artificial cell has been through a billion replications. The only DNA present there is the artificial one. All the proteins were encoded by it. This is important to stress: we did not produce the proteins or the cells artificially. Everything was dictated by the chromosome.

– Can we say that a life form was created from scratch?

I would not describe it that way. We created new life from already existing life, using artificial DNA that reprogrammed the cells.

– Why was the bacterium Mycoplasma mycoides chosen for the research?

This is the first step, the form chosen to establish a new paradigm. It makes sense to start with something we know to be biologically active. We have thus proved that our study could be done, which is no small thing. We turned into reality something that, two months ago, was considered hypothetical.

– Is this new life form a free-living organism capable of replicating?

Yes, if we accept that the concept of “free-living” can also apply to something that grows in a laboratory. Outside the lab, the experiment would not survive. In the laboratory, given the right nutrients, this organism can replicate without any intervention.

– What was the greatest difficulty your team faced?

At one point there was a single error in a million base pairs of the chromosome (and we could not move forward). We even took that episode as a sign that it would be impossible to get life out of it. It was a difficult moment, because it contradicted something I had predicted three years earlier. Enormous obstacles had to be overcome. We had to learn and invent new systems to make all of this possible, which is never trivial.

– And now, what do you hope to achieve?

We want to understand the basic nature of life, what the minimal sets of genes necessary for it are. We still do not know the functions of all the genes present in a cell. We do not know what they do or how they work. We have spent 15 years trying to find these answers, even in simple cells. It is, therefore, a fundamental puzzle on the way to the next stage. As the years pass, the use of new technologies makes everything more evident to us. Just think of the 1940s and 1950s, when the electronics revolution had not yet taken off.

The scientists devoted to building circuits at that time had very little notion of what mobile phones and computers would become. It is very hard to imagine all the applications of a technology.

The world's population, now 6.8 billion people, is expected to reach 9 billion within three or four decades. And today we cannot even provide food, energy, drinking water and medicine for everyone. So we urgently need new techniques to achieve that goal, and it must be done without destroying the planet.

– Are you playing God?

This question has become almost a cliché. It is trotted out every time there is a major scientific discovery, particularly in biology. Science is the understanding of life at its most basic levels, and the attempt to use that knowledge for the betterment of humanity.

I believe, therefore, that we are part of the progress of scientific knowledge, and that we contribute to the understanding of the world around us.

– Are you worried about misuse of the techniques applied in your research?

I have to be. It is a powerful technology. I have proposed new regulations for this field, because I feel the current ones do not go as far as they need to. As the inventors, we want everything possible to be done to protect our technique against misuse. I have suggested, for example, legislation for companies that synthesize DNA, so that they do not produce genomes from potentially dangerous DNA.

We want this discovery to be placed in a context where people know what it means. I believe this was the first time in science that an extensive bioethical review was carried out before the experiments were concluded.

This is part of an ongoing process we are conducting, our attempt to be sure of what these procedures will mean in the future.

>The root of the climate email fiasco (The Guardian)

>
Learning forced into silos of humanities and science has created closed worlds of specialists who just don’t understand each other

George Monbiot
The Guardian, Tuesday 6 April 2010

The MPs were kind to Professor Phil Jones. During its hearings, the Commons science and technology committee didn’t even ask the man at the centre of the hacked climate emails crisis about the central charge he faces: that he urged other scientists to delete material subject to a freedom of information request. Last week the committee published its report, and blamed his university for the “culture of non-disclosure” over which Jones presided.

Perhaps the MPs were swayed by the disastrous performance of his boss at the hearings. Edward Acton, the vice-chancellor of the University of East Anglia, came across as flamboyant, slippery and insincere. Jones, on the other hand, seemed both deathly dull and painfully honest. How could this decent, nerdy man have messed up so badly?

None of it made sense: the intolerant dismissal of requests for information, the utter failure to engage when the hacked emails were made public, the refusal by other scientists to accept that anything was wrong. Then I read an article by the computer scientist Steve Easterbrook, and for the first time the light began to dawn.

Easterbrook, seeking to defend Jones and his colleagues, describes a closed culture in which the rest of the world is a tedious and incomprehensible distraction. “Scientists normally only interact with other scientists. We live rather sheltered lives … to a scientist, anyone stupid enough to try to get scientific data through repeated FoI requests quite clearly deserves our utter contempt. Jones was merely expressing (in private) a sentiment that most scientists would share – and extreme frustration with people who clearly don’t get it.”

When I read that, I was struck by the gulf between our worlds. To those of us who clamoured for freedom of information laws in Britain, FoI requests are almost sacred. The passing of these laws was a rare democratic victory; they’re among the few means we possess of ensuring that politicians and public servants are answerable to the public. What scientists might regard as trivial and annoying, journalists and democracy campaigners see as central and irreducible. We speak in different tongues and inhabit different worlds.

I know how it happens. Like most people with a science degree, I left university with a store of recondite knowledge that I could share with almost no one. Ill-equipped to understand any subject but my own, I felt cut off from the rest of the planet. The temptation to retreat into a safe place was almost irresistible. Only the extreme specialisation demanded by a PhD, which would have walled me in like an anchorite, dissuaded me.

I hated this isolation. I had a passionate interest in literature, history, foreign languages and the arts, but at the age of 15 I’d been forced, like all students, to decide whether to study science or humanities. From that point we divided into two cultures, and the process made idiots of us all. Perhaps eventually we’ll split into two species. Reproducing only with each other, scientists will soon become so genetically isolated that they’ll no longer be able to breed with other humans.

We all detest closed worlds: the Vatican and its dismissal of the paedophilia scandals as “idle chatter”; the Palace of Westminster, whose members couldn’t understand the public outrage about their expenses; the police forces that refuse to discipline errant officers. Most of us would endorse George Bernard Shaw’s opinion that all professions are conspiracies against the laity. Much of the public hostility to science arises from the perception that it’s owned by a race to which we don’t belong.

But science happens to be the closed world with one of the most effective forms of self-regulation: the peer review process. It is also intensely competitive, and the competition consists of seeking to knock each other down. The greatest scientific triumph is to falsify a dominant theory. It happens very rarely, as only those theories which have withstood constant battery still stand. If anyone succeeded in overturning the canon of climate science, they would soon become as celebrated as Newton or Einstein. There are no rewards for agreeing with your colleagues, tremendous incentives to prove them wrong. These are the last circumstances in which a genuine conspiracy could be hatched.

But it is no longer sufficient for scientists to speak only to each other. Painful and disorienting as it is, they must engage with that irritating distraction called the rest of the world. Everyone owes something to the laity, and science would die if it were not for the billions we spend on it. Scientists need make no intellectual concessions, but they have a duty to understand the context in which they operate. It is no longer acceptable for climate researchers to wall themselves off and leave the defence of their profession to other people.

There are signs that this is changing. The prominent climate change scientist Simon Lewis has just sent a long submission to the Press Complaints Commission about misrepresentation in the Sunday Times. The paper claimed that the Intergovernmental Panel on Climate Change’s contention that global warming could destroy up to 40% of the Amazon rainforest “was based on an unsubstantiated claim by green campaigners who had little scientific expertise”. It quoted Lewis to suggest he supported the story. The article and its claims were reproduced all over the world.

But the claims were wrong: there is solid scientific research showing damage on this scale is plausible in the Amazon. Lewis claims that the Sunday Times falsely represented his views. He left a comment on the website but it was deleted. He sent a letter to the paper but it wasn’t published. Only after he submitted his complaint to the PCC did the Sunday Times respond to him. The paper left a message on his answerphone, which he has made public: “It’s been recognised that the story was flawed.” After seven weeks of stonewalling him, the Sunday Times offered to run his letter. But it has neither taken down the flawed article nor published a correction.

Good luck to Lewis, but as the PCC’s treatment of the News of the World phone-hacking scandal suggests, he’s likely to find himself shut out of another closed world – journalism – in which self-regulation manifestly doesn’t work. Here’s a profession that looks like a conspiracy against the laity even from the inside.

The incomprehension with which science and humanities students regard each other is a tragedy of lost opportunities. Early specialisation might allow us to compete in the ever more specialised labour market, but it equips us for nothing else. As Professor Don Nutbeam, the vice-chancellor of Southampton University, complains: “Young people learn more and more about less and less.”

We are deprived by our stupid schooling system of most of the wonders of the world, of the skills and knowledge required to navigate it, above all of the ability to understand each other. Our narrow, antiquated education is forcing us apart like the characters in a Francis Bacon painting, each locked in our boxes, unable to communicate.

>Should geoengineering tests be governed by the principles of medical ethics?

>
Rules for Planet Hackers

By Eli Kintisch
Thu Apr. 22, 2010 1:00 AM PDT

[Image: Flickr/indigoprime (Creative Commons)]

Nearly 200 scientists from 14 countries met last month at the famed Asilomar retreat center outside Monterey, California, in a very deliberate bid to make history. Their five-day meeting focused on setting up voluntary ground rules for research into giant algae blooms, cloud-brightening, and other massive-scale interventions to cool the planet. It’s unclear how significant the meeting will turn out to be, but the intent of its organizers was unmistakable: By choosing Asilomar, they hoped to summon the spirit of a groundbreaking meeting of biologists that took place on the same site in 1975. Back then, scientists with bushy sideburns and split collars—the forefathers of the molecular revolution, it turned out—established principles for the safe and ethical study of deadly pathogens.

The planners of Asilomar II, as they called it, hoped to accomplish much the same for potentially dangerous experiments in geoengineering. Instead of devising new medical treatments for people, the scientists involved in planet-hacking research are after novel ways to treat the Earth. The analogy of global warming to a curable disease was central to the discussions at the meeting. Climate scientist Steve Schneider of Stanford talked about administering “planetary methadone to get over our carbon addiction.” Others debated what “doses” of geoengineering would be necessary. Most crucially, the thinkers at Asilomar focused on the idea that medical ethics might provide a framework for balancing the risks and benefits of all this new research.

What would it mean to apply the established principles of biomedical research to the nascent field of geoengineering? The ethicists at Asilomar—particularly David Winickoff from Berkeley and David Morrow from the University of Chicago—began with three pillars laid out in the landmark 1979 Belmont Report. The first, respect for persons, says that biomedical scientists should obtain “informed consent” from their test subjects. The second, beneficence, requires that scientists assess the risks and benefits of a given test before they start. The third, justice, invokes the rights of research subjects to whatever medical advances result from the testing. (The people who are placed at risk should be the same ones who might benefit from a successful result.)

Then Winickoff and Morrow proposed applying the Belmont principles to the study of the most aggressive forms of geoengineering—the ones that would block the sun, like a volcanic eruption does, with a spray of sulfur or other particles into the stratosphere. Before we could embark on a radical intervention like that, we’d need to run smaller-scale tests that might themselves pose a risk to the environment. In much the way that a clinical drug trial might produce adverse reactions, so might a real-world trial of, say, the Pinatubo Option. Instead of causing organ failure or death in its subjects, a botched course of geoengineering might damage the ozone layer or reduce rainfall.

The problem, admitted the ethicists, is how to go about applying the Belmont rules outside of medicine. In clinical drug trials, researchers obtain consent from individuals, and they can precisely define the worst-case outcome (like death). But a trial run of hazing up the stratosphere wouldn’t affect specific, identifiable people in any one town, city, or state. The climate is interconnected in many ways, some still mysterious to scientists, and so the risks of even a small-scale test in a particular location might apply across the globe. If everyone on Earth could be affected, how do you figure out whom to ask for informed consent?

One possibility would be to require that all nations of the world agree ahead of time on any tests of consequence. To many gathered at Asilomar, however, this seemed naive; speakers repeatedly invoked the failure of all-inclusive talks to cut global carbon emissions, and it would presumably be much tougher to secure an agreement on work that might damage crop yields or open a hole in the ozone. A more pragmatic approach would be to set up something like a United Nations Planet Hacking Security Council, comprising 15 or so powerful nations whose oversight of research tests would take into account the concerns of a broad swath of countries. But that undemocratic approach would surely face howls of protest.

The principle of beneficence may be just as difficult to follow. Under the Belmont guidelines, doctors must balance the particular risks of a clinical trial with the potential benefit to any individual who might participate. Since it would be impossible to make such a calculation for every person on Earth, planet hackers could at best choose the experiments that minimize harm to the most vulnerable communities—like people living on the coasts of Southeast Asia. But we may not know enough about the risks of geoengineering to make any such credible calculation when the time comes. Consider the Pinatubo Option, by which scientists would mimic the cooling effect of volcanoes. Putting particles in the stratosphere could reduce the total amount of energy that strikes the Earth. Some climate modelers say this would disrupt rainfall by reducing moisture in the atmosphere obtained by evaporation. Others say that geoengineering’s droughts and famines would be less harmful than those caused by unchecked warming. Right now, no one can agree on the nature of the risks, let alone the degree to which they would apply to particular communities.
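The magnitudes at stake can be sketched with the simplest zero-dimensional energy-balance model. This is a minimal illustration using textbook constants; the +0.005 albedo perturbation below is an invented stand-in for a "small" geoengineering test, not a figure from the meeting.

```python
# Zero-dimensional energy balance: a planet that absorbs (1 - albedo) of the
# incoming sunlight and radiates as a black body settles at the effective
# temperature T = ((1 - albedo) * S / (4 * sigma)) ** 0.25.
S = 1361.0       # solar constant, W/m^2
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def effective_temperature(albedo: float) -> float:
    return ((1.0 - albedo) * S / (4.0 * SIGMA)) ** 0.25

t_baseline = effective_temperature(0.300)   # ~255 K for Earth's present albedo
t_perturbed = effective_temperature(0.305)  # hypothetical +0.005 from aerosols

print(f"baseline:  {t_baseline:.2f} K")
print(f"perturbed: {t_perturbed:.2f} K")
print(f"cooling:   {t_baseline - t_perturbed:.2f} K")  # roughly half a degree
```

Even this crude model shows why the stakes are global: a half-percentage-point change in how much sunlight the planet reflects shifts the whole system by about half a degree, and nothing in the model says where that cooling, or its side effects on rainfall, would land.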

And what about justice? Among the disruptions that could result from testing the Pinatubo Option is a weakening of the Asian monsoon, a source of water for hundreds of millions of people in India. Those in developing countries will “eat the risk” of geoengineering trials, shouted one of the Asilomar speakers. If representatives from just a small set of countries were appointed as doctors to the planet, then the less powerful nations might end up as the world’s guinea pigs. Of course, the citizens of those nations also would seem to have the most to lose from uninterrupted global warming. These two dangers would have to be measured one against the other—and compensation as part of the experimental program could be one way of making tests more fair.

If medical ethics aren’t quite up to the task of guiding our forays into geoengineering, what other sort of principles should we keep in mind? One important danger to be aware of is the moral hazard that might come with successful trials. That’s the idea that protective circumstances or actions can encourage people to take undue risks—government insurance of banks led to risky investments that caused the savings-and-loan crisis in the 1980s, for example. Moral hazard looms particularly large for geoengineering studies since medium-scale field tests could prematurely give us the sense that we have a low-cost technical fix for global warming, no emissions cuts needed. (Moral hazard isn’t quite as potent in medical research. The availability of cholesterol-lowering drugs may well discourage people from maintaining healthy diets, but it’s unlikely that mere clinical trials would have the same effect.)

Another ethical principle that might apply to geoengineering is minimization—the idea that, a priori, it’s better to tinker at the smallest possible scale necessary to answer vital scientific questions. This notion comes from the ethics of animal experimentation; now we might apply it to planetary systems and the environment more broadly. Up until now, the medical ethics frame for geoengineering has guided discussions of how geoengineering might affect people in various countries. Perhaps we should be talking about how it affects the planet itself.

By that token, we might gain something by thinking of the Earth as a patient on its own terms. The rules and regulations we come up with for tests of geoengineering should take into account the way those experiments might affect ecosystems and nonhuman animals, both under threat from warming. And so maybe the most famous piece of medical ethics ought to apply: the Hippocratic Oath. “First, do no harm” is the crux of the original, but an updated version exhorts doctors to avoid “the twin traps of overtreatment and therapeutic nihilism.” The climate crisis may force us to act despite myriad ethical challenges, for our benefit and for the planet’s.

This piece was produced by Slate as part of the Climate Desk collaboration.

Eli Kintisch is a reporter at Science and author of a new book on geoengineering, Hack the Planet.

>Marcelo Leite: Murky waters (FSP)

>
“Prejudice, stridency, fallacies, fabrications and statistics, for that matter, turn the whole of public debate into an Amazon basin of turbidity. This is not exclusive to the indigenous question. Take the Belo Monte hydroelectric dam. Or the explosive issue of land availability for agribusiness”

Marcelo Leite
Folha de S.Paulo, 09/05/2010 – reprinted in Jornal da Ciência (JC e-mail 4006)

By one of those symptomatic coincidences the times produce, two sentences that open the cover story of the current edition of the Mais! section – “In Brazil everyone is an Indian, except those who aren't” and “An Indian is whoever holds their own as one” – are at the center of a row between their author, the anthropologist Eduardo Viveiros de Castro, and the magazine Veja.

The opening was written before the quarrel, but that hardly matters. If it, and the whole piece on indigenous education, are read as a statement of position, so much the better.

In any case, it is instructive to read the magazine story that started it all, as well as the replies and rejoinders that followed. It offers a glimpse of the depth of the anti-indigenous prejudice and the journalistic stridency that muddy this strand of debate in the country.

Prejudice, stridency, fallacies, fabrications and statistics, for that matter, turn the whole of public debate into an Amazon basin of turbidity. This is not exclusive to the indigenous question. Take the Belo Monte hydroelectric dam. Or the explosive issue of land availability for agribusiness, the epicenter of the offending Veja story.

“Ecological preservation areas, indigenous reserves and supposed former quilombos today cover 77.6% of Brazil's land area,” its authors state, without citing a source. “If the count also includes agrarian-reform settlements, cities, ports, roads and other infrastructure works, the total reaches 90.6% of the national territory.”

The omitted source is probably the study “Territorial Reach of Environmental and Indigenist Legislation” (“Alcance Territorial da Legislação Ambiental e Indigenista”), commissioned from Embrapa Satellite Monitoring by the Office of the President and embraced by the National Confederation of Agriculture and Livestock of Brazil (CNA, read Senator Kátia Abreu, DEM-TO). Its coordinator was the then head of the Embrapa unit, Evaristo Eduardo de Miranda. The estimate ended up under fire from several specialists, including at the National Institute for Space Research (Inpe).

This week, thanks to reporters Afra Balazina and Andrea Vialli, yet another survey contradicting the alarming projection came to light. The new study was carried out by Gerd Sparovek, of the Luiz de Queiroz College of Agriculture (Esalq-USP), in collaboration with Chalmers University (Sweden).

By Miranda's reckoning, if all environmental, land and indigenist legislation were complied with to the letter, Brazil would fall 334,000 km² short – 4% of its territory – of satisfying every requirement. The deficit amounts to almost one Mato Grosso do Sul.

By Sparovek's, even with full compliance with the Forest Code now under bombardment from the ruralist lobby, 1 million km² would still be left over, plus 600,000 km² of low-productivity pasture used for extensive cattle ranching (one head per hectare). That makes a surplus of 4.5 Mato Grosso do Suls.
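The columnist's shorthand checks out arithmetically, assuming the commonly cited areas of roughly 8.5 million km² for Brazil and 357,000 km² for Mato Grosso do Sul (MS):

```latex
% Deficit and surplus as fractions of Brazil and of Mato Grosso do Sul (MS):
\frac{334\,000}{8\,515\,000} \approx 3.9\% \approx 4\%,
\qquad
\frac{334\,000}{357\,000} \approx 0.94 \;(\text{almost one MS}),
\qquad
\frac{1\,000\,000 + 600\,000}{357\,000} \approx 4.5 \;(\text{MS of surplus})
```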

The abysmal gap between the figures should be enough to give pause to anyone who believes in scientific neutrality, or invokes it. Differing premises, interpretations of the law and data sources no doubt explain the divide.

But who examines them in depth, engaging with the merits and drawing conclusions useful for informing the public and for decision-making? Brazil lacks people and institutions with the authority to skim off the foam and debris, clarifying the waters enough to let us see the bottom. Of bloggers and buccaneers we already have plenty.

>SBPC: Veja magazine's irresponsible journalism

>
In an official statement, SBPC repudiates Veja report

Jornal da Ciência – JC e-mail 4007, 11 May 2010

The report deals with the demarcation of indigenous lands and is accused of distorting information

Entitled “A farra da antropologia oportunista” (“The spree of opportunistic anthropology”), the report was published in the 5 May edition of the weekly magazine. The text had already been the subject of a statement by the Brazilian Anthropology Association (ABA). Read the ABA statement at http://www.jornaldaciencia.org.br/Detalhe.jsp?id=70689.

On Sunday, journalist Marcelo Leite's column in the Mais! section of Folha de S.Paulo also addressed the controversial report and the reaction of members of the anthropological scientific community. Read the column at http://www.jornaldaciencia.org.br/Detalhe.jsp?id=70771

The Veja report can be read in the magazine's digital archive, at http://www.veja.com.br/acervodigital/home.aspx

The full text of the SBPC statement follows:

“The Brazilian Society for the Advancement of Science (SBPC) comes before the public to express its full solidarity with its affiliate, the Brazilian Anthropology Association (ABA), which, in statements by its board and by its Commission on Indigenous Affairs, has utterly repudiated the story published by Veja magazine in its edition of 5 May of this year, entitled “The Spree of Opportunistic Anthropology”.

It also notes that the story in question has been met with revulsion by scientists and researchers from various fields of knowledge, who point to precedents of irresponsible journalism on the part of the same magazine, marking a movement of indignation that extends across the national scientific community as a whole.

Furthermore, the fabrication of statements, the ironic and prejudiced treatment of indigenous and quilombola populations, and the use of untrue data demonstrate the practice of irresponsible journalism, incite prejudiced attitudes, reveal a total lack of consideration for professional anthropologists – whose work greatly honors the Brazilian scientific community as a whole – and show a profound and inconceivable disrespect for marginalized communities and their right to seek their own paths.

All of this runs counter to the strengthening of democracy and social justice among us, and to the building of a society that truly draws nourishment and pride from its cultural diversity.

Additionally, the SBPC declares itself ready to accompany the ABA in whatever measures it deems appropriate in the legal sphere, and to carry its repudiation to the 4th National Conference on Science, Technology and Innovation, to be held at the end of this month of May in Brasília.”

>History and archaeology of the digital world

>
Archaeological site

25 April 2010 – 5:40 pm
By Heloisa Lupinacci
Estadão.com.br

In 2001, the scientist Joseph Miller asked NASA for data collected by the Viking probe on Mars in the 1970s. NASA found the tapes, but the data recorded on them could not be opened. The software that read them no longer existed and, as Miller told the Reuters news agency at the time, the technicians who knew the format were all dead.

That is one story. There are many others. Part of the knowledge produced digitally is already gone, from scientific data to internet fads. “We have few services for preserving the history of digital culture, and a great deal of content has already been lost over recent years,” says Roberto Taddei, coordinator of the International Symposium on Public Policies for Digital Collections, which will discuss the subject in São Paulo from today until Thursday.

Decipher me or… While a parchment can be unrolled by anyone, a punched card, the great-grandfather of the floppy disk, does not give up its contents easily. Carlos Augusto Ditadi, of the Technical Chamber on Electronic Documents of the National Archives Council (CONARQ, which already looks after part of the country's digital heritage), sums up the tangle: “the disk depends on the drive, which depends on the computer, which depends on the software, which depends on the operating system: this is called interdependence.” Add to that equation the fact that computers become obsolete, programs are discontinued and languages fall into disuse, and the punched card becomes as indecipherable as Egyptian hieroglyphs.

Concern about digital heritage is recent. In 2002, Unesco presented the Charter on the Preservation of the Digital Heritage. The document states: “Many of these resources have lasting value and significance, and therefore constitute a heritage that should be protected.” The organization created the E-Heritage program, devoted above all to raising awareness among governments and training archivists. It is a good start, but digital heritage poses obstacles all of its own.

Interdependence is one of them. And, in this case, one of the best solutions emerged in a way that is typical of the web: from users. “The first generation of gamers realized, in the 1990s, that they no longer had access to the games of their childhood. They were the first to use emulators, which had always existed, as preservation tools. Thanks to them there are emulators for almost every computing platform,” says Andreas Lange, director of the Computer Games Museum in Berlin, which works to keep games from disappearing. An emulator is a program that recreates a computing environment: extinct software, consoles no longer manufactured, and so on.
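To make the idea concrete, here is a toy sketch of the loop at the heart of every emulator: software standing in for vanished hardware by interpreting the old machine's instructions one at a time. The three-instruction machine below is invented for the example; real emulators for consoles or punch-card-era systems apply the same fetch-decode-execute pattern at scale.

```python
# A toy emulator: an interpreter for an imaginary, long-extinct machine.
LOAD, ADD, PRINT = 0, 1, 2  # the invented machine's three opcodes

def run(program):
    accumulator = 0
    pc = 0                             # program counter
    while pc < len(program):
        opcode, operand = program[pc]  # fetch
        if opcode == LOAD:             # decode and execute
            accumulator = operand
        elif opcode == ADD:
            accumulator += operand
        elif opcode == PRINT:
            print(accumulator)
        pc += 1                        # step to the next instruction

# A "preserved" program: without the emulator, these numbers are dead data.
run([(LOAD, 40), (ADD, 2), (PRINT, 0)])  # prints 42
```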

…I'll devour you. Another obvious challenge is volume. In 2009, according to the research firm IDC, humanity produced 750 billion GB of information. How to choose what to preserve? “We make no selection. We try to make the most exhaustive record possible. We archive everything we find under the .pt domain,” says Daniel Gomes, coordinator of the Portuguese Web Archive project. The Unesco charter suggests: “The main criteria should be significance and lasting cultural or scientific value. ‘Born digital’ materials should be given priority.”

Once what to keep has been decided, there remains how to keep it, and how to pay for it. Neither question is simple. According to Ditadi, websites are among the hardest things to preserve. “A site must remain navigable, but how do you guarantee the links? And they lead to material protected by copyright. It is a very dynamic record.” Storage is also expensive. First a copy is made in the native format, called the witness copy, which guarantees that the document is authentic. Then a preservation version is produced, in a more durable file format – almost always an open format based on free software. Finally the access copy is recorded, the one made available for consultation. Multiply everything, therefore, by three.
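A minimal sketch of that three-copy workflow, with invented file names and a placeholder where a real format conversion would go:

```python
# Witness, preservation and access copies of one document, as described above.
import hashlib
import shutil
from pathlib import Path

def archive(original: Path, vault: Path) -> dict:
    vault.mkdir(parents=True, exist_ok=True)

    # 1. Witness copy: byte-for-byte in the native format, with a fixity
    #    checksum that later proves the archived document is the real one.
    witness = vault / f"witness_{original.name}"
    shutil.copy2(original, witness)
    sha256 = hashlib.sha256(witness.read_bytes()).hexdigest()

    # 2. Preservation copy: in practice this step converts the file to a
    #    durable open format (plain text, TIFF, PDF/A); here it is a plain copy.
    preservation = vault / f"preservation_{original.name}"
    shutil.copy2(original, preservation)

    # 3. Access copy: the version actually served to the public.
    access = vault / f"access_{original.name}"
    shutil.copy2(original, access)

    return {"witness": str(witness), "sha256": sha256,
            "preservation": str(preservation), "access": str(access)}
```

Three files per document is exactly the “multiply everything by three” of the paragraph above, and the checksum is what lets the witness copy serve as evidence of authenticity.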

For these and other reasons, the web's memory is often preserved precisely by those who feed it. Once again, the user. Nothing new there, though. “Many libraries were assembled by users and later donated to institutions or libraries,” Taddei recalls.

* Visit the landmarks: A partnership between Google and Unesco will put all 890 sites listed by the organization on Google Earth and Google Maps. Street View allows a ‘visit’ to 19 World Heritage sites.

* Historical and natural: Unesco divides World Heritage into two categories, historical heritage and natural heritage. There are mixed sites, listed under both, such as Ibiza, in Spain.

* Heritage in Brazil: There are three levels of heritage listing in Brazil: municipal (in the city of São Paulo, Conpresp), state (in the state of São Paulo, Condephaat) and federal, under Iphan.

* A library of tweets: The digital media coordinator of the US Library of Congress, which will archive tweets, discusses the preservation of digital information at the institution.

* A library of everything: The Wayback Machine, the part of the Internet Archive that stores websites, was incorporated into the Library of Alexandria after a partnership with E-Heritage. To browse the collection, go to Archive.org.

* All together now: Comparing the demise of Geocities to the Taliban's destruction of the Buddhas of Bamyan in Afghanistan, the Dutchman Jacques Matthei is calling on everyone to help rescue it at his Reocities.com