Category archive: technocracy

>Can a group of scientists in California end the war on climate change? (Guardian)

>
The Berkeley Earth project say they are about to reveal the definitive truth about global warming

Ian Sample
guardian.co.uk
Sunday 27 February 2011 20.29 GMT

Richard Muller of the Berkeley Earth project is convinced his approach will lead to a better assessment of how much the world is warming. Photograph: Dan Tuffs for the Guardian

In 1964, Richard Muller, a 20-year-old graduate student with neatly cropped hair, walked into Sproul Hall at the University of California, Berkeley, and joined a mass protest of unprecedented scale. The activists, a few thousand strong, demanded that the university lift a ban on free speech and ease restrictions on academic freedom, while outside on the steps a young folk-singer called Joan Baez led supporters in a chorus of We Shall Overcome. The sit-in ended two days later when police stormed the building in the early hours and arrested hundreds of students. Muller was thrown into Oakland jail. The heavy-handedness sparked further unrest and, a month later, the university administration backed down. The protest was a pivotal moment for the civil liberties movement and marked Berkeley as a haven of free thinking and fierce independence.

Today, Muller is still on the Berkeley campus, probably the only member of the free speech movement arrested that night to end up with a faculty position there – as a professor of physics. His list of publications is testament to the free rein of tenure: he worked on the first light from the big bang, proposed a new theory of ice ages, and found evidence for an upturn in impact craters on the moon. His expertise is highly sought after. For more than 30 years, he was a member of the independent Jason group that advises the US government on defence; his college lecture series, Physics for Future Presidents, was voted best class on campus, went stratospheric on YouTube and, in 2009, was turned into a bestseller.

For the past year, Muller has kept a low profile, working quietly on a new project with a team of academics hand-picked for their skills. They meet on campus regularly, to check progress, thrash out problems and hunt for oversights that might undermine their work. And for good reason. When Muller and his team go public with their findings in a few weeks, they will be muscling in on the ugliest and most hard-fought debate of modern times.

Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking is easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet’s temperature.

Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller’s dream seem so ambitious, or perhaps, so naive.

“We are bringing the spirit of science back to a subject that has become too argumentative and too contentious,” Muller says, over a cup of tea. “We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find.” Why does Muller feel compelled to shake up the world of climate change? “We are doing this because it is the most important project in the world today. Nothing else comes close,” he says.

Muller is moving into crowded territory with sharp elbows. There are already three heavyweight groups that could be considered the official keepers of the world’s climate data. Each publishes its own figures that feed into the UN’s Intergovernmental Panel on Climate Change. Nasa’s Goddard Institute for Space Studies in New York City produces a rolling estimate of the world’s warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth’s mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75C.

You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.

Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia’s Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller’s nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. “With CRU’s credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics,” says Muller.

This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. “Scientists will jump to the defence of alarmists because they don’t recognise that the alarmists are exaggerating,” Muller says.

The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. “You can think of statisticians as the keepers of the scientific method,” Brillinger told me. “Can scientists and doctors reasonably draw the conclusions they are setting down? That’s what we’re here for.”

For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious “dark energy” that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.

Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde’s achievement to Hercules’s enormous task of cleaning the Augean stables.
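The merging step described here is easy to picture with a toy example. The sketch below is purely illustrative and assumes made-up field names and a trivial conflict check; it is not the Berkeley Earth code, only the general idea of folding overlapping archives into one de-duplicated record per station.

```python
# Illustrative sketch (not the Berkeley Earth software): merging overlapping
# station records pulled from several source archives into one series.
from collections import defaultdict

def merge_station_records(records):
    """records: iterable of dicts with hypothetical fields
    'station_id', 'year', 'month', 'temp_c', 'source'."""
    merged = defaultdict(dict)  # station_id -> {(year, month): temp_c}
    for rec in records:
        key = (rec["year"], rec["month"])
        series = merged[rec["station_id"]]
        # De-duplicate: several archives often republish the same reading;
        # keep the first value seen and flag disagreements for review.
        if key in series and abs(series[key] - rec["temp_c"]) > 0.1:
            print(f"conflict at {rec['station_id']} {key}: "
                  f"{series[key]} vs {rec['temp_c']} ({rec['source']})")
        series.setdefault(key, rec["temp_c"])
    return merged

# Example: two archives report the same station-month once each.
records = [
    {"station_id": "USW00023234", "year": 1998, "month": 7,
     "temp_c": 17.2, "source": "GHCN"},
    {"station_id": "USW00023234", "year": 1998, "month": 7,
     "temp_c": 17.2, "source": "USHCN"},
]
print(merge_station_records(records))
```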

The wealth of data Rohde has collected so far – some of it dating back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, of itself, Muller claims, be a priceless resource for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.

Publishing an extensive set of temperature records is the first goal of Muller’s project. The second is to turn this vast haul of data into an assessment of global warming. Here, the Berkeley team is going its own way again. The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperature records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller’s team will take temperature records from individual stations and weight them according to how reliable they are.
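The difference between the two approaches can be shown in a few lines. Below is a minimal illustration, with invented anomaly values and reliability scores, of an unweighted grid-cell average versus a reliability-weighted station average; the real Berkeley analysis is far more sophisticated.

```python
import numpy as np

def grid_cell_mean(temps):
    """Conventional approach: unweighted mean of all station anomalies
    falling inside one grid cell for a given month."""
    return float(np.mean(temps))

def weighted_station_mean(temps, reliability):
    """Berkeley-style idea (simplified): weight each station by an
    estimate of its reliability instead of averaging per grid cell."""
    w = np.asarray(reliability, dtype=float)
    t = np.asarray(temps, dtype=float)
    return float(np.sum(w * t) / np.sum(w))

anomalies = [0.4, 0.6, 1.5]      # deg C; the third station is an outlier
reliability = [1.0, 1.0, 0.2]    # hypothetical quality scores
print(grid_cell_mean(anomalies))                       # ~0.83
print(weighted_station_mean(anomalies, reliability))   # ~0.59
```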

This is where the Berkeley group faces its toughest task by far and it will be judged on how well it deals with it. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.

Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in Celsius instead of Fahrenheit. The times at which measurements are taken vary, from say 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station from wind and sun more in one year than the next. Each of these interferes with a station’s temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.

This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn’t rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.
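As a rough illustration of what “automatically correcting” such errors can mean, the toy routine below detects and removes a single step change of the kind a station move or instrument swap might introduce. It is a sketch under simplifying assumptions, not the Berkeley Earth algorithm.

```python
import numpy as np

def correct_step_change(series, window=5, threshold=1.0):
    """Toy homogenisation: find the single largest jump between the means
    of two adjacent windows and remove it from the later segment."""
    series = np.asarray(series, dtype=float)
    best_i, best_jump = None, 0.0
    for i in range(window, len(series) - window):
        jump = series[i:i + window].mean() - series[i - window:i].mean()
        if abs(jump) > max(abs(best_jump), threshold):
            best_i, best_jump = i, jump
    if best_i is None:
        return series            # no jump large enough to correct
    adjusted = series.copy()
    adjusted[best_i:] -= best_jump   # shift the later segment back in line
    return adjusted

# Example: a station moved after year 10 and started reading ~2 C warmer.
temps = np.concatenate([15 + np.random.randn(10) * 0.3,
                        17 + np.random.randn(10) * 0.3])
print(correct_step_change(temps))
```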

Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. “I’ve told the team I don’t know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else,” says Muller. “Science has its weaknesses and it doesn’t have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have.”

He will find out soon enough if his hopes to forge a true consensus on climate change are misplaced. It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn’t followed the project either, but said “anything that [Muller] does will be well done”. Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn’t comment.

Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley’s global warming assessment and those from the other groups. “We have enough trouble communicating with the public already,” Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.

Peter Thorne, who left the Met Office’s Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller’s claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. “Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn’t give you much more bang for your buck,” he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.

Despite his reservations, Thorne says climate science stands to benefit from Muller’s project. “We need groups like Berkeley stepping up to the plate and taking this challenge on, because it’s the only way we’re going to move forwards. I wish there were 10 other groups doing this,” he says.

For the time being, Muller’s project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them “without advocacy or agenda”. Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy’s Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a “kingpin of climate science denial”. On this point, Muller says the project has taken money from right and left alike.

No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. “As new kids on the block, I think they will be given a favourable view by people, but I don’t think it will fundamentally change people’s minds,” says Thorne. Brillinger has reservations too. “There are people you are never going to change. They have their beliefs and they’re not going to back away from them.”

Walking across the Berkeley campus, Muller stops outside Sproul Hall, where he was arrested more than 40 years ago. Today, the adjoining plaza is a designated protest spot, where student activists gather to wave banners, set up tables and make speeches on any cause they choose. Does Muller think his latest project will make any difference? “Maybe we’ll find out that what the other groups do is absolutely right, but we’re doing this in a new way. If the only thing we do is allow a consensus to be reached as to what is going on with global warming, a true consensus, not one based on politics, then it will be an enormously valuable achievement.”

>Can Geoengineering Save the World from Global Warming? (Scientific American)

>
Ask the Experts | Energy & Sustainability
Scientific American

Is manipulating Earth’s environment to combat climate change a good idea–and where, exactly, did the idea come from?

By David Biello | February 25, 2011

STARFISH PRIME: This nighttime atmospheric nuclear weapons test generated an aurora (pictured) in Earth’s magnetic field, along with an electromagnetic pulse that blew out streetlights in Honolulu. It is seen as an early instance of geoengineering by science historian James Fleming. Image: Courtesy of US Govt. Defense Threat Reduction Agency

As efforts to combat climate change falter despite ever-rising concentrations of heat-trapping gases in the atmosphere, some scientists and other experts have begun to consider the possibility of using so-called geoengineering to fix the problem. Such “deliberate, large-scale manipulation of the planetary environment”, as the Royal Society of London puts it, is fraught with peril, of course.

For example, one of the first scientists to predict global warming as a result of increasing concentrations of greenhouse gases in the atmosphere—Swedish chemist Svante Arrhenius—thought this might be a good way to ameliorate the winters of his native land and increase its growing season. Whereas that may come true for the human inhabitants of Scandinavia, polar plants and animals are suffering as sea ice dwindles and temperatures warm even faster than climatologists predicted.

Scientific American corresponded with science historian James Fleming of Colby College in Maine, author of Fixing the Sky: The Checkered History of Weather and Climate Control, about the history of geoengineering—ranging from filling the air with the artificial aftermath of a volcanic eruption to seeding the oceans with iron in order to promote plankton growth—and whether it might save humanity from the ill effects of climate change.

[An edited transcript of the interview follows.]

What is geoengineering in your view?
Geoengineering is planetary-scale intervention [in]—or tinkering with—planetary processes. Period.

As I write in my book, Fixing the Sky: The Checkered History of Weather and Climate Control, “the term ‘geoengineering’ remains largely undefined,” but is loosely, “the intentional large-scale manipulation of the global environment; planetary tinkering; a subset of terraforming or planetary engineering.”

As of June 2010 the term has a draft entry in the Oxford English Dictionary—the modification of the global environment or the climate in order to counter or ameliorate climate change. A 2009 report issued by the Royal Society of London defines geoengineering as “the deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change.”

But there are significant problems with both definitions. First of all, an engineering practice defined by its scale (geo) need not be constrained by its stated purpose (environmental improvement), by any of its currently proposed techniques (stratospheric aerosols, space mirrors, etcetera) or by one of perhaps many stated goals (to ameliorate or counteract climate change). Nuclear engineers, for example, are capable of building both power plants and bombs; mechanical engineers can design components for both ambulances and tanks. So to constrain the essence of something by its stated purpose, techniques or goals is misleading at best.

Geo-scale engineering projects were conducted by both the U.S. and the Soviet Union between 1958 and 1962 that had nothing to do with countering or ameliorating climate change. Starting with the [U.S.’s] 1958 Argus A-bomb explosions in space and ending with the 1962 Starfish Prime H-bomb test, the militaries of both nations sought to modify the global environment for military purposes.

Project Argus was a top-secret military test aimed at detonating atomic bombs in space to generate an artificial radiation belt, disrupt the near-space environment, and possibly intercept enemy missiles. It, and the later tests conducted by both the U.S. and the Soviet Union, peaked with H-bomb detonations in space in 1962 that created an artificial [electro]magnetic [radiation] belt that persisted for 10 years. This is geoengineering.

This idea of detonating bombs in near-space was proposed in 1957 by Nicholas Christofilos, a physicist at Lawrence Berkeley National Laboratory. His hypothesis, which was pursued by the [U.S.] Department of Defense’s Advanced Research Projects Agency [subsequently known as DARPA] and tested in Project Argus and other nuclear shots, held that the debris from a nuclear explosion, mainly highly energetic electrons, would be contained within lines of force in Earth’s magnetic field and would travel almost instantly as a giant current spanning up to half a hemisphere. Thus, if a detonation occurred above a point in the South Atlantic, immense currents would flow along the magnetic lines to a point far to the north, such as Greenland, where they would severely disrupt radio communications. A shot in the Indian Ocean might, then, generate a huge electromagnetic pulse over Moscow. In addition to providing a planetary “energy ray,” Christofilos thought nuclear shots in space might also disrupt military communications, destroy satellites and the electronic guidance systems of enemy [intercontinental ballistic missiles], and possibly kill any military cosmonauts participating in an attack launched from space. He proposed thousands of them to make a space shield.

So nuclear explosions in space by the U.S. and the Soviet Union constituted some of the earliest attempts at geoengineering, or intentional human intervention in planetary-scale processes.

The neologism “geoengineer” refers to one who contrives, designs or invents at the largest planetary scale possible for either military or civilian purposes. Today, geoengineering, as an unpracticed art, may be considered “geoscientific speculation”. Geoengineering is a subset of terraformation, which also does not exist outside of the fantasies of some engineers.

I have recently written to the Oxford English Dictionary asking them to correct their draft definition.

Can geoengineering save the world from climate change?
In short, I think it may be infinitely more dangerous than climate change, largely due to the suspicion and social disruption it would trigger by changing humanity’s relationship to nature.

To take just one example from my book, on page 194: “Sarnoff Predicts Weather Control” read the headline on the front page of The New York Times on October 1, 1946. The previous evening, at his testimonial dinner at the Waldorf Astoria, RCA president Brig. Gen. David Sarnoff had speculated on worthy peaceful projects for the postwar era. Among them were “transformations of deserts into gardens through diversion of ocean currents,” a technique that could also be reversed in time of war to turn fertile lands into deserts, and ordering “rain or sunshine by pressing radio buttons,” an accomplishment that, Sarnoff declared, would require a “World Weather Bureau” in charge of global forecasting and control (much like the “Weather Distributing Administration” proposed in 1938). A commentator in The New Yorker intuited the problems with such control: “Who” in this civil service outfit, he asked, “would decide whether a day was to be sunny, rainy, overcast…or enriched by a stimulating blizzard?” It would be “some befuddled functionary,” probably bedeviled by special interests such as the raincoat and galoshes manufacturers, the beachwear and sunburn lotion industries, and resort owners and farmers. Or if a storm was to be diverted—”Detour it where? Out to sea, to hit some ship with no influence in Washington?”

How old is the idea of geoengineering? What other names has it had?
I can trace geoengineering’s direct modern legacy to 1945, and have prepared a table of such proposals and efforts for the [Government Accountability Office]. Nuclear weapons, digital computers and satellites seem to be the modern technologies of choice. Geoengineering has also been called terraformation and, more restrictively, climate engineering, climate intervention or climate modification. Many have proposed abandoning the term geoengineering in favor of solar radiation management and carbon (or carbon dioxide) capture and storage. Of course, the idea of control of nature is ancient—for example, Phaeton or Archimedes.

Phaeton, the son of Helios, received permission from his father [the Greek sun god] to drive the sun chariot, but failed to control it, putting the Earth in danger of burning up. He was killed by a thunderbolt from Zeus to prevent further disaster. Recently, a prominent meteorologist has written about climate control and urged us to “take up Phaeton’s reins,” which is not a good idea.

Archimedes is known as an engineer who said: “Give me a lever long enough and a place to stand, and I will move the Earth.” Some geoengineers think that this is now possible and that science and technology have given us an Archimedean set of levers with which to move the planet. But I ask: “Where will it roll if you tip it?”

How are weather control and climate control related?
Weather and climate are intimately related: Weather is the state of the atmosphere at a given place and time, while climate is the aggregate of weather conditions over time. A vast body of scientific literature addresses these interactions. In addition, historians are revisiting the ancient but elusive term klima, seeking to recover its multiple social connotations. Weather, climate and the climate of opinion matter in complex ways that invite—some might say require or demand—the attention of both scientists and historians. Yet some may wonder how weather and climate are interrelated rather than distinct. Both, for example, are at the center of the debate over greenhouse warming and hurricane intensity. A few may claim that rainmaking, for example, has nothing to do with climate engineering, but any intervention in the Earth’s radiation or heat budget (such as managing solar radiation) would affect the general circulation and thus the location of upper-level patterns, including the jet stream and storm tracks. Thus, the weather itself would be changed by such manipulation. Conversely, intervening in severe storms by changing their intensity or their tracks or modifying weather on a scale as large as a region, a continent or the Pacific Basin would obviously affect cloudiness, temperature and precipitation patterns with major consequences for monsoonal flows, and ultimately the general circulation. If repeated systematically, such interventions would influence the overall heat budget and the climate.

Both weather and climate control have long and checkered histories: My book explains [meteorologist] James Espy’s proposal in the 1830s to set fire to the crest of the Appalachian Mountains every Sunday evening to generate heated updrafts that would stimulate rain and clear the air for cities of the east coast. It also examines efforts to fire cannons at the clouds in the arid Southwest in the hope of generating rain by concussion.

In the 1920s airplanes loaded with electrified sand were piloted by military aviators who “attacked” the clouds in futile attempts to both make rain and clear fog. Many others have proposed either a world weather control agency or creating a global thermostat, either by burning vast quantities of fossil fuels if an ice age threatened or sucking the CO2 out of the air if the world overheated.

After 1945 three technologies—nuclear weapons, digital computers and satellites—dominated discussions about ultimate weather and climate control, but with very little acknowledgement that unintended consequences and social disruption may be more damaging than any presumed benefit.

What would be the ideal role for geoengineering in addressing climate change?
That it generates interest in and awareness of the impossibility of heavy-handed intervention in the climate system, since there could be no predictable outcome of such intervention, physically, politically or socially.

Why do scientists continue to pursue this then, after 200 or so years of failure?
Science fantasy is informed by science fiction and driven by hubris. One of the dictionary definitions of hubris cites Edward Teller (the godfather of modern geoengineering).

Teller’s hubris knew no bounds. He was the [self-proclaimed] father of the H-bomb and promoted all things atomic, even talking about using nuclear weapons to create canals and harbors. He was also an advocate of urban sprawl to survive nuclear attack, the Star Wars [missile] defense system, and a planetary sunscreen to reduce global warming. He wanted to control nature and improve it using technology.

Throughout history rainmakers and climate engineers have typically fallen into two categories: commercial charlatans using technical language and proprietary techniques to cash in on a gullible public, and sincere but deluded scientific practitioners exhibiting a modicum of chemical and physical knowledge, a bare minimum of atmospheric insight, and an abundance of hubris. We should base our decision-making not on what we think we can do “now” and in the near future. Rather, our knowledge is shaped by what we have and have not done in the past. Such are the grounds for making informed decisions and avoiding the pitfalls of rushing forward, claiming we know how to “fix the sky.”

>What we have and haven’t learned from ‘Climategate’

>
DON’T KNOW MUCH AGNOTOLOGY

Grist.org
BY David Roberts
28 FEB 2011 1:29 PM

I wrote about the “Climategate” controversy (over emails stolen from the University of East Anglia’s Climatic Research Unit) once, which is about what it warranted.

My silent protest had no effect whatsoever, of course, and the story followed a depressingly familiar trajectory: hyped relentlessly by right-wing media, bullied into the mainstream press as he-said she-said, and later, long after the damage is done, revealed as utterly bereft of substance. It’s a familiar script for climate faux controversies, though this one played out on a slightly grander scale.

Investigations galore

Consider that there have now been five, count ‘em five, inquiries into the matter. Penn State established an independent inquiry into the accusations against scientist Michael Mann and found “no credible evidence” [PDF] of improper research conduct. A British government investigation run by the House of Commons’ Science and Technology Committee found that while the CRU scientists could have been more transparent and responsive to freedom-of-information requests, there was no evidence of scientific misconduct. The U.K.’s Royal Society (its equivalent of the National Academies) ran an investigation that found “no evidence of any deliberate scientific malpractice.” The University of East Anglia appointed respected civil servant Sir Muir Russell to run an exhaustive, six-month independent inquiry; he concluded that “the honesty and rigour of CRU as scientists are not in doubt … We have not found any evidence of behaviour that might undermine the conclusions of the IPCC assessments.”

All those results are suggestive, but let’s face it, they’re mostly … British. Sen. James Inhofe (R-Okla.) wanted an American investigation of all the American scientists involved in these purported dirty deeds. So he asked the Department of Commerce’s inspector general to get to the bottom of it. On Feb. 18, the results of that investigation were released. “In our review of the CRU emails,” the IG’s office said in its letter to Inhofe [PDF], “we did not find any evidence that NOAA inappropriately manipulated data … or failed to adhere to appropriate peer review procedures.” (Oddly, you’ll find no mention of this central result in Inhofe’s tortured public response.)

Whatever legitimate issues there may be about the responsiveness or transparency of this particular group of scientists, there was nothing in this controversy — nothing — that cast even the slightest doubt on the basic findings of climate science. Yet it became a kind of stain on the public image of climate scientists. How did that happen?

Smooth criminals

You don’t hear about it much in the news coverage, but recall, the story began with a crime. Hackers broke into the East Anglia email system and stole emails and documents, an illegal invasion of privacy. Yet according to The Wall Street Journal’s Kim Strassel, the emails “found their way to the internet.” In ABC science correspondent Ned Potter’s telling, the emails “became public.” The New York Times’ Andy Revkin says they were “extracted from computers.”

None of those phrasings are wrong, per se, but all pass rather lightly over the fact that some actual person or persons put them on the internet, made them public, extracted them from the computers. Someone hacked in, collected emails, sifted through and selected those that could be most damning, organized them, and timed the release for maximum impact, just before the Copenhagen climate talks. Said person or persons remain uncaught, uncharged, and unprosecuted. There have since been attempted break-ins at other climate research institutions.

If step one was crime, step two was character assassination. When the emails were released, they were combed over by skeptic blogs and right-wing media, who collected sentences, phrases, even individual terms that, when stripped of all context, create the worst possible impression. Altogether the whole thing was as carefully staged as any modern-day political attack ad.

Yet when the “scandal” broke, rather than being about criminal theft and character assassination, it was instantly “Climategate.” It was instantly about climate scientists, not the illegal and dishonest tactics of their attackers. The scientists, not the ideologues and ratf*ckers, had to defend themselves.

Burden of proof

It’s a numbingly familiar pattern in media coverage. The conservative movement that’s been attacking climate science for 20 years has a storied history of demonstrable fabrications, distortions, personal attacks, and nothingburger faux-scandals — not only on climate science, but going back to asbestos, ozone, leaded gasoline, tobacco, you name it. They don’t follow the rigorous standards of professional science; they follow no intellectual or ethical standards whatsoever. Yet no matter how long their record of viciousness and farce, every time the skeptic blogosphere coughs up a new “ZOMG!” it’s as though we start from zero again, like no one has a memory longer than five minutes.

Here’s the basic question: At this point, given their respective accomplishments and standards, wouldn’t it make sense to give scientists the strong benefit of the doubt when they are attacked by ideologues with a history of dishonesty and error? Shouldn’t the threshold for what counts as a “scandal” have been nudged a bit higher?

Agnotological inquiry

The lesson we’ve learned from climategate is simple. It’s the same lesson taught by death panels, socialist government takeover, Sharia law, and Obama’s birth certificate. To understand it we must turn to agnotology, the study of culturally induced ignorance or doubt. (Hat tip to an excellent recent post on this by John Quiggin.)

Beck, Palin, and the rest of Fox News and talk radio operate on the pretense that they are giving consumers access to a hidden “universe of reality,” to use Limbaugh’s term. It’s a reality being actively obscured by the “lamestream media,” academics, scientists, and government officials. Affirming the tenets of that secret reality has become an act of tribal reinforcement, the equivalent of a secret handshake.

The modern right has created a closed epistemic loop containing millions of people. Within that loop, the implausibility or extremity of a claim itself counts as evidence. The more liberal elites reject it, the more it entrenches itself. Standards of evidence have nothing to do with it.

The notion that there is a global conspiracy by professional scientists to falsify results in order to get more research money is, to borrow Quiggin’s words about birtherism, “a shibboleth, that is, an affirmation that marks the speaker as a member of their community or tribe.” Once you have accepted that shibboleth, anything offered to you as evidence of its truth, no matter how ludicrous, will serve as affirmation. (Even a few context-free lines cherry-picked from thousands of private emails.)

Living with the loop

There’s one thing we haven’t learned from climategate (or death panels or birtherism). U.S. politics now contains a large, well-funded, tightly networked, and highly amplified tribe that defines itself through rejection of “lamestream” truth claims and standards of evidence. How should our political culture relate to that tribe?

We haven’t figured it out. Politicians and the political press have tried to accommodate the shibboleths of the right as legitimate positions for debate. The press in particular has practically sworn off plain judgments of accuracy or fact. But all that’s done is confuse and mislead the broader public, while the tribe pushes ever further into extremity. The tribe does not want to be accommodated. It is fueled by elite rejection.

At this point mainstream institutions like the press are in a bind: either accept the tribe’s assertions as legitimate or be deemed “biased.” Until there is a way out of that trap, there will be more and more Climategates.

>Should geoengineering tests be governed by the principles of medical ethics?

>
Rules for Planet Hackers

By Eli Kintisch
Thu Apr. 22, 2010 1:00 AM PDT

[Image: Flickr/indigoprime (Creative Commons)]

Nearly 200 scientists from 14 countries met last month at the famed Asilomar retreat center outside Monterey, California, in a very deliberate bid to make history. Their five-day meeting focused on setting up voluntary ground rules for research into giant algae blooms, cloud-brightening, and other massive-scale interventions to cool the planet. It’s unclear how significant the meeting will turn out to be, but the intent of its organizers was unmistakable: By choosing Asilomar, they hoped to summon the spirit of a groundbreaking meeting of biologists that took place on the same site in 1975. Back then, scientists with bushy sideburns and split collars—the forefathers of the molecular revolution, it turned out—established principles for the safe and ethical study of deadly pathogens.

The planners of Asilomar II, as they called it, hoped to accomplish much the same for potentially dangerous experiments in geoengineering. Instead of devising new medical treatments for people, the scientists involved in planet-hacking research are after novel ways to treat the Earth. The analogy of global warming to a curable disease was central to the discussions at the meeting. Climate scientist Steve Schneider of Stanford talked about administering “planetary methadone to get over our carbon addiction.” Others debated what “doses” of geoengineering would be necessary. Most crucially, the thinkers at Asilomar focused on the idea that medical ethics might provide a framework for balancing the risks and benefits of all this new research.

What would it mean to apply the established principles of biomedical research to the nascent field of geoengineering? The ethicists at Asilomar—particularly David Winickoff from Berkeley and David Morrow from the University of Chicago—began with three pillars laid out in the landmark 1979 Belmont Report. The first, respect for persons, says that biomedical scientists should obtain “informed consent” from their test subjects. The second, beneficence, requires that scientists assess the risks and benefits of a given test before they start. The third, justice, invokes the rights of research subjects to whatever medical advances result from the testing. (The people who are placed at risk should be the same ones who might benefit from a successful result.)

Then Winickoff and Morrow proposed applying the Belmont principles to the study of the most aggressive forms of geoengineering—the ones that would block the sun, like a volcanic eruption does, with a spray of sulfur or other particles into the stratosphere. Before we could embark on a radical intervention like that, we’d need to run smaller-scale tests that might themselves pose a risk to the environment. In much the way that a clinical drug trial might produce adverse reactions, so might a real-world trial of, say, the Pinatubo Option. Instead of causing organ failure or death in its subjects, a botched course of geoengineering might damage the ozone layer or reduce rainfall.

The problem, admitted the ethicists, is how to go about applying the Belmont rules outside of medicine. In clinical drug trials, researchers obtain consent from individuals, and they can precisely define the worst-case outcome (like death). But a trial run of hazing up the stratosphere wouldn’t affect specific, identifiable people in any one town, city, or state. The climate is interconnected in many ways, some still mysterious to scientists, and so the risks of even a small-scale test in a particular location might apply across the globe. If everyone on Earth could be affected, how do you figure out whom to ask for informed consent?

One possibility would be to require that all nations of the world agree ahead of time on any tests of consequence. To many gathered at Asilomar, however, this seemed naive; speakers repeatedly invoked the failure of all-inclusive talks to cut global carbon emissions, and it would presumably be much tougher to secure an agreement on work that might damage crop yields or open a hole in the ozone. A more pragmatic approach would be to set up something like a United Nations Planet Hacking Security Council, comprising 15 or so powerful nations whose oversight of research tests would take into account the concerns of a broad swath of countries. But that undemocratic approach would surely face howls of protest.

The principle of beneficence may be just as difficult to follow. Under the Belmont guidelines, doctors must balance the particular risks of a clinical trial with the potential benefit to any individual who might participate. Since it would be impossible to make such a calculation for every person on Earth, planet hackers could at best choose the experiments that minimize harm to the most vulnerable communities—like people living on the coasts of Southeast Asia. But we may not know enough about the risks of geoengineering to make any such credible calculation when the time comes. Consider the Pinatubo Option, by which scientists would mimic the cooling effect of volcanoes. Putting particles in the stratosphere could reduce the total amount of energy that strikes the Earth. Some climate modelers say this would disrupt rainfall by reducing moisture in the atmosphere obtained by evaporation. Others say that geoengineering’s droughts and famines would be less harmful than those caused by unchecked warming. Right now, no one can agree on the nature of the risks, let alone the degree to which they would apply to particular communities.

And what about justice? Among the disruptions that could result from testing the Pinatubo Option is a weakening of the Asian monsoon, a source of water for hundreds of millions of people in India. Those in developing countries will “eat the risk” of geoengineering trials, shouted one of the Asilomar speakers. If representatives from just a small set of countries were appointed as doctors to the planet, then the less powerful nations might end up as the world’s guinea pigs. Of course, the citizens of those nations also would seem to have the most to lose from uninterrupted global warming. These two dangers would have to be measured one against the other—and compensation as part of the experimental program could be one way of making tests more fair.

If medical ethics aren’t quite up to the task of guiding our forays into geoengineering, what other sort of principles should we keep in mind? One important danger to be aware of is the moral hazard that might come with successful trials. That’s the idea that protective circumstances or actions can encourage people to take undue risks—government insurance of banks led to risky investments that caused the savings-and-loan crisis in the 1980s, for example. Moral hazard looms particularly large for geoengineering studies since medium-scale field tests could prematurely give us the sense that we have a low-cost technical fix for global warming, no emissions cuts needed. (Moral hazard isn’t quite as potent in medical research. The availability of cholesterol-lowering drugs may well discourage people from maintaining healthy diets, but it’s unlikely that mere clinical trials would have the same effect.)

Another ethical principle that might apply to geoengineering is minimization—the idea that, a priori, it’s better to tinker at the smallest possible scale necessary to answer vital scientific questions. This notion comes from the ethics of animal experimentation; now we might apply it to planetary systems and the environment more broadly. Up until now, the medical ethics frame for geoengineering has guided discussions of how geoengineering might affect people in various countries. Perhaps we should be talking about how it affects the planet itself.

By that token, we might gain something by thinking of the Earth as a patient on its own terms. The rules and regulations we come up with for tests of geoengineering should take into account the way those experiments might affect ecosystems and nonhuman animals, both under threat from warming. And so maybe the most famous piece of medical ethics ought to apply: the Hippocratic Oath. “First, do no harm” is the crux of the original, but an updated version exhorts doctors to avoid “the twin traps of overtreatment and therapeutic nihilism.” The climate crisis may force us to act despite myriad ethical challenges, for our benefit and for the planet’s.

This piece was produced by Slate as part of the Climate Desk collaboration.

Eli Kintisch is a reporter at Science and author of a new book on geoengineering, Hack the Planet.

>Municipalities will have free access to software to organize accounts and improve management

>
The tool, called e-cidade, will be available on the Portal do Software Público Brasileiro at the end of October and can be accessed by municipalities free of charge

O Povo – 15 Oct 2009 – 07:37

City governments across the country will have access to free management software that makes it possible to organize spending, the budget, tax revenue, medication inventories, human resources and other services within a single application. The tool, called e-cidade, will be available on the Portal do Software Público Brasileiro at the end of October and can be accessed by municipalities free of charge.

Free access to the software was agreed between the Ministério do Planejamento and the company that created the program. According to Rogério Santana, the ministry’s secretary of Logistics and Information Technology, with free use city governments will be able to adapt the application’s functions to local conditions and exchange experiences with other administrators.

“This will improve the management of resources and the delivery of services to society. It also makes auditing and the presentation of accounts easier,” he says. The tool is currently used by 15 municipalities and will be officially presented during the Encontro Nacional de Tecnologia da Informação para os Municípios Brasileiros, on 27 and 28 October in Brasília.

According to Santana, e-cidade will let mayors record tax revenues, gain a clearer picture of municipal spending on health, education and personnel, and track the progress of public works and the management of property, for example.

The tool also records the authorization, issuance and settlement of budget commitments, integrated with the procurement process and the issuing of invoices.

“Brazilian municipalities are sorely lacking in technological solutions. We have countless municipalities with inefficient management that need help. This technological solution can be an alternative.”

Reducing bureaucracy is another of the application’s advantages, according to Santana. “It will cut down on paperwork. Many trees will be saved as we automate and use electronic processes in place of paper ones,” he added.

e-cidade will be available for download from the Portal do Software Público Brasileiro website from 28 October, at http://www.softwarepublico.gov.br .

Agência Brasil

>Digital media in the exercise of citizenship

>
Agência FAPESP


24/8/2009

By Jussara Mangini

A team of professionals from several areas of computing is developing tools to help governments implement the concept of crossmedia (the combined use of multiple communication channels) in electronic public services, so as to broaden and ease interaction with citizens.

Coordinated by professor Lucia Vilela Leite Filgueiras of the Escola Politécnica at the Universidade de São Paulo (USP), the X-Gov project was created in October 2007 within the Instituto Virtual de Pesquisas FAPESP-Microsoft Research. Its aim is to simplify the production of crossmedia applications by public administrators, so that they can develop services more quickly and benefit from new technologies.

“We believe this will be better for citizens because a diversity of media gives greater reach and availability in access to government. We know it is difficult for a public administrator to create a service that uses the various media in an integrated way. That is why we are developing software that will make this process easier,” Lucia told Agência FAPESP.

Although crossmedia is widely used in advertising and games, the researchers have not yet seen any similar approach in government, either in Brazil or abroad. According to the project coordinator, the work pays close attention to how new generations, more accustomed to digital media, will want to relate to government and exercise their citizenship in the coming years.

“This digital-native generation – which was born with mobile phones, television and the internet – finds it completely natural to work with all of these at the same time. We want to help public administrators build an electronic government for this generation,” she said.

With the new technologies one can, for example, send a photograph from a mobile phone to show that a public work is being poorly executed, or report violations of the law. Likewise, public agencies can replace letters or even e-mail with text messages (SMS) to notify people more quickly about the progress of certain processes.

“The government is starting to use these media, but still in a non-integrated way and with great effort, because each system has to develop everything from scratch. A city government, for example, has to train an IT technician in software development for mobile phones, SMS, digital TV and so on. We want to make this process easier,” Lucia noted.

The X-Gov project will allow transitions between media: e-mail, SMS, digital TV, the web, two-dimensional barcodes, click to call and others. The idea is that the link between mobile phone and TV will let content accessed on the phone – an instructional video, for example – also be displayed on the television, without interrupting what was happening on the phone.

“The click to call transition, in turn, will let a citizen who is browsing the website request help by telephone, simply by clicking a button in the browser that will automatically place a call to the user,” said Lucia.

In developing the server that orchestrates users across the crossmedia experience, the researchers are creating a specific programming language and working on the communication between the system and the public administrator, to make it easier for the latter to design the plan for their service.

Creation process

According to the Poli professor, one of the great advantages for the public administrator is that in X-Gov the question of compatibility has been very well resolved by using a web services architecture, a standard that is well accepted in government and that gives great flexibility in integrating with existing systems.

This type of software is called a framework and serves as a skeleton on which government developers will build their projects.

“Since there are no crossmedia services available in government to analyze, the framework is being built through successive proofs of concept, each focused on a specific piece of functionality, so that each cycle contributes to the evolution of the final architecture,” explained Lucia.

To get there, the team creates a fictitious service, studies how that service works and freezes part of it into the skeleton. These are the so-called task patterns – an abstraction of something everyone does. In e-commerce, for example, a task pattern is the use of a shopping cart. Every site that has one works in more or less the same way.

Electronic government works the same way. Eighteen task patterns were defined, and for each of them the X-Gov team is building software for three communication media: web, mobile phone and digital TV. This means that around 54 components will be orchestrated. Each week the team has been adding roughly three new components to the set.
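A minimal sketch may make the task-pattern idea concrete. The code below is hypothetical, with invented names rather than the actual X-Gov API: one abstract task pattern, rendered differently for web, SMS and digital TV.

```python
# Hypothetical illustration of a "task pattern" with one component per medium.
from abc import ABC, abstractmethod

class TaskPattern(ABC):
    """A reusable government task, e.g. 'track a request's status'."""
    @abstractmethod
    def render(self, medium: str, data: dict) -> str: ...

class TrackRequestStatus(TaskPattern):
    def render(self, medium, data):
        msg = f"Request {data['id']} is {data['status']}"
        if medium == "web":
            return f"<p>{msg}</p>"
        if medium == "sms":
            return msg[:160]                 # SMS length limit
        if medium == "dtv":
            return f"[TV overlay] {msg}"     # digital TV widget
        raise ValueError(f"unsupported medium: {medium}")

# 18 patterns x 3 media = the roughly 54 components the framework orchestrates.
print(TrackRequestStatus().render("sms", {"id": "2009-431", "status": "under review"}))
```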

The framework is expected to be up and running by October, with the building-block components for government tasks and for transitions between media. From then on, the team plans to carry out a round of user testing.

Implementation in practice will depend on the acceptance and interest of public administrators. The X-Gov team expects that use will come very naturally to “digital natives”.

* * *

Isn’t the contrast in tone between this article and the previously posted interview with Martín-Barbero remarkable? It is striking how the idea of “citizenship” is simplified in this text, to the point of being understood as a relationship of communication, generally one-directional, with the State.