Tag archive: Climate

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.
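The comparison Muller describes, judging each candidate driver by how well it fits the temperature curve, can be illustrated with a toy least-squares sketch. Everything below is a hedged illustration on synthetic stand-in series (the CO2, solar and population curves are invented shapes, not the real records), not Berkeley Earth's actual code or data:

```python
# Toy illustration of comparing candidate drivers by least-squares fit.
# All series are synthetic stand-ins, not real climate records.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1850, 2011)
t = years - 1850

# Invented shapes standing in for the real records (assumptions):
co2 = 285 + 0.011 * t**2                         # accelerating rise
solar = 1361 + 0.5 * np.sin(2 * np.pi * t / 11)  # 11-year cycle, no trend
population = 1.2 * np.exp(0.008 * t)             # smooth exponential growth

# Synthetic land-temperature anomaly, driven here by the CO2-like curve:
temp = 0.01 * (co2 - co2[0]) + rng.normal(0, 0.05, t.size)

def rms_residual(predictor, target):
    """Fit target = a*predictor + b by least squares; return RMS residual."""
    A = np.column_stack([predictor, np.ones_like(predictor)])
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    return float(np.sqrt(np.mean((A @ coeffs - target) ** 2)))

fits = {name: rms_residual(series, temp)
        for name, series in [("co2", co2), ("solar", solar),
                             ("population", population)]}
best = min(fits, key=fits.get)  # the curve leaving the smallest residual
```

On this synthetic data the CO2-shaped curve wins by construction; the point is only to show the form of the comparison, in which each candidate explanation is judged by how well a simple mapping of it reproduces the observed temperature curve.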

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.


The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis is now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Climate Change and the Next U.S. Revolution (ZNet)

Thursday, July 26, 2012

The U.S. heat wave is slowly shaking the foundations of American politics. It may take years for the deep rumble to evolve into an above ground, institution-shattering earthquake, but U.S. society has changed for good.

The heat wave has helped convince tens of millions of Americans that climate change is real, overpowering the fake science and right-wing media – funded by corporate cash – to convince Americans otherwise.

Republicans and Democrats alike also erect roadblocks to understanding climate change. The politicians’ complete lack of action on the issue strengthened the “climate change is fake” movement, since Americans presumed that any sane government would be actively trying to address an issue with the potential to destroy civilization.

But working people have finally made up their minds. A recent poll showed that 70 percent of Americans now believe that climate change is real, up from 52 percent in 2010. And a growing number of people are recognizing that the warming of the planet is caused by human activity.

Business Week explains: “A record heat wave, drought and catastrophic wildfires are accomplishing what climate scientists could not: convincing a wide swath of Americans that global temperatures are rising.”

This means that working class families throughout the Midwest and southern states simply don’t believe what their media and politicians are telling them.

It also implies that these millions of Americans are being further politicized in a deeper sense.

Believing that climate change exists implies that you are at least somewhat aware of the massive consequences to humanity if the global economy doesn’t drastically change, and fast.

This awareness has revolutionary implications. As millions of Americans watch the environment destroyed – for their grandchildren or themselves – while politicians do absolutely nothing in response, or make tiny token gestures, a growing number of Americans will demand political alternatives and fight to see them created. The American political system as it exists today cannot cope with this inevitability.

The New York Times explains why: “…the American political system is not ready to agree to a [climate] treaty that would force the United States, over time, to accept profound changes in its energy [coal, oil], transport [trucking and airline industry] and manufacturing [corporate] sectors.”

In short, the U.S. government will not force corporations to make less profit by behaving more eco-friendly. This is the essence of the problem.

In order for humanity to survive climate change, the economy must be radically transformed; massive investments must be made in renewable energy, public transportation, and recycling, while dirty energy sources must be quickly swept into the dustbin of history.

But the economy is currently owned by giant, privately run corporations that will continue destroying the earth if it earns them huge profits, and they make massive “contributions” to political parties to ensure this remains so. It’s becoming increasingly obvious that government inaction on climate change is directly linked to the “special interests” of corporations that dominate these governments.

This fact of U.S. politics is present in every other capitalist country as well, which means that international agreements on reducing greenhouse gases will remain impossible: as each country’s corporations vie for market domination, reducing pollution simply puts them at a competitive disadvantage.

This dynamic has already caused massive delays in the UN’s already inadequate efforts at addressing climate change. The Kyoto climate agreement was the product of years of cooperation and planning among many nations and included legally binding commitments to reduce greenhouse gases. The Bush and Obama administrations helped destroy these efforts.

For example, instead of building upon the foundation of the Kyoto Protocol, the Obama administration demanded a whole new structure, something that would take years to achieve. The Kyoto framework (itself insufficient) was abandoned because it included legally binding agreements and was based on multilateral, agreed-upon reductions of greenhouse gases.

In a Guardian article entitled “US Planning to Weaken Copenhagen Climate Deal,” the Obama administration’s UN position is exposed as it dismisses the Kyoto Protocol by proposing that “…each country set its own rules and to decide unilaterally how to meet its target.”
Obama’s proposal came straight from the mouth of U.S. corporations, who wanted to ensure that there was zero accountability, zero oversight, zero climate progress, and therefore no dent to their profits. Instead of using its massive international leverage for climate justice, the U.S. has used it to promote divisiveness and inaction, to the potential detriment of billions of people globally.

The stakes are too high to hold out any hope that governments will act boldly. The Business Week article below explains the profound changes happening to the climate:

“The average temperature for the U.S. during June was 71.2 degrees Fahrenheit (21.7 Celsius), which is 2 degrees higher than the average for the 20th century, according to the National Oceanic and Atmospheric Administration. The June temperatures made the preceding 12 months the warmest since record-keeping began in 1895, the government agency said.”

Activists who are radicalized by this global problem face a crisis of what to do about it. It is difficult to put forth a positive climate change demand, since the problem is global. Demanding that governments “act boldly” to address climate change hasn’t worked, and lesser demands seem inadequate.

The environmental rights movement continues to go through a variety of phases: individual and small group eco-“terrorism,” causing property damage to environmentally damaging companies; corporate campaigns that target especially bad polluters with high-profile direct action; and massive education programs that have been highly successful, but fall short when it comes to winning change.

Ultimately, climate activists must come face to face with political and corporate power. Corporate-owned governments are the ones with the power to adequately address the climate change issue, and they will not be swayed by good science, common sense, basic decency, or even a torched planet.

Those in power respond only to power, and the only force capable of displacing corporate power is people uniting and acting collectively, as was done in Egypt and Tunisia and as is still developing throughout Europe.

Climate groups cannot view their issue as separate from other groups that are organizing against corporate power. The social movements that have emerged to battle austerity measures are natural allies, as are anti-war and labor activists. The climate solution will inevitably require revolutionary measures, which first requires that alliances and demands are put forward that unite Labor, working people in general, community, and student groups towards collective action.

One possible immediate demand is for environmental activists to unite with Labor groups over a federal jobs program, paid for by taxing the rich, that makes massive investments in jobs that are climate related, such as solar panel production, transportation, building recycling centers, home retro-fitting, etc.

Another demand could be to insist that the government convene the most knowledgeable scientists in the area of clean energy. These scientists should be given all the resources they need in order to collectively create alternative sources of clean energy that would allow for a realistic alternative to the current polluting and toxic sources of energy.

However, any type of immediate demand will meet giant corporate resistance from both political parties. Fighting for a uniting demand will thus strengthen the movement, and for this reason it is important to link climate solutions to the creation of jobs, which are the number one concern of most Americans. This unity will in turn lead allies toward a deeper understanding of the problem, and therefore deeper solutions will emerge that challenge the whole economic structure that is deaf to the needs of humans and the climate and sacrifices everything to the private profit of a few.

Shamus Cooke is a social service worker, trade unionist, and writer for Workers Action (www.workerscompass.org). He can be reached at shamuscooke@gmail.com

http://www.businessweek.com/news/2012-07-18/record-heat-wave-pushes-u-dot-s-dot-belief-in-climate-change-to-70-percent

http://www.nytimes.com/2009/12/13/weekinreview/13broder.html

http://www.guardian.co.uk/environment/2009/sep/15/europe-us-copenhagen

Climate Change Could Open Trade Opportunities for Some Vulnerable Nations (Science Daily)

ScienceDaily (July 26, 2012) — Tanzania is one developing country that could actually benefit from climate change by increasing exports of corn to the U.S. and other nations, according to a study by researchers at Stanford University, the World Bank and Purdue University.

The study, published in the Review of Development Economics, shows the African country better known for safaris and Mt. Kilimanjaro has the potential to substantially increase its maize exports and take advantage of higher commodity prices with a variety of trading partners due to predicted dry and hot weather that could affect those countries’ usual sources for the crop. In years that major consumer countries such as the U.S., China and India are forecast to experience severe dry conditions, Tanzania’s weather will likely be comparatively wet. Similarly, in the relatively few years this century that it is expected to have severe dry weather, Tanzania could import corn from trading partners experiencing better growing conditions.

“This study highlights how government policies can influence the impact that we experience from the climate system,” said study co-author Noah Diffenbaugh, an assistant professor of environmental Earth system science at Stanford’s School of Earth Sciences and a center fellow at the Stanford Woods Institute for the Environment. “Tanzania is a particularly interesting case, as it has the potential to benefit from climate change if climate model predictions of decreasing drought in East Africa prove to be correct, and if trade policies are constructed to take advantage of those new opportunities.”

Tightening restrictions on crop exports during times of climate instability may seem like a logical way to ensure domestic food availability and price stability. In fact, the study warns, trade restrictions such as those that Tanzania has instituted several times in recent years prevent such countries from buffering their poor citizens in bad climate years and from taking advantage of economic opportunities in good climate years.

The study, the most long-range and detailed of its kind to date, uses economic, climatic and agricultural data and computational models to forecast the occurrence of severe dry years during the next nine decades in Tanzania and its key trading partners. The authors began by analyzing historical years in which Tanzania experienced grain surpluses or deficits. They found that a closed trade policy enhanced poverty in both kinds of years, by limiting the ability to offset shortfalls with imports during deficit years and to profit from exports during surplus years.

The authors then attempted to predict how often Tanzania and key trading partners will experience severely dry years in response to continued global warming. Among the predictions: during an average of 96 percent of the years that the U.S. and China are predicted to have extremely dry conditions, Tanzania will not experience similarly dry weather. For India, that percentage increases to 97 percent. Similarly, the study’s climate models suggest that Tanzania is likely to have adequate growing season moisture in most of the years that its key African trading partners experience severe dry weather.

Among Tanzania’s trading partners, the U.S., China, Canada and Russia are most likely to consistently experience adequate growing conditions in years when Tanzania does not. When compared with all of its key trading partners, Tanzania’s dry years during the 21st century will often coincide with non-dry years in the other countries. Having a diverse mix of trading partners could help hedge against a coincidence of severe dry weather within and outside of Africa, the study’s results suggest.

The findings are relevant to grain-growing countries around the world. Those countries stand to profit from exports in years when trading partners are enduring severe dry and/or hot weather. Likewise, they can buffer themselves against bad growing weather at home by importing from grain-rich regions less affected by such weather during that particular year.

“This study highlights the importance of trade in either buffering or exacerbating the effects of climate stresses on the poor,” says Diffenbaugh. “We find that these effects are already taking place in the current climate, and that they could become even more important in the future as the co-occurrence of good and bad years between different regions changes in response to global warming.”

Local Weather Patterns Affect Beliefs About Global Warming (Science Daily)

People living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming. (Credit: © Rafael Ben-Ari / Fotolia)

ScienceDaily (July 25, 2012) — Local weather patterns temporarily influence people’s beliefs about evidence for global warming, according to research by political scientists at New York University and Temple University. Their study, which appears in the Journal of Politics, found that those living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming.

“Global climate change is one of the most important public policy challenges of our time, but it is a complex issue with which Americans have little direct experience,” wrote the study’s co-authors, Patrick Egan of New York University and Megan Mullin of Temple University. “As they try to make sense of this difficult issue, many people use fluctuations in local temperature to reassess their beliefs about the existence of global warming.”

Their study examined five national surveys of American adults sponsored by the Pew Research Center: June, July, and August 2006, January 2007, and April 2008. In each survey, respondents were asked the following question: “From what you’ve read and heard, is there solid evidence that the average temperature on earth has been getting warmer over the past few decades, or not?” On average over the five surveys, 73 percent of respondents agreed that Earth is getting warmer.

Egan and Mullin wondered about variation in attitudes among the survey’s respondents, and hypothesized that local temperatures could influence perceptions. To measure the potential impact of temperature on individuals’ opinions, they looked at zip codes from respondents in the Pew surveys and matched weather data to each person surveyed at the time of each poll. They used local weather data to determine if the temperature in the location of each respondent was significantly higher or lower than normal for that area at that time of year.

Their results showed that an abnormal shift in local temperature is associated with a significant shift in beliefs about evidence for global warming. Specifically, for every three degrees Fahrenheit that local temperatures in the past week have risen above normal, Americans become one percentage point more likely to agree that there is “solid evidence” that Earth is getting warmer. The researchers found cooler-than-normal temperatures have similar effects on attitudes — but in the opposite direction.
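The reported effect size can be restated as a back-of-envelope linear rule. Treating the relationship as linear is our simplification for illustration; the study’s actual model controls for covariates such as partisanship and geography:

```python
# Rough linear rule: about one percentage point more agreement per 3F of
# above-normal local temperature in the past week (a simplification).

def belief_shift(anomaly_f, points_per_3f=1.0):
    """Estimated change, in percentage points, of the share agreeing
    there is 'solid evidence' of warming, given last week's anomaly."""
    return points_per_3f * anomaly_f / 3.0

baseline_pct = 73.0  # average agreement across the five Pew surveys

print(belief_shift(9.0))                  # a week 9F above normal: +3 points
print(baseline_pct + belief_shift(-6.0))  # 6F cooler than normal: 71 percent
```

The symmetric treatment of warm and cool anomalies mirrors the finding that cooler-than-normal weather shifts opinion by a similar amount in the opposite direction.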

The study took into account other variables that may explain the results — such as existing political attitudes and geography — and found the results still held.

The researchers also wondered whether heat waves (prolonged higher-than-normal temperatures) intensified this effect. To find out, they looked at respondents living in areas that experienced at least seven days of temperatures 10° or more above normal in the three weeks prior to the interview, and compared their views with those of respondents who experienced the same number of hot days but no heat wave.

Their estimates showed that the effect of a heat wave on opinion is even greater, increasing the share of Americans believing in global warming by 5.0 to 5.9 percentage points.

However, Egan and Mullin found the effects of temperature changes to be short-lived — even in the wake of heat waves. Americans who had been interviewed after 12 or more days had elapsed since a heat wave were estimated to have attitudes that were no different than those who had not been exposed to a heat wave.

“Under typical circumstances, the effects of temperature fluctuations on opinion are swiftly wiped out by new weather patterns,” they wrote. “More sustained periods of unusual weather cause attitudes to change both to a greater extent and for a longer period of time. However, even these effects eventually decay, leaving no long-term impact of weather on public opinion.”

The findings make an important contribution to the political science research on the relationship between personal experience and opinion on a larger issue, which has long been studied with varying results.

“On issues such as crime, the economy, education, health care, public infrastructure, and taxation, large shares of the public are exposed to experiences that could logically be linked to attitude formation,” the researchers wrote. “But findings from research examining how these experiences affect opinion have been mixed. Although direct experience — whether it be as a victim of crime, a worker who has lost a job or health insurance, or a parent with children in public schools — can influence attitudes, the impact of these experiences tends to be weak or nonexistent after accounting for typical predictors such as party identification and liberal-conservative ideology.”

“Our research suggests that personal experience has substantial effects on political attitudes,” Egan and Mullin concluded. “Rich discoveries await those who can explore these questions in ways that permit clean identification of these effects.”

Egan is an assistant professor in the Wilf Family Department of Politics at NYU and Mullin is an associate professor in the Department of Political Science at Temple University.

What is a carbon price and why do we need one? (The Guardian)

This Q&A is part of the Guardian’s Ultimate climate change FAQ

Grantham Research Institute and guardian.co.uk, Monday 16 July 2012 10.38 BST

A pro-carbon tax rally in Canberra, Australia, October 2011. Photograph: Alan Porritt/AFP/Getty Images

A carbon price is a cost applied to carbon pollution to encourage polluters to reduce the amount of greenhouse gas they emit into the atmosphere. Economists widely agree that introducing a carbon price is the single most effective way for countries to reduce their emissions.

Climate change is considered a market failure by economists, because it imposes huge costs and risks on future generations who will suffer the consequences of climate change, without these costs and risks normally being reflected in market prices. To overcome this market failure, they argue, we need to internalise the costs of future environmental damage by putting a price on the thing that causes it – namely carbon emissions.

A carbon price not only has the effect of encouraging lower-carbon behaviour (eg using a bike rather than driving a car), but also raises money that can be used in part to finance a clean-up of “dirty” activities (eg investment in research into fuel cells to help cars pollute less). With a carbon price in place, the costs of stopping climate change are distributed across generations rather than being borne overwhelmingly by future generations.

There are two main ways to establish a carbon price. First, a government can levy a carbon tax on the distribution, sale or use of fossil fuels, based on their carbon content. This has the effect of increasing the cost of those fuels and the goods or services created with them, encouraging businesses and people to switch to greener production and consumption. Typically the government decides how to use the revenue, though in one version, the so-called fee-and-dividend model, the tax revenues are distributed in their entirety directly back to the population.

The second approach is a quota system called cap-and-trade. In this model, the total allowable emissions in a country or region are set in advance (“capped”). Permits to pollute are created for the allowable emissions budget and either allocated or auctioned to companies. The companies can trade permits between one another, introducing a market for pollution that should ensure that the carbon savings are made as cheaply as possible.
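
The contrast between the two instruments can be sketched in a few lines of code (a deliberately simplified model of our own; real schemes involve banking, offsets and auction design that this ignores):

```python
# Toy contrast between a carbon tax and cap-and-trade:
# a tax fixes the price and lets the quantity of emissions adjust;
# a cap fixes the quantity and lets trading discover the price.

def tax_cost(emissions_tonnes: float, tax_per_tonne: float) -> float:
    """What a polluter pays under a simple carbon tax."""
    return emissions_tonnes * tax_per_tonne

def permit_price(cap_tonnes: int, abatement_costs: list[float]) -> float:
    """Under cap-and-trade, firms cut their cheapest tonnes first, so the
    most expensive tonne that still has to be cut sets the permit price.
    `abatement_costs` holds the cost of avoiding each tonne of
    business-as-usual emissions, one entry per tonne."""
    must_cut = len(abatement_costs) - cap_tonnes
    if must_cut <= 0:
        return 0.0  # the cap is not binding
    return sorted(abatement_costs)[must_cut - 1]

costs = [5, 30, 12, 8, 45, 20, 3, 60, 15, 25]  # illustrative costs per tonne
# Capping 10 business-as-usual tonnes at 7 forces the 3 cheapest cuts;
# the third-cheapest cut (8) becomes the market-clearing permit price.
print(permit_price(7, costs))   # 8
print(tax_cost(10, 16.0))       # 160.0
```

The design choice this illustrates is the one in the text: both instruments can reach the same emissions level, but a tax gives price certainty while a cap gives quantity certainty.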

To serve its purpose, the carbon price set by a tax or cap-and-trade scheme must be sufficiently high to encourage polluters to change behaviour and reduce pollution in accordance with national targets. For example, the UK has a target to reduce carbon emissions by 80% by 2050, compared with 1990 levels, with various intermediate targets along the way. The government’s independent advisers, the Committee on Climate Change, estimate that a carbon price of £30 per tonne of carbon dioxide in 2020 and £70 in 2030 would be required to meet these goals.

Currently, many large UK companies pay a price for the carbon they emit through the EU’s emissions trading scheme. However, the price of carbon through the scheme is considered by many economists to be too low to help the UK to meet its targets, so the Treasury plans to make all companies covered by the scheme pay a minimum of £16 per tonne of carbon emitted from April 2013.

Ideally, there should be a uniform carbon price across the world, reflecting the fact that a tonne of carbon dioxide does the same amount of damage over time wherever it is emitted. Uniform pricing would also remove the risk that polluting businesses flee to so-called “pollution havens” – countries where a lack of environmental regulation enables them to continue to pollute unrestrained. At the moment, carbon pricing is far from uniform but a growing number of countries and regions have, or plan to have, carbon pricing schemes in place, whether through cap-and-trade or carbon taxes. These include the European Union, Australia, South Korea, South Africa, parts of China and California.

• This article was written by Alex Bowen of the Grantham Research Institute on Climate Change and the Environment at LSE in collaboration with the Guardian

A Century Of Weather Control (POP SCI)

Posted 7.19.12 at 6:20 pm – http://www.popsci.com

 

Keeping Pilots Updated, November 1930

It’s 1930 and, for obvious reasons, pilots want regular reports on the weather. What to do? Congress’s solution was to give the U.S. Weather Bureau cash to send them what they needed. It was a lot of cash, too: $1.4 million, or “more than one third the sum it spends annually for all of its work.”

About 13,000 miles of airway were monitored for activity, and reports were regularly sent via the now quaintly named “teletype” – basically an early fax machine that let a typed message be reproduced at a distance. Pilots were then radioed with the information.

From the article “Weather Man Makes the Air Safe.”

 

Battling Hail, July 1947

We weren’t shy about laying on the drama in this piece on hail–it was causing millions in damage across the country and we were sick of it. Our writer says, “The war against hail has been declared.” (Remember: this was only two years after World War II, which was a little more serious. Maybe our patriotism just wouldn’t wane.)

The idea was to scatter silver iodide as a form of “cloud seeding”–turning the moisture to snow before it hails. It’s a process that’s still toyed with today.

From the article “The War Against Hail.”

 

Hunting for a Tornado “Cure,” March 1958

1957 was a record-breaking year for tornadoes, and PopSci was forecasting even rougher skies for 1958. As described by an official tornado watcher: “They’re coming so fast and thick … that we’ve lost count.”

To try to stop them, researchers wanted to learn more. Meteorologists asked Congress for $5 million more a year to study the tornadoes whirling through the Midwest’s Tornado Alley and then, hopefully, learn what was needed to stop them.

From the article “What We’re Learning About Tornadoes.”

 

Spotting Clouds With Nimbus, November 1963

Weather satellites were a boon to both forecasters and anyone affected by extreme weather. The powerful Hurricane Esther was spotted by satellite two days before anything else detected it, leaving space engineers “justifiably proud.” The next satellite in line was the Nimbus, which Popular Science devoted multiple pages to covering, highlighting its ability to photograph cloud cover 24 hours a day and give us better insight into extreme weather.

Spoiler: the results really did turn out great, with Nimbus satellites paving the way for modern GPS devices.

From the article “The Weather Eye That Never Blinks.”

 

Saving Money Globally With Forecasts, November 1970

Optimism about weather satellites seemed to be reaching a high by the ’70s, with Popular Science recounting all the disasters they had predicted – how they “saved countless lives through early hurricane warnings” – and now even saying they’d save your vacation.

What they were hoping for then was an accurate five-day forecast for the world, which they predicted would save billions and make early warnings even better.

From the article “How New Weather Satellites Will Give You More Reliable Forecasts.”

 

Extreme Weather Alerts on the Radio, July 1979

Those weather alerts that come on your television during a storm – or at least one radio version of those – were documented by Popular Science in 1979. But rather than being something that anyone could tune in to, they were specialized radios you had to purchase, which seems like a less-than-great solution to the problem. But by this point the government had plans to set up weather monitoring stations near 90 percent of the country’s population, opening the door for people to find out fast what the weather situation was.

From the article “Weather-Alert Radios–They Could Save Your Life.”

 

Stopping “Bolts From the Blue,” May 1990

Here Popular Science let loose a whopper for anyone with a fear of extreme weather: lightning kills far more people every year than you think, and sometimes a bolt will hit you even when there’s no storm overhead. So-called “bolts from the blue” were part of a story on better predicting lightning, a phenomenon more erratic than most types of weather. Improved sensors played a major part in better preparing people before a storm.

From the article “Predicting Deadly Lightning.”

 

Infrared Views of Weather, August 1983

Early access to computers let weather scientists get a 3-D, radar-based view of weather across the country. The system culled information from multiple sources and placed it in one viewable display. (The man pictured looks slightly bored for how revolutionary it is.) The system was an attempt to take global information and make it into “real-time local predictions.”

From the article “Nowcasting: New Weather Computers Pinpoint Deadly Storms.”

 

Modernizing the National Weather Service, August 1997

A year’s worth of weather detection for every American was coming at the price of “a Big Mac, fries, and a Coke,” the deputy director of the National Weather Service said in 1997. The computer age better tied together the individual parts of weather forecasting for the NWS, leaving a unified whole that could grab complicated meteorological information and interpret it in just a few seconds.

From the article “Weather’s New Outlook.”

 

Modeling Weather With Computers, September 2001

Computer simulations, we wrote, would help us predict future storms more accurately. But it took (at the time) the largest supercomputer around to give us the kinds of models we wanted. Judging by the image, we might’ve already made significant progress on the weather modeling front.

Climate Change Strikes Especially Hard Blow to Native Americans (PBS)

CLIMATE CHANGE — July 19, 2012 at 3:42 PM EDT

BY: SASKIA DE MELKER AND REBECCA JACOBSON


On Thursday’s NewsHour, NewsHour correspondent Hari Sreenivasan moderated a panel discussion on how Native American tribes are coping with climate change.

The panel included four native leaders representing their communities at the First Stewards symposium:

When we began our NewsHour coverage on communities across the United States coping with climate change, we didn’t plan to focus on Native American tribes. But we soon realized that indigenous communities are on the frontlines of America’s climate-related dangers.

Native Americans make up about one percent of the United States population, but they manage more than 95 million acres of land. Their reservations lie in some of the most diverse ecosystems in the country, ranging from Alaska to the coasts of Florida. That diversity – both geographic and cultural – makes them a sort of demographic microcosm of the United States. That means the climate shifts they are feeling now could give clues to what other Americans might see in the near future.

Recent studies, including those from the National Wildlife Federation, the EPA, and the USDA, highlight the disproportionate vulnerability of tribes to climate-related hazards such as coastal erosion, rising temperatures and extreme weather. Tribes depend on the land and natural resources for their culture and livelihood. What’s more, reservations often have high rates of poverty, unemployment and a lack of resources that would allow them to adapt to long-term climate changes.

We’ve reported on how rising seas threaten tribal land along the Louisiana coast. We’ve looked at the impact of a depleted salmon population on Northwest tribes. And we recently visited Washington state’s Quileute tribe, which has fought to reclaim land threatened by floods and sea level rise.


Relocating to adapt to environmental threats or disasters is not always a viable option for tribes, both because of their connection to their origins and because they may lack the resources needed to move, said Larry Wasserman, environmental policy manager for the Swinomish tribe in the Pacific Northwest.

“Rather than being a mobile society that can move away from climatic changes, they need to think about how do they stay on this piece of ground and continue to live the lifestyle that they’ve been able to live, and how can their great-great-great-grandchildren do that,” Wasserman said.

Tony Foster, chairman of the Quileute Nation, said that native people are in tune with the climate of their homelands and know early on when the balance of the ecosystem has been disrupted. “The Quileute has been here for over 10,000 years,” he said. “We know the layout of the land, and we know the conditions of our environment.”

“Traditional values teach us to be good ancestors,” added Micah McCarty, chairman of the Makah Tribe in Neah Bay, Washington. “Future generations are going to look back at us and say, ‘What did you do about this?'”

That forward thinking is necessary for planning for climate change, which is defined over at least a 30-year range and is often modeled on time scales extending hundreds of years into the future.

And Jeff Mears, member and environmental area manager for the Oneida tribe in Wisconsin, said it’s important that the tribes are defined by more than their past.

Because many tribes have a unique status as sovereign nations, they can also implement their own initiatives and models for managing their environment. The Swinomish tribe, for example, has developed its own climate adaptation plan.

Tribal governments also want more say at the federal level when it comes to addressing climate change.

There needs to be more “recognition from western science of the value of traditional ecological knowledge,” McCarty said. “So we need to look at how we can better inform the government of what tribal leaders bring to the table in regard to responding to climate change.”

And that’s the aim of a gathering to be held at the Smithsonian’s National Museum of the American Indian in Washington D.C. this week. The First Stewards symposium will bring together hundreds of indigenous tribal elders, leaders, and scientists from across America to discuss past, present, and future adaptation to climate change.

See all of our coverage of how Native American communities are coping with climate change:

Native Lands Wash Away as Sea Levels Rise

Native Americans’ tribal lands along the Louisiana coast are washing away as sea levels rise and marshes sink. We report from Isle de Jean Charles, a community that is slowly disappearing into the sea.

The Northwest’s Salmon People Face a Salmon-less Future

For Northwest tribes, fishing for salmon is more than a food source, it’s a way of life. Now the climate may push the fish towards extinction. Together with KCTS 9 and EarthFix, NewsHour recently visited the Swinomish Indian reservation to see how they are coping.

Climate Change Threatens the ‘Twilight’ Tribe

Washington’s Quileute tribe, thrust into the spotlight by the “Twilight” series, has been caught in a struggle to reclaim land threatened by floods and sea level rise. Together with KCTS9 and EarthFix, NewsHour visited the tribe to hear their story.

Global CO2 Emissions Continued to Increase in 2011, With Per Capita Emissions in China Reaching European Levels (Science Daily)

ScienceDaily (July 19, 2012) — Global emissions of carbon dioxide (CO2) — the main cause of global warming — increased by 3% last year, reaching an all-time high of 34 billion tonnes in 2011. In China, the world’s most populous country, average emissions of CO2 increased by 9% to 7.2 tonnes per capita, putting China within the 6 to 19 tonnes per capita range of the major industrialised countries. In the European Union, CO2 emissions dropped by 3% to 7.5 tonnes per capita. The United States remains one of the largest emitters of CO2, with 17.3 tonnes per capita, despite a decline due to the recession in 2008-2009, high oil prices and an increased share of natural gas.

These are the main findings of the annual report ‘Trends in global CO2 emissions’, released July 19 by the European Commission’s Joint Research Centre (JRC) and the Netherlands Environmental Assessment Agency (PBL).

Based on recent results from the Emissions Database for Global Atmospheric Research (EDGAR) and latest statistics on energy use and relevant activities such as gas flaring and cement production, the report shows that global CO2 emissions continued to grow in 2011, despite reductions in OECD countries. Weak economic conditions, a mild winter, and energy savings stimulated by high oil prices led to a decrease of 3% in CO2 emissions in the European Union and of 2% in both the United States and Japan. Emissions from OECD countries now account for only one third of global CO2 emissions — the same share as that of China and India combined, where emissions increased by 9% and 6% respectively in 2011. Economic growth in China led to significant increases in fossil fuel consumption driven by construction and infrastructure expansion. The growth in cement and steel production caused China’s domestic coal consumption to increase by 9.7%.

The 3% increase in global CO2 emissions in 2011 is above the past decade’s average annual increase of 2.7%, with a decrease in 2008 and a surge of 5% in 2010. The top emitters contributing to the 34 billion tonnes of CO2 emitted globally in 2011 are: China (29%), the United States (16%), the European Union (11%), India (6%), the Russian Federation (5%) and Japan (4%).
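
Those percentage shares can be converted back into absolute amounts as a quick consistency check (the arithmetic is ours; figures are rounded as in the report):

```python
total_gt = 34  # global CO2 emissions in 2011, billion tonnes
shares = {
    "China": 0.29, "United States": 0.16, "European Union": 0.11,
    "India": 0.06, "Russian Federation": 0.05, "Japan": 0.04,
}
for region, share in shares.items():
    print(f"{region}: ~{total_gt * share:.1f} billion tonnes")
# The six together account for roughly 71% of the global total.
print(f"combined: {sum(shares.values()):.0%}")
```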

Cumulative CO2 emissions call for action

An estimated cumulative global total of 420 billion tonnes of CO2 was emitted between 2000 and 2011 due to human activities, including deforestation. Scientific literature suggests that limiting the rise in average global temperature to 2°C above pre-industrial levels — the target internationally adopted in UN climate negotiations — is possible only if cumulative CO2 emissions in the period 2000-2050 do not exceed 1,000 to 1,500 billion tonnes. If the current global trend of increasing CO2 emissions continues, cumulative emissions will surpass this limit within the next two decades.
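
That last sentence can be checked with a simple extrapolation (our own sketch: 2011 emissions of 34 billion tonnes growing at 3% a year, stacked on the 420 billion tonnes already emitted, measured against the low end of the budget):

```python
cumulative = 420.0   # billion tonnes CO2 emitted 2000-2011
annual = 34.0        # billion tonnes emitted in 2011
growth = 0.03        # recent annual growth rate of emissions
budget_low = 1000.0  # low end of the 2 degree C budget for 2000-2050
year = 2011

while cumulative < budget_low:
    year += 1
    annual *= 1 + growth
    cumulative += annual

print(year)  # 2025 -- i.e. well within "the next two decades"
```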

Fortunately, this trend is being mitigated by the expansion of renewable energy supplies, especially solar and wind energy and biofuels. The global share of these so-called modern renewables, which exclude hydropower, is growing at an accelerating pace and quadrupled from 1992 to 2011. This potentially represents about 0.8 billion tonnes of CO2 emissions avoided as a result of using renewable energy supplies in 2011, which is close to Germany’s total CO2 emissions in 2011.

“Trends in global CO2 emissions” report: http://edgar.jrc.ec.europa.eu/CO2REPORT2012.pdf

Society’s Response to Climate Change Is Critical (Science Daily)

ScienceDaily (July 18, 2012) — Lancaster University (UK) scientists have proposed a new way of considering society’s reactions to global warming by linking societal actions to temperature change.

Using this framework to analyse climate change policies aimed at avoiding dangerous climate change, they suggest that society will have to become fifty times more responsive to global temperature change than it has been since 1990.

The researchers, Dr Andy Jarvis, Dr David Leedal and Professor Nick Hewitt from the Lancaster Environment Centre, also show that if global energy use continues to grow as it has done historically, society would have to step up its decarbonization efforts from the historic (160-year) rate of 0.6% per year to 13% per year.
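
One way to feel the size of that jump is to compare how long each rate takes to halve the carbon intensity of the economy (the rates are from the study; the halving-time framing and code are ours):

```python
import math

def halving_time(decline_per_year: float) -> float:
    """Years needed to halve carbon intensity when it falls by a constant
    fraction `decline_per_year` each year."""
    return math.log(2) / -math.log(1 - decline_per_year)

print(round(halving_time(0.006)))    # historic 0.6%/yr: ~115 years
print(round(halving_time(0.13), 1))  # required 13%/yr: ~5 years
```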

Dr Andy Jarvis said: “In order to avoid dangerous climate change, society will have to become much more responsive to the risks and damages that growth in global greenhouse gas emissions impose.”

The research, published in Nature Climate Change on 15 July, found that the global growth of new renewable sources of energy since 1990 constitutes a climate-society feedback of a quarter of a percentage point per year in the growth rate of CO2 emissions per degree of temperature rise.

Professor Nick Hewitt said: “If left unmanaged, the climate damages that we experience will motivate society to act to a greater or lesser degree. This could either amplify the growth in greenhouse gas emissions as we repair these damages or dampen them through loss of economic performance. Both are unpredictable and potentially dangerous.”

Dummies guide to the latest “Hockey Stick” controversy (Real Climate)

http://www.realclimate.org

 — gavin @ 18 February 2005

by Gavin Schmidt and Caspar Amman

Due to popular demand, we have put together a ‘dummies guide’ which tries to describe what the actual issues are in the latest controversy, in language even our parents might understand. A pdf version is also available. More technical descriptions of the issues can be seen here and here.

This guide is in two parts: the first deals with the background to the technical issues raised by McIntyre and McKitrick (2005) (MM05), while the second part discusses the application of this to the original Mann, Bradley and Hughes (1998) (MBH98) reconstruction. The wider climate science context is discussed here, and the relationship to other recent reconstructions (the ‘Hockey Team’) can be seen here.

NB. All the data that were used in MBH98 are freely available for download at ftp://holocene.evsc.virginia.edu/pub/sdr/temp/nature/MANNETAL98/ (and also as supplementary data at Nature) along with a thorough description of the algorithm.
Part I: Technical issues:

1) What is principal component analysis (PCA)?

This is a mathematical technique that is used (among other things) to summarize the data found in a large number of noisy records so that the essential aspects can be more easily seen. The most common patterns in the data are captured in a number of ‘principal components’ which describe some percentage of the variation in the original records. Usually only a limited number of components (‘PCs’) have any statistical significance, and these can be used instead of the larger data set to give basically the same description.
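
For readers who prefer code to prose, the whole procedure fits in a few lines of numpy (a minimal sketch of generic PCA via the singular value decomposition, not the specific processing used in any of the studies discussed here):

```python
import numpy as np

def pca(records: np.ndarray, n_components: int):
    """records: array of shape (n_times, n_records).
    Returns the leading component time series and the fraction of the
    total variance each component explains."""
    centered = records - records.mean(axis=0)     # remove each record's mean
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = s**2 / np.sum(s**2)               # variance fractions
    pcs = u[:, :n_components] * s[:n_components]  # component time series
    return pcs, explained[:n_components]

# 200 noisy records that all share a single underlying trend:
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
data = np.outer(t, np.ones(200)) + 0.3 * rng.standard_normal((100, 200))

pcs, explained = pca(data, 2)
print(explained)  # the first PC captures most of the shared variance
```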

2) What do these individual components represent?

Often the first few components represent something recognisable and physically meaningful (at least in climate data applications). If a large part of the data set has a trend, then the mean trend may show up as one of the most important PCs. Similarly, if there is a seasonal cycle in the data, that will generally be represented by a PC. However, remember that PCs are just mathematical constructs. By themselves they say nothing about the physics of the situation. Thus, in many circumstances, physically meaningful timeseries are ‘distributed’ over a number of PCs, each of which individually does not appear to mean much. Different methodologies or conventions can make a big difference in which pattern comes up top. If the aim of the PCA analysis is to determine the most important pattern, then it is important to know how robust that pattern is to the methodology. However, if the idea is simply to summarize the larger data set, the individual ordering of the PCs is less important, and it is more crucial to make sure that as many significant PCs as possible are included.

3) How do you know whether a PC has significant information?

This determination is usually based on a ‘Monte Carlo’ simulation (so-called because of the random nature of the calculations). For instance, if you take 1000 sets of random data (that have the same statistical properties as the data set in question), and you perform the PCA analysis 1000 times, there will be 1000 examples of the first PC. Each of these will explain a different amount of the variation (or variance) in the original data. When ranked in order of explained variance, the tenth one down then defines the 99% confidence level: i.e. if your real PC explains more of the variance than 99% of the random PCs, then you can say that this is significant at the 99% level. This can be done for each PC in turn. (This technique was introduced by Preisendorfer et al. (1981), and is called the Preisendorfer N-rule.)

The figure to the right gives two examples of this. Here each PC is plotted against the amount of fractional variance it explains. The blue line is the result from the random data, while the blue dots are the PC results for the real data. It is clear that at least the first two are significantly separated from the random noise line. In the other case, there are 5 (maybe 6) red crosses that appear to be distinguishable from the red line random noise. Note also that the first (‘most important’) PC does not always explain the same amount of the original data.

4) What do different conventions for PC analysis represent?

Some different conventions exist regarding how the original data should be normalized. For instance, the data can be normalized to have an average of zero over the whole record, or over a selected sub-interval. The variance of the data is associated with departures from whatever mean was selected, so the pattern of data that shows the biggest departure from that mean will dominate the calculated PCs. If there is an a priori reason to be interested in departures from a particular mean, then this is a way to make sure that those patterns move up in the PC ordering. Changing conventions means that the explained variance of each PC can be different, the ordering can be different, and the number of significant PCs can be different.
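
This sensitivity to the centering convention is easy to demonstrate on synthetic data (a toy illustration of our own, not the actual proxy records):

```python
import numpy as np

def pc1_share(data: np.ndarray, center_slice: slice) -> float:
    """Fraction of total variance explained by PC1 when every record is
    centered on its mean over `center_slice` instead of the full record."""
    centered = data - data[center_slice].mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False) ** 2
    return s[0] / s.sum()

rng = np.random.default_rng(3)
data = rng.standard_normal((100, 50))    # 50 records of pure noise...
data[80:, :10] += 2.0                    # ...except 10 that rise at the end

full = pc1_share(data, slice(None))      # zero mean over the whole record
late = pc1_share(data, slice(80, None))  # zero mean over the late interval only
print(full < late)  # True: late-interval centering promotes the rising records
```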

5) How can you tell whether you have included enough PCs?

This is rather easy to tell. If your answer depends on the number of PCs included, then you haven’t included enough. Put another way, if the answer you get is the same as if you had used all the data without doing any PC analysis at all, then you are probably ok. However, the reason PC summaries are used in the first place in paleo-reconstructions is that using the full proxy set often runs into the danger of ‘overfitting’ during the calibration period (the time period when the proxy data are trained to match the instrumental record). This can lead to a decrease in predictive skill outside of that window, which is the actual target of the reconstruction. So in summary, PC selection is a trade-off: on one hand, the goal is to capture as much of the variability in the data as the PCs can represent (particularly when each individual PC explains only a small share); on the other, you don’t want to include PCs that contribute no further significant information.

Part II: Application to the MBH98 ‘Hockey Stick’

1) Where is PCA used in the MBH methodology?

When incorporating many tree ring networks into the multi-proxy framework, it is easier to use a few leading PCs rather than 70 or so individual tree ring chronologies from a particular region. The trees are often very closely located and so it makes sense to summarize the general information they all contain in relation to the large-scale patterns of variability. The relevant signal for the climate reconstruction is the signal that the trees have in common, not each individual series. In MBH98, the North American tree ring series were treated like this. There are a number of other places in the overall methodology where some form of PCA was used, but they are not relevant to this particular controversy.

2) What is the point of contention in MM05?

MM05 contend that the particular PC convention used in MBH98 in dealing with the N. American tree rings selects for the ‘hockey stick’ shape and that the final reconstruction result is simply an artifact of this convention.

3) What convention was used in MBH98?

MBH98 were particularly interested in whether the tree ring data showed significant differences from the 20th century calibration period, and therefore normalized the data so that the mean over this period was zero. As discussed above, this will emphasize records that have the biggest differences from that period (either positive or negative). Since the underlying data have a ‘hockey stick’-like shape, it is therefore not surprising that the most important PC found using this convention resembles the ‘hockey stick’. There are actually two significant PCs found using this convention, and both were incorporated into the full reconstruction.

4) Does using a different convention change the answer?

As discussed above, a different convention (MM05 suggest one that has zero mean over the whole record) will change the ordering, significance and number of important PCs. In this case, the number of significant PCs increases to 5 (maybe 6) from the original 2. This is the difference between the blue points (MBH98 convention) and the red crosses (MM05 convention) in the first figure. Also, PC1 in the MBH98 convention moves down to PC4 in the MM05 convention. This is illustrated in the figure on the right: the red curve is the original PC1 and the blue curve is MM05 PC4 (adjusted to have the same variance and mean). But as we stated above, the underlying data have a hockey stick structure, and so in either case the ‘hockey stick’-like PC explains a significant part of the variance. Therefore, using the MM05 convention, more PCs need to be included to capture the significant information contained in the tree ring network.

This figure shows the difference in the final result between using the original convention with 2 PCs (blue) and the MM05 convention with 5 PCs (red). The MM05-based reconstruction is slightly less skillful when judged over the 19th century validation period but is otherwise very similar. In fact, any calibration convention will lead to approximately the same answer as long as the PC decomposition is done properly and one determines how many PCs are needed to retain the primary information in the original data.

[Figure: different conventions]
5) What happens if you just use all the data and skip the whole PCA step?

This is a key point. If the PCs being used were inadequate in characterizing the underlying data, then the answer you get using all of the data will be significantly different. If, on the other hand, enough PCs were used, the answer should be essentially unchanged. This is shown in the figure below. The reconstruction using all the data is in yellow (the green line is the same thing but with the ‘St-Anne River’ tree ring chronology taken out). The blue line is the original reconstruction, and as you can see the correspondence between them is high. The validation is slightly worse, illustrating the trade-off mentioned above: when using all of the data, over-fitting during the calibration period (due to the increased number of degrees of freedom) leads to a slight loss of predictability in the validation step.

No PCA comparison

6) So how do MM05 conclude that this small detail changes the answer?

MM05 claim that the reconstruction using only the first 2 PCs with their convention is significantly different from MBH98. But since PCs 3, 4 and 5 (at least) are also significant under their convention, they are leaving out good data: it is mathematically wrong to retain the same number of PCs when the standardization convention is changed. The resulting loss of information is easily demonstrated, first by showing that such results do not resemble the results from using all the data, and second by checking the validation of the reconstruction over the 19th century. The MM version of the reconstruction can be matched by simply removing the N. American tree ring data along with the ‘St. Anne River’ Northern treeline series from the reconstruction (shown in yellow below). Compare this curve with the ones shown above.

No N. American tree rings

As you might expect, throwing out data also worsens the validation statistics, as can be seen by eye when comparing the reconstructions over the 19th century validation interval. Compare the green line in the figure below to the instrumental data in red. To their credit, MM05 acknowledge that their alternate 15th century reconstruction has no skill.

validation period

7) Basically then the MM05 criticism is simply about whether selected N. American tree rings should have been included, not that there was a mathematical flaw?

Yes. Their argument has from the beginning been essentially not about methodological issues at all, but about ‘source data’ issues. Particular concerns with the “bristlecone pine” data were addressed in the follow-up paper MBH99, but the fact remains that including these data improves the statistical validation over the 19th-century period, and they therefore should be included.

Hockey Team *used under GFDL license

8) So does this all matter?

No. If you use the MM05 convention and include all the significant PCs, you get the same answer. If you don’t use any PCA at all, you get the same answer. If you use a completely different methodology (i.e. Rutherford et al, 2005), you get basically the same answer. Only if you remove significant portions of the data do you get a different (and worse) answer.

9) Was MBH98 the final word on the climate of last millennium?

Not at all. There has been significant progress on many aspects of climate reconstructions since MBH98. There are more, and better quality, proxy data available, and there are new methodologies, such as those described in Rutherford et al (2005) or Moberg et al (2005), that address recognised problems with incomplete data series and the challenge of incorporating lower-resolution data into the mix. Progress is likely to continue on all these fronts. As of now, all of the ‘Hockey Team’ reconstructions (shown left) agree that the late 20th century is anomalous in the context of the last millennium, and possibly the last two millennia.

The climate of the climate change debate is changing (The Guardian)

Quantifying how greenhouse gases contribute to extreme weather is a crucial step in calculating the cost of human influence

Myles Allen

guardian.co.uk, Wednesday 11 July 2012 12.08 BST

Climate change could trap hundreds of millions in disaster areas, report claims

This week, climate change researchers were able to attribute recent examples of extreme weather to the effects of human activity on the planet’s climate systems for the first time. Photograph: Rizwan Tabassum/AFP/Getty Images

The climate may have changed this week. Not the physical climate, but the climate of the climate change debate. Tuesday marked the publication of a series of papers examining the factors behind extreme weather events in 2011. Nothing remarkable about that, you might think, except that, if all goes well, this will be the first of a regular, annual assessment quantifying how external drivers of climate contribute to damaging weather.

Some of these drivers, like volcanoes, are things we can do nothing about. But others, like rising levels of greenhouse gases, we can. And quantifying how greenhouse gases contribute to extreme weather is a crucial step in pinning down the real cost of human influence on climate. While most people think of climate change in terms of shrinking ice-sheets and slowly rising sea levels, it is weather events that actually do harm.

This week also saw a workshop in Oxford for climate change negotiators from developing countries. Again, nothing remarkable about that except, for the first time, the issue of “loss and damage” was top of the agenda. For years negotiations have been over emission reductions and sharing the costs of adaptation. Now the debate is turning to: who is going to pay for damage done?

It is a good time to ask, since the costs that can unambiguously be attributed to human-induced climate change are still relatively small. Although Munich Re estimates that weather events in 2011 cost more than $100bn and claimed many thousands of lives, only a few of these events were clearly made more likely by human influence. Others may have been made less likely, but occurred anyway – chance remains the single dominant factor in when and where a weather event occurs. For the vast majority of events, we simply don’t yet know either way.

Connecting climate change and specific weather events is only one link in the causal chain between greenhouse gas emissions and actual harm. But it is a crucial link. If, as planned, the assessment of 2011 becomes routine, we should be able to compare actual weather-related damage, in both good years and bad, with the damage that might have been in a world without human influence on climate. This puts us well on our way to a global inventory of climate change impacts. And as soon as that is available, the question of compensation will not be far behind.
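The kind of quantification Allen describes is often expressed as a "fraction of attributable risk": comparing the probability of an event in the observed climate with its probability in a counterfactual climate without human influence. A minimal sketch, with probabilities invented purely for illustration:

```python
def fraction_attributable_risk(p0: float, p1: float) -> float:
    """FAR = 1 - p0/p1, where p0 is the event probability without the
    external driver and p1 the probability with it. It gives the fraction
    of the event's risk attributable to that driver."""
    return 1.0 - p0 / p1

# e.g. a heat wave estimated to be three times more likely with warming
p0, p1 = 0.01, 0.03
far = fraction_attributable_risk(p0, p1)
print(f"FAR = {far:.2f}")  # two-thirds of the risk attributable
```

An event "made less likely, but occurring anyway" corresponds to p1 < p0, i.e. a negative FAR.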

The presumption in climate change negotiations is that “countries with historically high emissions” would be first in line to foot the bill for loss and damage. There may be some logic to this, but if you are an African (or Texan) farmer hit by greenhouse-exacerbated drought, is the European or American taxpayer necessarily the right place to look for compensation? As any good lawyer knows, there is no point in suing a man with empty pockets.

The only institution in the world that could deal with the cost of climate change without missing a beat is the fossil fuel industry: BP took a $30bn charge for Deepwater Horizon, very possibly more than the total cost of climate change damages last year, and was back in profit within months. Of the $5 trillion per year we currently spend on fossil energy, a small fraction would take care of all the loss and damage attributable to climate change for the foreseeable future several times over.

Such a pay-as-you-go liability regime would not address the impacts of today’s emissions on the 22nd century. Governments cannot wash their hands of this issue entirely. But we have been so preoccupied with the climate of the 22nd century that we have curiously neglected to look after the interests of those being affected by climate change today.

So rather than haggling over emission caps and carbon taxes, why not start with a simple statement of principle: standard product liability applies to anyone who sells or uses fossil fuels, including liability for any third-party side-effects. There is no need at present to say what these side-effects might be – indeed, the scientific community does not yet know. But we are getting there.

This summer is ‘what global warming looks like’ (AP) + related & reactions

Jul 3, 1:10 PM EDT

By SETH BORENSTEIN
AP Science Writer

AP Photo/Matthew Barakat

WASHINGTON (AP) — Is it just freakish weather or something more? Climate scientists suggest that if you want a glimpse of some of the worst of global warming, take a look at U.S. weather in recent weeks.

Horrendous wildfires. Oppressive heat waves. Devastating droughts. Flooding from giant deluges. And a powerful freak wind storm called a derecho.

These are the kinds of extremes experts have predicted will come with climate change, although it’s far too early to say that is the cause. Nor will they say global warming is the reason 3,215 daily high temperature records were set in the month of June.

Scientifically linking individual weather events to climate change takes intensive study, complicated mathematics, computer models and lots of time. Sometimes it isn’t caused by global warming. Weather is always variable; freak things happen.

And this weather has been local. Europe, Asia and Africa aren’t having similar disasters now, although they’ve had their own extreme events in recent years.

But since at least 1988, climate scientists have warned that climate change would bring, in general, increased heat waves, more droughts, more sudden downpours, more widespread wildfires and worsening storms. In the United States, those extremes are happening here and now.

So far this year, more than 2.1 million acres have burned in wildfires, more than 113 million people in the U.S. were in areas under extreme heat advisories last Friday, two-thirds of the country is experiencing drought, and earlier in June, deluges flooded Minnesota and Florida.

“This is what global warming looks like at the regional or personal level,” said Jonathan Overpeck, professor of geosciences and atmospheric sciences at the University of Arizona. “The extra heat increases the odds of worse heat waves, droughts, storms and wildfire. This is certainly what I and many other climate scientists have been warning about.”

Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in fire-charred Colorado, said these are the very record-breaking conditions he has said would happen, but many people wouldn’t listen. So it’s I-told-you-so time, he said.

As recently as March, a special report on extreme events and disasters by the Nobel Prize-winning Intergovernmental Panel on Climate Change warned of “unprecedented extreme weather and climate events.” Its lead author, Chris Field of the Carnegie Institution and Stanford University, said Monday, “It’s really dramatic how many of the patterns that we’ve talked about as the expression of the extremes are hitting the U.S. right now.”

“What we’re seeing really is a window into what global warming really looks like,” said Princeton University geosciences and international affairs professor Michael Oppenheimer. “It looks like heat. It looks like fires. It looks like this kind of environmental disaster.”

Oppenheimer said that on Thursday. That was before the East Coast was hit with triple-digit temperatures and before a derecho – a large, powerful and long-lasting straight-line wind storm – blew from Chicago to Washington. The storm and its aftermath killed more than 20 people and left millions without electricity. Experts say it had energy readings five times that of normal thunderstorms.

Fueled by the record high heat, this was among the strongest of this type of storm in the region in recent history, said research meteorologist Harold Brooks of the National Severe Storms Laboratory in Norman, Okla. Scientists expect “non-tornadic wind events” like this one and other thunderstorms to increase with climate change because of the heat and instability, he said.

Such patterns haven’t happened only in the past week or two. The spring and winter in the U.S. were the warmest on record and among the least snowy, setting the stage for the weather extremes to come, scientists say.

Since Jan. 1, the United States has set more than 40,000 hot temperature records, but fewer than 6,000 cold temperature records, according to the National Oceanic and Atmospheric Administration. Through most of last century, the U.S. used to set cold and hot records evenly, but in the first decade of this century America set two hot records for every cold one, said Jerry Meehl, a climate extreme expert at the National Center for Atmospheric Research. This year the ratio is about 7 hot to 1 cold. Some computer models say that ratio will hit 20-to-1 by midcentury, Meehl said.

“In the future you would expect larger, longer, more intense heat waves, and we’ve seen that in the last few summers,” NOAA Climate Monitoring chief Derek Arndt said.

The 100-degree heat, drought, early snowpack melt and beetles waking from hibernation early to strip trees all combined to set the stage for the current unusual spread of wildfires in the West, said University of Montana ecosystems professor Steven Running, an expert on wildfires.

While at least 15 climate scientists told The Associated Press that this long hot U.S. summer is consistent with what is to be expected in global warming, history is full of such extremes, said John Christy at the University of Alabama in Huntsville. He’s a global warming skeptic who says, “The guilty party in my view is Mother Nature.”

But the vast majority of mainstream climate scientists, such as Meehl, disagree: “This is what global warming is like, and we’ll see more of this as we go into the future.”

Intergovernmental Panel on Climate Change report on extreme weather: http://ipcc-wg2.gov/SREX/

U.S. weather records:

http://www.ncdc.noaa.gov/extremes/records/

Seth Borenstein can be followed at http://twitter.com/borenbears

© 2012 The Associated Press. All rights reserved.

*   *   *

July 3, 2012

To Predict Environmental Doom, Ignore the Past

http://www.realclearscience.com

By Todd Myers

The information presented here cannot be used directly to calculate Earth’s long-term carrying capacity for human beings because, among other things, carrying capacity depends on both the affluence of the population being supported and the technologies supporting it. – Paul Ehrlich, 1986

One would expect scientists to pause when they realize their argument about resource collapse makes the king of environmental catastrophe, Paul Ehrlich, look moderate by comparison. Ehrlich is best known for a 40-year series of wildly inaccurate predictions of looming environmental disaster. Yet he looks positively reasonable compared to a paper recently published in the scientific journal Nature titled “Approaching a state shift in Earth’s biosphere.”

The paper predicts we are rapidly approaching a moment of “planetary-scale critical transition,” due to overuse of resources, climate change and other human-caused environmental damage. As a result, the authors conclude, this will “require reducing world population growth and per-capita resource use; rapidly increasing the proportion of the world’s energy budget that is supplied by sources other than fossil fuels,” and a range of other drastic policies. If these sound much like the ideas proposed in the 1970s by Ehrlich and others, like The Club of Rome, it is not a coincidence. The Nature paper is built on Ehrlich’s assumptions and cites his work more than once.

The Nature article, however, suffers from numerous simple statistical errors and rests on assumptions rather than evidence. Its authors do nothing to deal with the fundamental mistakes that led Ehrlich and others like him down the wrong path so many times. Instead, the paper simply argues that with improved data, this time their predictions of doom are correct.

Ultimately, the piece is a good example of the great philosopher of science Thomas Kuhn’s hypothesis, written 50 years ago, that scientists often attempt to fit the data to conform to their particular scientific paradigm, even when that paradigm is obviously flawed. When confronted with failure to explain real-world phenomena, the authors of the Nature piece have, as Kuhn described in The Structure of Scientific Revolutions, devised “numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” Like scientists blindly devoted to a failed paradigm, the Nature piece simply tries to force new data to fit a flawed concept.

“Assuming this does not change”

During the last half-century, the world has witnessed a dramatic increase in food production. According to the U.N.’s Food and Agriculture Organization, yields per acre of rice have more than doubled, corn yields are more than one-and-a-half times larger than 50 years ago, and wheat yields have almost tripled. As a result, even as human population has increased, worldwide hunger has declined.

Despite these well-known statistics, the authors of the Nature study assume not only no future technological improvements, but that none have occurred over the last 200 years. The authors simply choose one data point and then project it both into the past and into the future. The authors explain the assumption that underlies their thesis in the caption to a graphic showing the Earth approaching environmental saturation. They write:

“The percentages of such transformed lands… when divided by 7,000,000,000 (the present global human population) yield a value of approximately 2.27 acres (0.92 ha) of transformed land for each person. That value was used to estimate the amount of transformed land that probably existed in the years 1800, 1900 and 1950, and which would exist in 2025 and 2045 assuming conservative population growth and that resource use does not become any more efficient.” (emphasis added)

In other words, the basis for their argument ignores the easily accessible data from the last half century. They take a snapshot in time and mistake it for a historical trend. In contrast to their claim of no change in the efficient use of resources, it would be difficult to find a time period in the last millennium when resource use did not become more efficient.
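The arithmetic in the quoted caption is easy to reproduce; the population figures below are rounded historical and projected values used only for illustration of the flat per-capita assumption the author criticizes.

```python
# Reproducing the projection logic described in the quoted caption: take
# today's transformed land per person (about 2.27 acres) and multiply by
# the population of any other year, holding per-capita use constant.
acres_per_person = 2.27
population_bn = {1800: 1.0, 1900: 1.6, 1950: 2.5, 2011: 7.0, 2045: 9.0}

for year, pop_bn in population_bn.items():
    transformed_acres = acres_per_person * pop_bn * 1e9
    print(f"{year}: {transformed_acres / 1e9:.2f} billion acres "
          f"(flat per-capita assumption)")

# Myers's objection is that acres_per_person has not in fact been constant:
# if crop yields roughly triple over a period (as with wheat), the land
# needed per person fed falls, so a straight-line projection overstates
# both past and future transformed land.
```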

Ironically, this is the very error Ehrlich warns against in his 1986 paper – a paper the authors themselves cite several times. Despite Ehrlich’s admonition that projections of future carrying capacity are dependent upon technological change, the authors of the Nature article ignore history to come to their desired conclusion.

A Paradigm of Catastrophe

What would lead scientists to make such simplistic assumptions and flat-line projections? Indeed, what would lead Nature editors to print an article whose statistical underpinnings are so flawed? The simple belief in the paradigm of inevitable environmental catastrophe: humans are doing irreparable damage to the Earth, and every bit of resource use moves us closer to that catastrophe. The catastrophe paradigm assumes a simple model in which we eventually run out of space and resources, so that determining the date of ultimate doom is a simple matter of doing the math.

Believing in this paradigm also justifies exaggeration in order to stave off the serious consequences of collapse. Thus, they describe the United Nations’ likely population estimate for 2050 as “the most conservative,” without explaining why. They claim “rapid climate change shows no signs of slowing” without providing a source citation for the claim, and despite an actual slowing of climate change over the last decade.

The need to avoid perceived global catastrophe also encourages the authors to blow past warning signs that their analysis is not built on solid foundations – as if the poor history of such projections were not already warning enough. Even as they admit the interactions “between overlapping complex systems, however, are proving difficult to characterize mathematically,” they base their conclusions on the simplest linear mathematical estimate, one that assumes nothing will change except population over the next 40 years. They then draw a straight line, literally, from today to the environmental tipping point.

Why is such an unscientific approach allowed to pass for science in a respected international journal? Because whatever the argument does not supply, the paradigm conveniently fills in. Even if the math isn’t reliable and there are obvious counterarguments, “everyone” understands and believes in the underlying truth – we are nearing the limits of the planet’s ability to support life. In this way the conclusion is not proven but assumed, making the supporting argument an impenetrable tautology.

Such a circumstance creates the conditions of scientific revolutions, where the old paradigm fails to explain real-world phenomena and is replaced by an alternative. Given the record of failure of the paradigm of resource catastrophe, dating back to the 1970s, one would hope we are moving toward such a change. Unfortunately, Nature and the authors of the piece are clinging to the old resource-depletion model, simply trying to re-work the numbers.

Let us hope policymakers recognize the failure of that paradigm before they make costly and dangerous policy mistakes that impoverish billions in the name of false scientific assumptions.

Todd Myers is the Environmental Director of the Washington Policy Center and author of the book Eco-Fads.

*   *   *

Washington Policy Center exposed: Todd Myers

The Washington Policy Center labels itself as a non-partisan think tank. That is a mischaracterization, to say the least, but it is their bread and butter. Based in Seattle, with a director in Spokane, the WPC’s mission is to “promote free-market solutions through research and education.” It makes sense they have an environmental director in the form of Todd Myers, who has a new book called “Eco-Fads: How The Rise Of Trendy Environmentalism Is Harming The Environment.” You know, since polar bears love to swim.


From the WPC’s newsletter:

Wherever we turn, politicians, businesses and activists are promoting the latest fashionable “green” policy or product. Green buildings, biofuels, electric cars, compact fluorescent lightbulbs and a variety of other technologies are touted as the next key step in protecting the environment and promoting a sustainable future. Increasingly, however, scientific and economic information regarding environmental problems takes a back seat to the social and personal value of being seen and perceived as “green.”

As environmental consciousness has become socially popular, eco-fads supplant objective data. Politicians pick the latest environmental agenda in the same way we choose the fall fashions – looking for what will yield the largest benefit with our public and social circles.

Eco-Fads exposes the pressures that cause politicians, businesses, the media and even scientists to fall for trendy environmental fads. It examines why we fall for such fads, even when we should know better. The desire to “be green” can cloud our judgment, causing us to place things that make us appear green ahead of actions that may be socially invisible yet environmentally responsible.

By recognizing the range of forces that have taken us in the wrong direction, Eco-Fads shows how we can begin to get back on track, creating a prosperous and sustainable legacy for our planet’s future. Order Eco-Fads today for $26.95 (tax and shipping included).

This is what the newsletter doesn’t tell you about Todd Myers.

Myers has spoken at the Heartland Institute’s International Conference on Climate Change. In case you didn’t know, the Heartland Institute has received significant funding from ExxonMobil, Philip Morris and numerous other corporations and conservative foundations with a vested interest in the so-called debate around climate change. That conference was co-sponsored by numerous prominent climate change denier groups, think tanks and lobby groups, almost all of which have received money from the oil industry.

Why not just call it the Washington Fallacy Center? For a little more background, including ties back to the Koch Brothers, go HERE. In fact, Jack Kemp calls it “The Heritage Foundation of the Northwest.”

*   *   *

 

Did climate change ’cause’ the Colorado wildfires?

By David Roberts

29 Jun 2012 1:50 PM

http://grist.org

Photo by USAF.

The wildfires raging through Colorado and the West are unbelievable. As of yesterday there were 242 fires burning, according to the National Interagency Fire Center. Almost 350 homes have been destroyed in Colorado Springs, where 36,000 people have been evacuated from their homes. President Obama is visiting today to assess the devastation for himself.

Obviously the priority is containing the fires and protecting people. But inevitably the question is going to come up: Did climate change “cause” the fires? Regular readers know that this question drives me a little nuts. Pardon the long post, but I want to try to tackle this causation question once and for all.

What caused the Colorado Springs fire? Well, it was probably a careless toss of a cigarette butt, or someone burning leaves in their backyard, or a campfire that wasn’t properly doused. [UPDATE: Turns out it was lightning.] That spark, wherever it came from, is what triggered the cascading series of events we call “a fire.” It was what philosophers call the proximate cause, the most immediate, the closest.

All the other factors being discussed — the intense drought covering the state, the dead trees left behind by bark beetles, the high winds — are distal causes. Distal causes are less tightly connected to their effects. The dead trees didn’t make any particular fire inevitable; there can be no fire without a spark. What they did is make it more likely that a fire would occur. Distal causes are like that: probabilistic. Nonetheless, our intuitions tell us that distal causes are in many ways more satisfactory explanations. They tell us something about the meaning of events, not just the mechanisms, which is why they’re also called “ultimate” causes. It’s meaning we usually want.

When we say, “the fires in Colorado were caused by unusually dry conditions, high winds, and diseased trees,” no one accuses us of error or imprecision because it was “really” the matches or campfires that caused them. We are not expected to say, “no individual fire can be definitively attributed to hot, windy conditions, but these are the kinds of fires we would expect to see in those conditions.” Why waste the words? We are understood to be talking about distal causes.

When we talk about, not fires themselves, but the economic and social impacts of fires, the range of distal causes grows even broader. For a given level of damages, it’s not enough to have dry conditions and dead trees, not even enough to have fire — you also have to take into account the density of development, the responsiveness of emergency services, and the preparedness of communities for prevention or evacuation.

So if we say, “the limited human toll of the Colorado fires is the result of the bravery and skill of Western firefighters,” no one accuses us of error or imprecision because good firefighting was only one of many contributors to the final level of damages. Everything from evacuation plans to the quality of the roads to the vagaries of the weather contributed in some way to that state of affairs. But we are understood to be identifying a distal cause, not giving a comprehensive account of causation.

What I’m trying to say is, we are perfectly comfortable discussing distal causes in ordinary language. We don’t require scientistic literalism in our everyday talk.

The reason I’m going through all this, you won’t be surprised, is to tie it back to climate change. We know, of course, that climate change was not the proximate cause of the fires. It was a distal cause; it made the fires more likely. That much we know with a high degree of confidence, as this excellent review of the latest science by Climate Communication makes clear.

One can distinguish between distal causes by their proximity to effects. Say the drought made the fires 50 percent more likely than average June conditions in Colorado. (I’m just pulling these numbers out of my ass to illustrate a point.) Climate change maybe only made the fires 1 percent more likely. As a cause, it is more distal than the drought. And there are probably causes even more distal than climate change. Maybe the exact tilt of the earth’s axis this June made the fires 0.0001 percent more likely. Maybe the location of a particular proton during the Big Bang made them 0.000000000000000001 percent more likely. You get the point.

With this in mind, it’s clear that the question as it’s frequently asked — “did climate change cause the fires?” — is not going to get us the answer we want. If it’s yes or no, the answer is “yes.” But that doesn’t tell us much. What people really want to know when they ask that question is, “how proximate a cause is climate change?”

When we ask the question like that, we start to see why climate is such a wicked problem. Human beings, by virtue of their evolution, physiology, and socialization, are designed to heed causes within a particular range between proximate and distal. If I find my kid next to an overturned glass and a puddle of milk and ask him why the milk is spilled, I don’t care about the neurons firing and the muscles contracting. That’s too proximate. I don’t care about humans evolving with poor peripheral vision. That’s too distal. I care about my kid reaching for it and knocking it over. That’s not the only level of causal explanation that is correct, but it’s the level of causal explanation that is most meaningful to me.

For a given effect — a fire, a flood, a dead forest — climate change is almost always too distal a cause to make a visceral impression on us. We’re just not built to pay heed to those 1 percent margins. It’s too abstract. The problem is, wildfires being 1 percent more likely averaged over the whole globe actually means a lot more fires, a lot more damage, loss, and human suffering. Part of managing the Anthropocene is finding ways of making distal causes visceral, giving them a bigger role in our thinking and institutions.
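The aggregate arithmetic behind that paragraph is straightforward; the numbers below are invented for illustration, just as the author's percentages are.

```python
# A 1 percent increase in per-event likelihood is invisible locally but
# adds up globally. All figures here are hypothetical.
n_fire_prone_situations = 100_000  # potential fire situations per year, worldwide
base_prob = 0.05                   # baseline chance each becomes a wildfire
relative_increase = 0.01           # each made 1% more likely by a distal cause

expected_base = n_fire_prone_situations * base_prob
expected_now = expected_base * (1 + relative_increase)
print(f"extra fires per year: {expected_now - expected_base:.0f}")
```

A shift no individual would ever notice in any one place still adds dozens of extra fires a year under these made-up numbers, which is the point about distal causes mattering in aggregate.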

That’s what the “did climate change cause XYZ?” questions are always really about: how proximate a cause climate change is, how immediate its effects are in our lives, how close it is.

There is, of course, a constant temptation among climate hawks to exaggerate how proximate it is, since, all things being equal, proximity = salience. But I don’t think that simply saying “climate change caused the fires” is necessarily false or exaggerated, any more than saying “drought caused the fires” is. The fact that the former strikes many people as suspect while the latter is immediately understood mostly just means that we’re not used to thinking of climate change as a distal cause among others.

That’s why we reach for awkward language like, “fires like this are consonant with what we would expect from climate change.” Not because that’s the way we discuss all distal causes — it’s clearly not — but simply because we’re unaccustomed to counting climate change among those causes. It’s an unfamiliar habit. As it grows more familiar, I suspect we’ll quit having so many of these tedious semantic disputes.

And I’m afraid that, in coming years, it will become all too familiar.

*   *   *

 

Perspective On The Hot and Dry Continental USA For 2012 Based On The Research Of Judy Curry and Of McCabe Et Al 2004

http://pielkeclimatesci.wordpress.com

Photo from June 26, 2012 showing the start of the Flagstaff fire near Boulder, Colorado

I was alerted to an excellent presentation by Judy Curry [h/t to Don Bishop] which provides an informative explanation of the current hot and dry weather in the USA. The presentation is titled

Climate Dimensions of the Water Cycle by Judy Curry

First, there is an insightful statement by Judy where she writes in slide 5

CMIP century scale simulations are designed for assessing sensitivity to greenhouse gases using emissions scenarios. They are not fit for the purpose of inferring decadal scale or regional climate variability, or assessing variations associated with natural forcing and internal variability. Downscaling does not help.

We need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc). Permit creatively constructed scenarios as long as they can’t be falsified as incompatible with background knowledge.

With respect to the current hot and dry weather, the paper referenced by Judy in her PowerPoint talk

Gregory J. McCabe, Michael A. Palecki, and Julio L. Betancourt, 2004: Pacific and Atlantic Ocean influences on multidecadal drought frequency in the United States. PNAS 2004 101 (12) 4136-4141; published ahead of print March 11, 2004, doi:10.1073/pnas.0306738101

has the abstract [highlight added]

More than half (52%) of the spatial and temporal variance in multidecadal drought frequency over the conterminous United States is attributable to the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). An additional 22% of the variance in drought frequency is related to a complex spatial pattern of positive and negative trends in drought occurrence possibly related to increasing Northern Hemisphere temperatures or some other unidirectional climate trend. Recent droughts with broad impacts over the conterminous U.S. (1996, 1999–2002) were associated with North Atlantic warming (positive AMO) and northeastern and tropical Pacific cooling (negative PDO). Much of the long-term predictability of drought frequency may reside in the multidecadal behavior of the North Atlantic Ocean. Should the current positive AMO (warm North Atlantic) conditions persist into the upcoming decade, we suggest two possible drought scenarios that resemble the continental-scale patterns of the 1930s (positive PDO) and 1950s (negative PDO) drought.

They also present the figure below with the title “Impact of AMO, PDO on 20-yr drought frequency (1900-1999)”. The figures correspond to A: Warm PDO, cool AMO; B: Cool PDO, cool AMO; C: Warm PDO, warm AMO; and D: Cool PDO, warm AMO.

The current Drought Monitor analysis shows a remarkable agreement with D, as shown below

As Judy shows in her talk (slide 8), since 1995 we have been in a warm phase of the AMO and have entered a cool phase of the PDO. This corresponds to D in the above figure. Thus the current drought and heat are not unprecedented but are part of the variations in atmosphere-ocean circulation features that we have seen in the past. This reinforces what Judy wrote, that

[w]e need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses

in our assessment of risks to key resources due to climate. Insightful discussions of the importance of these circulation features are also presented, as just a few excellent examples, by Joe D’Aleo and Joe Bastardi on ICECAP, by Bob Tisdale at Bob Tisdale – Climate Observations, and in posts on Anthony Watts’s weblog Watts Up With That.
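The AMO/PDO quadrant labels from the McCabe et al. figure can be sketched as a simple lookup. This is purely an illustrative mapping of the post's figure labels; the function and names are hypothetical, not code from the paper:

```python
# Illustrative sketch (not from McCabe et al.): the four PDO/AMO phase
# combinations labeled A-D in their 20-yr drought-frequency figure.
QUADRANTS = {
    ("warm", "cool"): "A",  # warm PDO, cool AMO
    ("cool", "cool"): "B",  # cool PDO, cool AMO
    ("warm", "warm"): "C",  # warm PDO, warm AMO
    ("cool", "warm"): "D",  # cool PDO, warm AMO
}

def drought_quadrant(pdo_phase: str, amo_phase: str) -> str:
    """Return the McCabe et al. figure label for a (PDO, AMO) phase pair."""
    return QUADRANTS[(pdo_phase, amo_phase)]

# The post's claim: since ~1995 the AMO has been warm, and the PDO has
# entered a cool phase, which maps to quadrant D.
print(drought_quadrant("cool", "warm"))  # -> D
```

The 2012 conditions (cool PDO, warm AMO) land in quadrant D, the combination the paper associates with elevated drought frequency over much of the conterminous U.S.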

 

*   *   *

Hotter summers could be a part of Washington’s future

http://www.washingtonpost.com

Published: July 5

As relentless heat continues to pulverize Washington, the conversation has evolved from when will it end to what if it never does?

Are unbroken weeks of sweltering weather becoming the norm rather than the exception?

The answer to the first question is simple: Yes, it will end. Probably by Monday.

The answer to the second, however, is a little more complicated.

Call it a qualified yes.

“Trying to wrap an analysis around it in real time is like trying to diagnose a car wreck as the cars are still spinning,” said Deke Arndt, chief of climate monitoring at the National Climatic Data Center in Asheville, N.C. “But we had record heat for the summer season on the Eastern Seaboard in 2010. We had not just record heat, but all-time record heat, in the summer season in 2011. And then you throw that on top of this [mild] winter and spring and the year to date so far, it’s very consistent with what we’d expect in a warming world.”

Nothing dreadfully dramatic is taking place — the seasons are not about to give way to an endless summer.

Heat-trapping greenhouse gases pumped into the atmosphere may be contributing to unusually hot and long heat waves — the kind of events climate scientists have long warned will become more common. Many anticipate a steady trend of ever-hotter average temperatures as human activity generates more and more carbon pollution.

To some, the numbers recorded this month and in recent years fit together to suggest a balmy future.

“We had a warm winter, a cold spring and now a real hot summer,” said Jessica Miller, 21, a visitor from Ohio, as she sat on a bench beneath the trees in Lafayette Square. “I think the overall weather patterns are changing.”

Another visitor, who sat nearby just across from the White House, shared a similar view.

“I think it’s a natural changing of the Earth’s average temperatures,” said Joe Kaufman, a Pennsylvanian who had just walked over from Georgetown.

Arndt said he expects data for the first half of this year will show that it was the warmest six months on record. Experts predict that average temperatures will rise by 3 to 5 degrees by mid-century and by 6 to 10 degrees by the end of the century.

If that worst prediction comes true, 98 degrees will become the new normal at this time of year in Washington 88 years from now.
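The arithmetic behind that figure is a quick back-of-envelope sum, assuming a seasonable early-July high of about 88 degrees (the article cites highs in the "upper 80s") plus the upper end of the 6-to-10-degree projection:

```python
# Back-of-envelope check of the article's "98 degrees, 88 years from now"
# claim. The 88 F baseline is an assumption drawn from the article's
# "upper 80s" seasonable highs; the 10 F rise is the worst-case end of
# the 6-10 degree end-of-century projection.
seasonable_high_f = 88          # assumed present-day normal high
worst_case_rise_f = 10          # upper end of the projected range
new_normal_f = seasonable_high_f + worst_case_rise_f
years_until_2100 = 2100 - 2012  # the article was published in 2012
print(new_normal_f, years_until_2100)  # -> 98 88
```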

Will every passing year till then break records?

“Not so much record-breaking every year,” Arndt said. “But we’ll break records on the warm end more often than on the cold end, that’s for sure. As we continue to warm, we will be flirting with warm records much more than with cold records, and that’s what’s played out over much of the last few years.”

If the present is our future, it may be sizzling. The current heat wave has had eight consecutive days of 95-degree weather. The temperature may reach 106 on Saturday, and the first break will come Monday, when a few days of more seasonable highs in the upper 80s are expected.

The hot streak began June 28 and peaked the next day with a 104-degree record-breaker, the hottest temperature ever recorded here in June. That broke a record of 102 set in 1874 and matched in June 2011.

 

 

Coastal N.C. counties fighting sea-level rise prediction (News Observer)

MON, MAY 28, 2012 10:50 PM

BY BRUCE HENDERSON
The News & Observer Publishing Company

State lawmakers are considering a measure that would limit how North Carolina prepares for sea-level rise, which many scientists consider one of the surest results of climate change.

Federal authorities say the North Carolina coast is vulnerable because of its low, flat land and thin fringe of barrier islands. A state-appointed science panel has reported that a 1-meter rise in sea level is likely by 2100.

The calculation, prepared for the N.C. Coastal Resources Commission, was intended to help the state plan for rising water that could threaten 2,000 square miles. Critics say it could thwart economic development on just as large a scale.

A coastal economic development group called NC-20 attacked the report, insisting the scientific research it cited is flawed. The science panel last month confirmed its findings, recommending that they be reassessed every five years.

But NC-20, named for the 20 coastal counties, appears to be winning its campaign to undermine them.

The Coastal Resources Commission agreed to delete references to planning benchmarks – such as the 1-meter prediction – and new development standards for areas likely to be inundated.

The N.C. Division of Emergency Management, which is using a $5 million federal grant to analyze the impact of rising water, lowered its worst-case scenario prediction from 1 meter (about 39 inches) to 15 inches by 2100.

Politics and economics in play

Several local governments on the coast have passed resolutions against sea-level rise policies.

When the General Assembly convened this month, Republican legislators went further.

They circulated a bill that authorizes only the coastal commission to calculate how fast the sea is rising. It said the calculations must be based only on historic trends – leaving out the accelerated rise that climate scientists widely expect this century if warming increases and glaciers melt.

The bill, a substitute for an unrelated measure the N.C. House passed last year, has not been introduced. State legislative officials say they can’t predict how it might be changed, or when or whether it will emerge.

Longtime East Carolina University geologist Stan Riggs, a science panel member who studies the evolution of the coast, said the 1-meter estimate is squarely within the mainstream of research.

“We’re throwing this science out completely, and what’s proposed is just crazy for a state that used to be a leader in marine science,” he said of the proposed legislation. “You can’t legislate the ocean, and you can’t legislate storms.”

NC-20 Chairman Tom Thompson, economic development director in Beaufort County, said his members – many of them county managers and other economic development officials – are convinced that climate changes and sea-level rises are part of natural cycles. Climate scientists who say otherwise, he believes, are wrong.

The group’s critiques quote scientists who believe the rate of sea-level rise is actually slowing. NC-20 says the state should rely on historical trends until acceleration is detected. The computer models that predict a quickening rate could be inaccurate, it says.

“If you’re wrong and you start planning today at 39 inches, you could lose millions of dollars in development and 2,000 square miles would be condemned as a flood zone,” Thompson said. “Is it really a risk to wait five years and see?”

State planners concerned

State officials say the land below the 1-meter elevation would not be zoned as a flood zone and off-limits to development. Planners say it’s crucial to allow for rising water when designing bridges, roads, and sewer lines that will be in use for decades.

“We’re concerned about it,” said Philip Prete, an environmental planner in Wilmington, which will soon analyze the potential effects of rising water on infrastructure. “For the state to tie our hands and not let us use the information that the state science panel has come up with makes it overly restrictive.”

Other states, he said, are “certainly embracing planning.”

Maine is preparing for a rise of up to 2 meters by 2100, Delaware 1.5 meters, Louisiana 1 meter and California 1.4 meters. Southeastern Florida projects up to a 2-foot rise by 2060.

Dueling studies

NC-20 says the state should plan for 8 inches of rise by 2100, based on the historical trend in Wilmington.

The science panel based its projections on records at the northern coast town of Duck, where the rate is twice as fast, and factored in the accelerated rise expected to come later. Duck was chosen, the panel said, because of the quality of its record and its site on the open ocean.

The panel cites seven studies that project global sea level will rise as much as 1 meter, or more, by 2100. The Intergovernmental Panel on Climate Change estimated in 2007 a rise of no more than 23 inches, but did not factor in the melting land ice that many scientists now expect.

NC-20’s science adviser, Morehead City physicist John Droz, says he consulted with 30 sea-level experts, most of them not named in his latest critique of the panel’s work. He says the 13-member panel failed to do a balanced review of scientific literature, didn’t use the best available science and made unsupported assumptions.

“I’m not saying these people are liars,” Thompson said. “I’m saying they have a passion for sea-level rise and they can’t give it up.”

John Dorman of the N.C. Division of Emergency Management, which is preparing a study of sea-level impact, said an “intense push” by the group and state legislators led to key alterations.

Instead of assuming a 1-meter, worst-case rise, he said, the study will report the impact of seas that rise only 3.9, 7.8, 11.7 and 15.6 inches by 2100. The 1-meter analysis will be available to local governments that request it.

“It’s not the product we had put the grant out for,” Dorman said, referring to the $5 million from the Federal Emergency Management Agency that’s paying for the study. Coastal communities will still find the work useful, he predicts.

The backlash on the coast centers on the question of whether sea-level rise will accelerate, said Bob Emory, chairman of the Coastal Resources Commission.

Emory, who lives in New Bern, said the commission deleted wording from its proposed sea-level rise policy that hinted at new regulations in order to find common ground. “Any remaining unnecessarily inflammatory language that’s still in there, we want to get out,” he said.

New information will be incorporated as it comes out, he said.

“There are people who disagree on the science. There are people who worry about what impact even talking about sea-level rise will have on development,” Emory said. “It’s my objective to have a policy that makes so much sense that people would have trouble picking at it.”

In written comments, the N.C. Department of Environment and Natural Resources said the legislation that circulated earlier this month appeared consistent with the coastal commission’s policy changes.

But the department warned of the “unintended impacts” of not allowing agencies other than the coastal commission to develop sea-level rise policies. The restriction could undermine the Division of Emergency Management’s study, it said, and the ability of transportation and emergency-management planners to address rising waters.

The N.C. Coastal Federation, the region’s largest environmental group, said the bill could hurt local governments in winning federal planning grants. Insurance rates could go up, it says.

Relying solely on historical trends, the group said, is like “being told to make investment decisions strictly on past performance and not being able to consider market trends and research.”

Brazil’s “green” leader skeptical about Rio+20 (Envolverde/IPS)

Envolverde Rio+20
June 11, 2012 – 7:41 a.m.

By Fabíola Ortiz, IPS


Marina Silva. Photo: handout.

Rio de Janeiro, Brazil, June 11, 2012 – The agenda for the United Nations Conference on Sustainable Development, Rio+20, taking place this month in Brazil, still lacks priorities, and its results may be buried beneath the urgencies of the global economic crisis, said former Brazilian environment minister (2003-2009) Marina Silva. The final Rio+20 document remains “weak and general” and contains no contributions that go beyond what has been done over the past 20 years, since the 1992 Earth Summit, Silva said in an interview with journalists from international media.

“The discussion on the green economy, social development and governance has lost momentum, so any general agreement that lacks a critical attitude and does not incorporate instruments to confront the planet’s deterioration will betray the memory of the 1992 summit,” said the former minister, who also ran for president for the Green Party. After winning 20 million votes in the 2010 presidential elections, Silva created the non-governmental Instituto Democracia e Sustentabilidade, which she will represent at the Rio+20 People’s Summit for Social and Environmental Justice, to be held from the 15th to the 23rd of this month, parallel to the official meeting organized by the United Nations (UN).

For Silva, the working framework for Rio+20, whose heads-of-state and heads-of-government segment will run from June 20 to 22, still suffers from a serious underlying problem: ecology and economy appear as separate issues when they should be integrated. “The European Union is attending first to the economic crisis affecting it, United States President Barack Obama has failed in his attempt to advance a climate and biodiversity agenda, and China is not mobilizing and is not taking on commitments,” she said.

Although scientists around the world warn of the serious problems humanity will face if environmental deterioration is not halted, governments are not incorporating proposals for solutions or changes of course into their agendas, Silva cautioned. “The world faces a dramatic crisis made up of multiple crises: economic, political, environmental and of values,” she stressed.

The final round of negotiations ahead of Rio+20 will be completed on the 13th in Rio de Janeiro. At this stage, the current environment minister, Izabella Teixeira, countered Marina Silva’s pessimism, saying the news “is quite promising” for the meeting’s base document. She admitted that an important challenge lies ahead: reaching a consensus that requires commitments and convergence from governments. “We have to facilitate and allow everyone to do their part and to feel committed to the guidelines and results of Rio+20,” she told IPS.

The minister believes the negotiations begun in New York produced “important advances,” and so expects the legacy of Rio+20 to be broader than that of the summit two decades ago, Eco 92, and to fully reflect the concept of sustainability. However, she also acknowledged that “new economic models” must be discussed “or we will not manage to make the shift to sustainable development.”

For her part, Marina Silva said that Brazil, as host of Rio+20, has what it takes to break with the 20th-century model and to act as a negotiating bridge in the search for commitments. “It is very important that Brazil take on a leadership role to broker solutions with effective proposals for this meeting, or risk burying the memory of Eco 92,” she warned. Criticizing the administration of President Dilma Rousseff, Silva said she hopes the government will correct its course and lead a “new agenda of economy and sustainable development.”

Brazil cannot yet consider itself a socio-environmental power, despite holding 11 percent of the world’s freshwater reserves and 20 percent of its living species, having 60 percent of its territory covered by forests, and being home to 280 indigenous peoples who speak some 120 languages besides Portuguese, Silva emphasized. “This does not make us an environmental power by nature; it must be achieved through effective political action. Our agriculture has the conditions to rest on a sustainable base, and we cannot repeat the mistakes made by the industrialized countries,” she said.

The approval of the controversial Forest Code marked a setback in Brazilian environmental policy and calls into question the government’s leadership in pursuing a sustainable economy, Silva warned. “We are living through a moment of doubt. It seems we may regress to an economy like that of the last century. Yet it is still possible to reduce poverty, achieve economic growth and cut emissions with less devastation,” she noted.

On the eve of Rio+20, Brazil is dismantling its environmental legislation, especially the Forest Code, which is moving forward even though 80 percent of respondents in various polls say they disagree with the changes. “Brazil does not need to deforest to remain a major grain producer; we can double agricultural production without felling a single tree,” Silva explained. “We have the technology and knowledge to avoid expanding the agricultural frontier. We can produce food while preserving the natural base of our development,” she concluded.

The thermometer for measuring Rio+20 (IPS)

Envolverde Rio+20
June 12, 2012 – 9:33 a.m.

By Thalif Deen, IPS


Floods, like the one that hit Dominica in 2011, are among the devastating effects of climate change. Photo: Desmond Brown/IPS

United Nations, June 12, 2012 – When the United Nations Conference on Sustainable Development, Rio+20, wraps up after its June 20-22 run in Rio de Janeiro, what will be the criteria for measuring its successes and failures? UN Secretary-General Ban Ki-moon has his own yardsticks.

According to Ban, Rio+20 should conclude with at least a renewed political commitment to the green economy, a set of sustainable development goals, an institutional framework to implement the new action plan, and partnerships with civil society. “We need to invent a new model, a model that offers growth and social inclusion, a model that respects the planet’s finite resources,” Ban told journalists in the first week of this month.

For Patricia Lerner, political adviser at Greenpeace International, however, setting goals for 2030 is not enough. “Attention must be focused on this decade, because the actions taken now are crucial to preventing catastrophic climate change, saving our oceans and protecting the remaining forest resources, all fundamental to human development and well-being,” Lerner told IPS.

For its part, the intergovernmental South Centre, based in Geneva, has higher expectations. Its executive director, Martin Khor, told IPS that it is imperative to reaffirm the commitments adopted at the Earth Summit, held 20 years ago, also in Rio de Janeiro. “At the very least, that will show that leaders, especially from the industrialized countries, are not backtracking on what they agreed,” he added.

The most important principle to be reaffirmed at Rio+20 is that of common but differentiated responsibilities, Khor stressed. This means the countries of the North must accept that they bear a greater duty to reduce the pollution from the greenhouse gas emissions that cause global warming. They must therefore provide financing and technology to the countries of the South, so that everyone can advance along the path of the green economy, added Khor, a former director of the Malaysia office of the Third World Network.

Ban Ki-moon added that there are 26 priority areas, which UN member states themselves identified during the negotiations leading up to Rio+20. Prominent among them are food security, poverty, education, health, renewable energy, oceans, water and sanitation, agriculture, gender equality and the empowerment of women. “It may take time to reach agreement on the 26 areas,” he admitted, but added that agreement should at least be reached on what he considers “must-have issues.”

Ban also urged UN member states to agree on new sustainable development goals based on the Millennium Development Goals, which the world community pledged to achieve by 2015. “We have only two and a half years left,” he stressed.

Khor told IPS that world leaders should recognize that the environmental and economic crisis is far more serious today than it was 20 years ago, and that new systematic commitments must be adopted. He also called for an agreement that significantly strengthens the institutions in charge of sustainable development, in a serious and adequate way.

The UN Commission on Sustainable Development held great promise at its inception but has proved very weak: it meets only two or three times a year. “It has to be radically reformed or transformed into a new council or forum that can face the challenges posed by the global crisis in its three dimensions: environmental, economic and social,” Khor suggested.

The commission, Khor continued, should hold regular meetings, and its secretariat should be strong, with more staff and dynamism. Rio+20 should strengthen that body so it can follow up on the decisions adopted by political leaders, while the United Nations Environment Programme (UNEP) should in turn strengthen its operations, he emphasized. “There must be clear commitments to support developing countries so that they take on greater responsibilities for environmental, social and economic problems,” he observed.

The summit cannot fall behind on implementation, Khor said. “There must be renewed commitment and additional financial resources for sustainable development, and technology transfer on favorable and preferential terms, as was agreed two decades ago in Rio and many times since,” he stressed. “It worries me that in recent months many industrialized countries have signaled that they do not want to keep those commitments. That would be disastrous,” he warned.

Envolverde/IPS

Free application for smartphones with real time information on the Rio+20 conference (IISD)

Free application for smartphones with real time information on the Rio+20 conference, by IISD:

http://itunes.apple.com/us/app/enb-mobile/id529597393?ls=1&mt=8

On the web:

http://enb.iisd.mobi/uncsd-rioplus20/

 

The top five things voters need to know about conservatives and climate change (Grist.org)

By David Roberts, 4 Jun 2012 3:46 PM

Five! (Photo by woodleywonderworks)

I’ve seen a recent surge of stories about conservatives and climate change. None of them, oddly, tell voters what they most need to know on the subject. In fact, one of them does the opposite. (Grrrr …) I respond in accordance with internet tradition: a listicle!
5. Conservatives have a long history of advancing environmental progress. In a column directed to Mitt Romney, Thomas Friedman reels off (one suspects from memory) “the G.O.P.’s long tradition of environmental stewardship that some Republicans are still proud of: Teddy Roosevelt bequeathed us national parks, Richard Nixon the Clean Air Act and the Environmental Protection Agency, Ronald Reagan the Montreal Protocol to protect the ozone layer and George H. W. Bush cap-and-trade that reduced acid rain.” This familiar litany is slightly misleading, attributing to presidents what is mostly the work of Congresses, but the basic point is valid enough: in the 20th century, Republicans frequently played a constructive role on the environment.
4. There is a conservative approach to addressing climate change. Law professor Jonathan Adler has laid it out in the past and does so again in a much-discussed post over at The Atlantic. He suggests prizes for innovation, reduced regulatory barriers to alternative energy, a revenue-neutral carbon tax, and some measure of adaptation. It’ll be no surprise to Adler or anyone else that I believe the problem is more severe than he does; solving it — as opposed to just “doing something” — will involve a far more vigorous government role than he envisions. But he makes an eloquent, principled case for the simple notion that “embrace of limited government principles need not entail the denial of environmental claims.” Conservatives could, if they wanted, spend their time arguing for their preferred solutions rather than denying scientific results.
3. There are conservatives who believe in taking action on climate change. Even those dismal polls we’re always talking about find 30 or 40 percent of Republicans acknowledging the threat of climate change. And support for clean air and clean energy policies remains high across the board. Heck, some — OK, a tiny handful of — conservatives are even brave enough to say so in public! It’s really only the hard nut of the GOP, anywhere from 15 to 30 percent, depending on how you measure, that is intensely and ideologically opposed to climate science and solutions alike. Oh, and almost all Republicans in Congress.
2. Mitt Romney used to say and do moderate things on green issues when he was governor of Massachusetts. He spoke in favor of the Regional Greenhouse Gas Initiative, a cap-and-trade system for Northeastern states, and introduced the Massachusetts Climate Protection Plan. He wasn’t afraid to crack down on coal plants — I never get tired of this remarkable video. Romney also directed considerable state funding to renewable energy companies and waged open war on sprawl. It’s almost like he was running a state where that kind of stuff was popular.
1. The Republican establishment has gone nuts on climate change and the environment. This, more than anything, is what American voters need to know about the Republican Party — not what Republicans used to do, or what one or two outliers say, but what the party as an extant political force is devoted to today. The actually existing GOP wants to dismantle the EPA, open more public land to coal mining and oil drilling, remove what regulatory constraints remain on fossil-fuel companies, slash the budget for clean-energy research and deployment, scrap CAFE and efficiency standards, protect inefficient light bulbs, withdraw from all international negotiations or efforts on climate, and stop the military from using less oil.
Which brings me to the piece that drives me crazy, from National Journal’s customarily excellent Amy Harder: “Campaign Energy Messages Differ; Policies Not So Much.” Seriously? No … seriously? I know journalists don’t headline their own pieces. But the piece itself isn’t much better. Take this bit:

Whether the data is inflated or not, the message that may be coming across most to voters is that there really isn’t much difference between Obama’s policies and those likely to be pursued in a Romney administration.

Ah, so the problem is not that Obama and Romney would have similar energy policies. That’s just the message “coming across most to voters.”

Now, if you’re a journalist, and you determine that voters are receiving a wildly incorrect message, what do you do? Do you write a story about their receipt of the incorrect message? Or do you correct the message?
The fact is, Romney would not pursue the same energy policies that Obama is pursuing. At all. Not even a little bit. It’s interesting, I suppose, that Romney used to run a state (and a state party) where moderate energy policy was demanded by voters. But what matters now is that Mitt Romney serves the present-day Republican Party, which has gone crazy.
The notion that Mitt Romney will rediscover some hidden internal moderate and buck the party on this stuff is just a VSP fantasy. Ever since he started running for president (this time around, anyway), he’s been frantically trying to please the right-wing base. Friedman says Romney’s “biggest challenge in attracting independent swing voters will be overcoming a well-earned reputation for saying whatever the Republican base wants to hear.” But self-styled centrists like Friedman have been saying this kind of thing forever and there remains very little indication that any Republican politician faces a tangible cost for pandering to the right.
Romney will not be elected to follow his heart. He’ll be elected to ratify the GOP agenda. Grover Norquist, a man with as much claim to leadership of the GOP as anyone, made his feelings on the matter extremely clear at CPAC:

All we have to do is replace Obama. … We are not auditioning for fearless leader. We don’t need a president to tell us in what direction to go. We know what direction to go. … We just need a president to sign this stuff. We don’t need someone to think it up or design it. The leadership now for the modern conservative movement for the next 20 years will be coming out of the House and the Senate. … Pick a Republican with enough working digits to handle a pen to become president of the United States. This is a change for Republicans: the House and Senate doing the work with the president signing bills. His job is to be captain of the team, to sign the legislation that has already been prepared. [my emphasis]

Mitt Romney is well aware — and if he wasn’t before, the primary taught him — that his job is to “sign the legislation that has already been prepared.” The leadership of the party is in Congress. It has declared skepticism of climate science the de facto party position. It has declared open war on clean energy, efficiency, and environmental protections. It has made clear that it will support fossil-fuel companies at every juncture.
That’s conservatives and climate for you. It’s interesting, intellectually, that there’s a history of green moderation in the party; that there’s a conceptual space where titular conservative principles overlap with climate protection; that many self-identified Republicans aren’t as crazy as their leaders; and that Romney used to pander in a different direction. But what’s relevant to voters who value climate and environmental protection is that they won’t get any under a GOP administration or a GOP Congress.

Tomgram: Bill McKibben, Climate-Change Deniers Have Done Their Job Well (TomDispatch.com)

Posted by Bill McKibben at 4:40pm, June 3, 2012.

Here’s the thing about climate-change deniers: these days before they sit down to write their blog posts, they have to turn on the AC.  After all, it might as well be July in New York (where I’m writing this), August in Chicago (where a century-old heat record was broken in late May), and hell at the Indy 500.  Infernos have been raging from New Mexico and Colorado, where the fire season started early, to the shores of Lake Superior, where dry conditions and high temperatures led to the third largest wildfire in Michigan’s history.  After a March heat wave for the record books, we now have summer in late spring, the second named tropical storm of the season earlier than ever recorded, and significant drought conditions, especially in the South and Southwest.  In the meantime, carbon dioxide (and other greenhouse gases) continue to head for the atmosphere in record quantities.  And in case anyone living in a big city doesn’t know it, heat can kill.

It’s true that no single event can be pinned on climate change with absolute certainty.  But anyone who doesn’t think we’re in a fierce new world of weather extremes — and as TomDispatch regular Bill McKibben has suggested, on an increasingly less hospitable planet that he calls Eaarth — is likely to learn the realities firsthand soon enough.  Not so long ago, if you really wanted to notice the effects of climate change around you, you had to be an Inuit, an Aleut, or some other native of the far north where rising temperatures and melting ice were visibly changing the landscape and wrecking ways of life — or maybe an inhabitant of Kiribati.  Now, it seems, we are all Inuit or Pacific islanders.  And the latest polling numbers indicate that Americans are finally beginning to notice in their own lives, and in numbers that may matter.

With that in mind, we really do need a new term for the people who insist that climate change is a figment of some left-wing conspiracy or a cabal of miscreant scientists.  “Denial” (or the more active “deniers”) seems an increasingly pallid designation in our new world.  Consider, for instance, that in low-lying North Carolina, a leading candidate for disaster from globally rising sea levels, coastal governments and Republicans in the state legislature are taking action: they are passing resolutions against policies meant to mitigate the damage from rising waters and insisting that official state sea-level calculations be made only on the basis of “historic trends,” with no global warming input.  That should really stop the waters!

In the meantime, this spring greenhouse-gas monitoring sites in the Arctic have recorded a startling first: 400 parts per million of carbon dioxide in the atmosphere.  It’s an ominous line to cross (and so quickly).  As in the name of McKibben’s remarkable organization, 350.org, it’s well above the safety line for what this planet and many of the species on it, including us, can take in the long term, and heat-trapping gases in the atmosphere are still on the rise.  All of this is going to get ever harder to “deny,” no matter what resolutions are passed or how measurements are restricted.  In the meantime, the climate-change deniers, McKibben reports, are finally starting to have troubles of their own. Tom

The Planet Wreckers
Climate-Change Deniers Are On the Ropes — But So Is the Planet

By Bill McKibben

It’s been a tough few weeks for the forces of climate-change denial.

First came the giant billboard with Unabomber Ted Kaczynski’s face plastered across it: “I Still Believe in Global Warming. Do You?” Sponsored by the Heartland Institute, the nerve-center of climate-change denial, it was supposed to draw attention to the fact that “the most prominent advocates of global warming aren’t scientists. They are murderers, tyrants, and madmen.” Instead it drew attention to the fact that these guys had overreached, and with predictable consequences.

A hard-hitting campaign from a new group called Forecast the Facts persuaded many of the corporations backing Heartland to withdraw $825,000 in funding; an entire wing of the Institute, devoted to helping the insurance industry, calved off to form its own nonprofit. Normally friendly politicians like Wisconsin Republican Congressman Jim Sensenbrenner announced that they would boycott the group’s annual conference unless the billboard campaign was ended.

Which it was, before the billboards with Charles Manson and Osama bin Laden could be unveiled, but not before the damage was done: Sensenbrenner spoke at last month’s conclave, but attendance was way down at the annual gathering, and Heartland leaders announced that there were no plans for another of the yearly fests. Heartland’s head, Joe Bast, complained that his side had been subjected to the most “uncivil name-calling and disparagement you can possibly imagine from climate alarmists,” which was both a little rich (after all, he was the guy with the mass-murderer billboards) and a little pathetic.  A whimper had replaced the characteristically confident snarl of the American right.

That pugnaciousness may return: Mr. Bast said last week that he was finding new corporate sponsors, that he was building a new small-donor base that was “Greenpeace-proof,” and that in any event the billboard had been a fine idea anyway because it had “generated more than $5 million in earned media so far.” (That’s a bit like saying that for a successful White House bid John Edwards should have had more mistresses and babies because look at all the publicity!) Whatever the final outcome, it’s worth noting that, in a larger sense, Bast is correct: this tiny collection of deniers has actually been incredibly effective over the past years.

The best of them — and that would be Marc Morano, proprietor of the website Climate Depot, and Anthony Watts, of the website Watts Up With That — have fought with remarkable tenacity to stall and delay the inevitable recognition that we’re in serious trouble. They’ve never had much to work with.  Only one even remotely serious scientist remains in the denialist camp.  That’s MIT’s Richard Lindzen, who has been arguing for years that while global warming is real it won’t be as severe as almost all his colleagues believe. But as a long article in the New York Times detailed last month, the credibility of that sole dissenter is basically shot.  Even the peer reviewers he approved for his last paper told the National Academy of Sciences that it didn’t merit publication. (It ended up in a “little-known Korean journal.”)

Deprived of actual publishing scientists to work with, they’ve relied on a small troupe of vaudeville performers, featuring them endlessly on their websites. Lord Christopher Monckton, for instance, an English peer (who has been officially warned by the House of Lords to stop saying he’s a member) began his speech at Heartland’s annual conference by boasting that he had “no scientific qualification” to challenge the science of climate change.

He’s proved the truth of that claim many times, beginning in his pre-climate-change career when he explained to readers of the American Spectator that “there is only one way to stop AIDS. That is to screen the entire population regularly and to quarantine all carriers of the disease for life.” His personal contribution to the genre of climate-change mass-murderer analogies has been to explain that a group of young climate-change activists who tried to take over a stage where he was speaking were “Hitler Youth.”

Or consider Lubos Motl, a Czech theoretical physicist who has never published on climate change but nonetheless keeps up a steady stream of web assaults on scientists he calls “fringe kibitzers who want to become universal dictators” who should “be thinking how to undo your inexcusable behavior so that you will spend as little time in prison as possible.” On the crazed killer front, Motl said that, while he supported many of Norwegian gunman Anders Breivik’s ideas, it was hard to justify gunning down all those children — still, it did demonstrate that “right-wing people… may even be more efficient while killing — and the probable reason is that Breivik may have a higher IQ than your garden variety left-wing or Islamic terrorist.”

If your urge is to laugh at this kind of clown show, the joke’s on you — because it’s worked. I mean, James Inhofe, the Oklahoma Republican who has emerged victorious in every Senate fight on climate change, cites Motl regularly; Monckton has testified four times before the U.S. Congress.

Morano, one of the most skilled political operatives of the age — he “broke the story” that became the Swiftboat attack on John Kerry — plays rough: he regularly publishes the email addresses of those he pillories, for instance, so his readers can pile on the abuse. But he plays smart, too. He’s a favorite of Fox News and of Rush Limbaugh, and he and his colleagues have used those platforms to make it anathema for any Republican politician to publicly express a belief in the reality of climate change.

Take Newt Gingrich, for instance.  Only four years ago he was willing to sit on a love seat with Nancy Pelosi and film a commercial for a campaign headed by Al Gore.  In it he explained that he agreed with the California Congresswoman and then-Speaker of the House that the time had come for action on climate. This fall, hounded by Morano, he was forced to recant again and again.  His dalliance with the truth about carbon dioxide hurt him more among the Republican faithful than any other single “failing.”  Even Mitt Romney, who as governor of Massachusetts actually took some action on global warming, has now been reduced to claiming that scientists may tell us “in fifty years” if we have anything to fear.

In other words, a small cadre of fervent climate-change deniers took control of the Republican party on the issue.  This, in turn, has meant control of Congress, and since the president can’t sign a treaty by himself, it’s effectively meant stifling any significant international progress on global warming.  Put another way, the various right-wing billionaires and energy companies who have bankrolled this stuff have gotten their money’s worth many times over.

One reason the denialists’ campaign has been so successful, of course, is that they’ve also managed to intimidate the other side. There aren’t many senators who rise with the passion or frequency of James Inhofe, but to warn of the dangers of ignoring what’s really happening on our embattled planet.

It’s a striking barometer of intimidation that Barack Obama, who has a clear enough understanding of climate change and its dangers, has barely mentioned the subject for four years.  He did show a little leg to his liberal base in Rolling Stone earlier this spring by hinting that climate change could become a campaign issue.  Last week, however, he passed on his best chance to make good on that promise when he gave a long speech on energy at an Iowa wind turbine factory without even mentioning global warming. Because the GOP has been so unreasonable, the President clearly feels he can take the environmental vote by staying silent, which means the odds that he’ll do anything dramatic in the next four years grow steadily smaller.

On the brighter side, not everyone has been intimidated.  In fact, a spirited counter-movement has arisen in recent years.  The very same weekend that Heartland tried to put the Unabomber’s face on global warming, 350.org conducted thousands of rallies around the globe to show who climate change really affects. In a year of mobilization, we also managed to block — at least temporarily — the Keystone pipeline that would have brought the dirtiest of dirty energy, tar-sands oil, from the Canadian province of Alberta to the Gulf Coast.  In the meantime, our Canadian allies are fighting hard to block a similar pipeline that would bring those tar sands to the Pacific for export.

Similarly, in just the last few weeks, hundreds of thousands have signed on to demand an end to fossil-fuel subsidies. And new polling data already show more Americans worried about our changing climate, because they’ve noticed the freakish weather of the last few years and drawn the obvious conclusion.

But damn, it’s a hard fight, up against a ton of money and a ton of inertia. Eventually, climate denial will “lose,” because physics and chemistry are not intimidated even by Lord Monckton. But timing is everything — if he and his ilk, a crew of certified planet wreckers, delay action past the point where it can do much good, they’ll be able to claim one of the epic victories in political history — one that will last for geological epochs.

Bill McKibben is Schumann Distinguished Scholar at Middlebury College, founder of the global climate campaign 350.org, a TomDispatch regular, and the author, most recently, of Eaarth: Making a Life on a Tough New Planet.

Copyright 2012 Bill McKibben

The Climate-Change Deniers Meet a Radical Who Is Their Match: Nature!

by Alexandre Araújo Costa on Thursday, June 7, 2012 at 9:50pm (published on Facebook).

The alternative title of this article could well have included “the IPCC is small potatoes” or, more inclusively, “we climate scientists are small potatoes.” That is because, in fact, the contest between us and the deniers is far from a fair one.

Scientific debate takes place at conferences and congresses and, above all, in the peer-reviewed literature. A scientific paper (which sometimes synthesizes years of work) takes many months to be published, often after multiple rounds of revision, and its conclusions are generally restricted to a small aspect of the science and, because they must be presented in technical form, accessible only to a very narrow audience of specialists in the field. Despite its imperfections (errors in scientific papers can and do occur), its often fragmented perspective, and its generally opaque appearance to the broader public, it is these partial contributions, these small advances and setbacks, that pave the way, particularly nowadays, to solid scientific conclusions.

The clarity of our understanding of current climate change, of the decisive human role, and of the risk involved is the fruit of this accumulation of evidence. Climate science built its edifice on more fundamental knowledge from physics, chemistry, astronomy, biology, and geology and the interfaces among them, and on a colossal quantity of data, analyses, and models of varying levels of complexity. In this context, when the pieces fit together, quantity (the multiple partial but convergent lines of evidence) turns into quality (the “big picture”): a puzzle assembled and demonstrated over and over again in the IPCC assessment reports.

The deniers, by contrast, do not follow the rules of scientific debate and method. On the contrary, they attack them without ceremony. One can make any deranged claim in a blog post, a lecture, a “debate” (of the kind that more closely resembles an election debate), or a media appearance. The freedom to lie, fantasize, and obfuscate in such settings is nearly infinite, and for anyone committed to scientific truth it is hard to keep up with even a small fraction of these lies, falsifications, and evasions. Explaining why a given claim is false takes far more work than making it. Exposing untruths, debunking cherry-picking (the act of choosing the one datum among a thousand that seems to support a given claim), and spotting sophistry is not trivial in the little time or space available on that terrain. And that terrain is exactly where the deniers like to fight. It is there, and not in a genuinely scientific and honest debate, that they try to poison public opinion and decision makers. They should be ashamed. But no! On that terrain, a denier who is a talented speaker (or writer), whose face does not flinch even when making obviously false claims such as “the greenhouse effect does not exist” or “climate models do not account for ocean currents,” has a field day.

Few scientists, therefore, end up stepping into this gladiatorial arena to face the deniers’ no-holds-barred tactics. Individually, there is nothing to gain by doing so; quite the opposite. One loses time and energy that could be devoted to research and scientific output (which, unfortunately, is evaluated by quantitative metrics that do not always reflect a real contribution to science). There are also scientists who believe it is not their role to popularize science, or even to combat pseudoscience and anti-science before the public. Finally, there is a factor that should not be underestimated: given the virulence of the attacks and the pronounced dishonesty of the deniers, many of my peers simply prefer not to fight on their turf. It really does take a strong stomach!

Fortunately, though, the deniers have an adversary who is their match, one that, unlike us, does not have to walk on eggshells! A hard, blunt adversary that goes straight to the point, that is not intimidated, that makes no value judgments, that has no ideology. It is this adversary, and not the IPCC or the rest of the climate-science community, that has offered the most crystal-clear rebuttal to the deniers. Its name is Nature! Nature does not have to worry about testing its own hypotheses multiple times, nor about slowly peer-reviewing an analysis of its own laws. It simply is. It simply behaves according to its own rules. It simply does! And it hits denial hard!

Let us check, then, what Nature has been telling us. The IPCC temperature projections made in preparation for its fourth assessment report have been confirmed quite clearly, as the following figure shows, obtained from http://www.realclimate.org/index.php/archives/2012/02/2011-updates-to-model-data-comparisons/.

Comparison of IPCC model temperature projections with observations

The gray region in this figure represents the range of the IPCC ensemble projections (such that 95% of the projections fall within it). The black line is their mean. The colored lines represent observations of global mean temperature according to three research centers. Low solar activity and the more frequent occurrence of La Niña events (in which the equatorial Pacific cools) in recent years may have slowed the warming seen in the 1990s, but what is frightening is that, under conditions like those of recent years, we should have observed a cooling of the Earth system! In other words, we are left with a lump in our throats, waiting for what may come in the next period in which higher solar activity coincides with a higher frequency of El Niño events (when the equatorial Pacific warms)… Everything indicates that, in that case, instead of rising at a pace slightly below but close to the mean of the IPCC model ensemble, temperatures will once again show a pronounced increase.
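The kind of comparison shown in that figure (an ensemble mean, a band containing 95% of the runs, and observations checked against them) can be sketched with synthetic data. This is an illustration only: the 50-run ensemble, the trend, and the noise levels below are invented numbers, not real IPCC or CMIP output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an ensemble of model projections: 50 runs,
# 30 years of global-mean temperature anomaly, each run = a common
# warming trend plus its own internal variability (all values invented).
years = np.arange(1990, 2020)
trend = 0.02 * (years - years[0])               # ~0.2 degC per decade
runs = trend + rng.normal(0.0, 0.1, size=(50, years.size))

# The figure's "gray band": the envelope containing 95% of the runs,
# plus the ensemble mean (the figure's black line).
lo, hi = np.percentile(runs, [2.5, 97.5], axis=0)
mean = runs.mean(axis=0)

# An observed series is consistent with the projections if it stays
# inside the envelope most of the time.
obs = trend + rng.normal(0.0, 0.1, size=years.size)
inside = np.mean((obs >= lo) & (obs <= hi))
print(f"fraction of years inside the 95% envelope: {inside:.2f}")
```

With observations drawn from the same distribution as the runs, the inside fraction comes out near 0.95; a real observed series falling persistently outside the band would indicate a model-data mismatch.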

In fact, even if we had observed flat temperatures, or even a slight cooling, in recent years, that could not serve as an argument for the deniers! La Niña events and a quiet sun should have cooled the planet, which obviously did not happen, because of the anthropogenic contribution. The figures I show next are meant to explain, didactically, the superposition of these two contributions (natural and human). Natural processes could, in principle, be considered cyclic or quasi-cyclic. There are many cycles, of different frequencies, and a good dose of “chaos” (which gives some climate processes an appearance of randomness), but for simplicity we will assume a single simple oscillation, with the temperature rising for some years, falling over the following years, then rising again, and so on, as in the figure below.

Idealized representation of the “purely natural” temperature variation

But there is also the human contribution, which is obviously a warming one (as I have discussed before, accumulating CO2 and other greenhouse gases in the atmosphere cannot produce anything else!). This would be represented by an ascending curve, that is, one with the temperature always rising. Since the anthropogenic contribution has been accelerating, a graph of the “purely human” contribution might look like this curve:

Simplified representation of the anthropogenic contribution to temperature change

Adding the two contributions, i.e., the “natural” one represented by the first curve and the “anthropogenic” one represented by the second, yields something quite interesting, illustrating in part what has already been seen and in part what should be expected in the future. At the beginning (blue band), the human signal is very small and the natural oscillations dominate entirely. In the period immediately afterward, the natural oscillations still stand out, but the human signal grows and becomes discernible, even while still relatively small. Next (orange band), we get something like accelerated temperature increases alternating with periods of little variation. In my view, we are still in that phase, but two things must be said. In the next cycle in which the human signal and the natural signal both contribute to warming (as at the spot indicated by the arrow in our caricature below), we should experience a faster warming of the climate system than that seen in the 1990s. And more! In the period after that (red band), the anthropogenic signal tends to become dominant! This is represented by the end of the graph, where even under naturally cooling conditions (a less active sun, more frequent La Niña events), the warming barely decelerates!

Representation of the result of superposing the natural and anthropogenic effects, going from a state dominated by the natural factor to a state dominated by the anthropogenic factor
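The superposition sketched in these figures can also be written down numerically. The amplitude, period, and trend coefficient below are illustrative choices for the caricature, not values fitted to any data.

```python
import numpy as np

# Caricature of the superposition: a quasi-cyclic "natural" signal plus an
# accelerating "anthropogenic" warming signal (all parameters invented).
t = np.linspace(0, 100, 1001)                  # time in years, arbitrary scale
natural = 0.2 * np.sin(2 * np.pi * t / 20)     # simple oscillation, 20-yr period
anthropogenic = 0.00025 * t**2                 # accelerating upward trend
total = natural + anthropogenic

# Early on, the oscillation dominates: during a downswing of the cycle
# (t from ~5 to ~15, peak to trough), the total still cools.
early_downswing = total[(t > 5) & (t < 15)]
# Late in the series, the anthropogenic term dominates: during the
# corresponding downswing (t from ~85 to ~95), the total keeps warming.
late_downswing = total[(t > 85) & (t < 95)]

print("early downswing net change:", early_downswing[-1] - early_downswing[0])
print("late downswing net change:", late_downswing[-1] - late_downswing[0])
```

The early segment shows a net cooling (the blue band of the figure) while the late segment shows a net warming despite the natural downswing (the red band), which is exactly the transition the paragraph describes.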

But it is other components of the Earth’s climate system, being less affected by high-frequency natural oscillations than the atmosphere, that have been even more emphatic in “denying the denial.”

Let us first compare the projections of sea-level rise with what has actually been happening. It is easy to see that only the most “pessimistic” or “catastrophist” model (I dislike such value judgments) has kept pace with reality (see http://www.global-warming-forecasts.com/resources/sea-level-increase.png , reproduced below, and http://www.skepticalscience.com/images/SLR_models_obs.gif). In that figure, the observations (represented by the red line, from tide gauges, and the blue line, from satellites) lie systematically above the gray band, which contains most of the model projections. Another process for which reality has turned out worse than the projections in the latest IPCC report is the melting of Arctic sea ice, as shown in http://www.realclimate.org/images/seaice11.jpg and reproduced here. The actual melt (red line) has been faster than any model projection (the various other lines). Even if, in these two cases, processes other than anthropogenic global warming may have helped push the changes (sea-level rise and ice melt) beyond the IPCC’s worst projections, Nature has been speaking loudly. Humanity, ridiculously, turns a deaf ear.

Comparison of IPCC model sea-level-rise projections with observations

Comparison of IPCC model Arctic sea-ice projections with observations

One of the most respected paleoclimatologists in the world, and a scientist of inspiring and contagious dedication, Prof. Richard Alley of Pennsylvania State University, likes to put the question very simply. Reality may indeed turn out not as bad as the IPCC picture, which is based, as I showed in my previous text (http://www.facebook.com/notes/alexandre-ara%C3%BAjo-costa/a-nega%C3%A7%C3%A3o-das-mudan%C3%A7as-clim%C3%A1ticas-e-seu-desprop%C3%B3sito-versus-a-objetividade-e-ati/399810206727544), on mid-range estimates of various quantities, processes, and phenomena. But it is essential to say that there is an equal chance of it being even worse, and if what I have shown may not be entirely clear evidence in that direction, it should at least serve as a warning! Under such conditions, failing to act to reduce greenhouse-gas emissions globally is a posture of total irrationality, irresponsibility, and contempt for the rights and aspirations of future generations!

USP Scientists Remain Faithful to the IPCC (Jornal do Campus-USP) + letter from Ricardo Felício


Experts defend the credibility of the world climate panel and give their views on global warming

Since November of last year, the IPCC has been going through a credibility crisis. Errors were found in the report that earned the panel the Nobel Peace Prize in 2007, the most serious of them a prediction that Himalayan glaciers would melt by 2035. Around the same time, e-mail exchanges among its scientists came to light insinuating that research denying global warming would not be assessed by the IPCC. The case became known as Climategate.

These events served as ammunition for the skeptics, those who argue that global warming is a natural phenomenon with precedents throughout history and has no relation to human actions on the planet.

The controversy returned to the media spotlight in February, when the panel’s executive secretary resigned. The institution’s chairman, Rajendra Pachauri, immediately announced that he would spare no effort to propose a set of measures ensuring greater scientific rigor in the reports and tighter oversight of the specialists who produce them.

No crisis

Tércio Ambrizzi believes that greater care can prevent incorrect data. Nevertheless, “when looked at as a whole, the IPCC report is very solid,” he maintains. According to him, more than a thousand pages are analyzed, and that volume leaves room for the occasional error to slip through. As for the published e-mails, Ambrizzi says the invasion of privacy is extremely dangerous. “In science, that is not how you prove a scientific result is wrong. You prove it with science.”

Paulo Artaxo shares this view, also emphasizing the effort scientists put into the report: “Two citations that deserve correction do not invalidate the intense work of thousands of scientists over many years.”

Professor Ilana Wainer, president of the Research Committee of the Oceanographic Institute (IO), goes further in defending the IPCC’s credibility. She says she sees no crisis “as far as the science is concerned, which is irrefutable and deterministic.”

The IO scientist is emphatic in stating that Climategate was based on an outright theft of e-mails from researchers at the climate research center at East Anglia (United Kingdom). “There were more than 1,000 e-mails, and the ‘skeptics’ tried to discredit climate science on that basis; all they managed to find was the odd personal e-mail. There is a solid scientific foundation supporting the statement that the global warming of the last four or five decades comes from human action.”

(illustration: Hugo Neto)

Global warming

For Artaxo, humans are significantly altering various aspects of the planet. He cites as examples the burning of fossil fuels over the last 150 years and changes in land use, such as the large-scale replacement of forests by plantations: “The additional accumulation of greenhouse gases in the atmosphere has raised our planet’s mean temperature by 0.7 degrees Celsius over the last 150 years.”

Ambrizzi and Ilana are also categorical in affirming global warming and human interference in it. “Global warming is happening, and human participation in it is unequivocal,” says the IO professor.

Aretha Sanchez, a lawyer and author of research on climate and environmental change carried out at the Institute for Energy and Nuclear Research (Ipen), states that climate changes are documented by records through time. “These changes occur due to factors external or internal to the Earth; among the internal ones is the human presence,” says Aretha.

Giant iceberg

Asked whether the breaking off of the giant iceberg in Antarctica is related to global warming, Ilana explains: “On the continental side there is an accumulation of snow and ice; on the ocean side there is a process known as calving, which is the sudden release and breaking off of a mass of ice from a glacier. The ice that breaks off can be classified as an iceberg. The detachment of this large iceberg can occur normally as part of the glacier’s mass balance. Global warming does favor the intensification of calving and a greater frequency of icebergs, but it is not necessarily associated with their size.”

*   *   *

Open letter from Ricardo Augusto Felicio, professor of climatology in the Department of Geography at FFLCH, addressed to USP’s Jornal do Campus (JC)

It is lamentable and repugnant that this newspaper’s story from the first half of March 2010 reports that USP scientists remain faithful to the IPCC <http://www.jornaldocampus.usp.br/index.php/2010/03/cientistas-da-usp-continuam-fieis-ao-ipcc/>.

You should issue a public retraction for such an absurdity. Many of us researchers at this institution reject the imbecilities preached, in the form of dogma, by the racket imposed by NGOs, the UN, and the interests of foreign governments.

A scientist cannot be faithful, least of all to a political body of the UN that has nothing scientific about it. The newspaper also errs in speaking of the 2,000 scientists. There cannot currently be more than 100 or 200 of them. In 2008 alone, more than 600 dropped out, declaring that they would no longer take part in this collusion. The real number reflects a growing contingent of NGO members, politicians, and bureaucrats who have nothing to do with science. This is the reality that is slow to be demonstrated here in Brazil.

While the fight abroad is fierce, owing to the almost weekly scandals found in the doings of the IPCC and its minions, our press stays silent, failing to carry the great daily discussions on the subject that we see in other countries.

Only pseudoscientists, beholden to economic interests, bow to the IPCC. And from what we see, we have many of them right here.

So we issue the challenge, exactly as it is done abroad: *show the evidence!* We say in advance that we will not accept “I think” or “I believe,” computer-model output, or dogma.

The great proof that they have nothing is their flight from debate and their ridiculous plans, tied to the use of the “precautionary principle, because in the absence of full scientific certainty, immediate mitigation measures must be taken.”

What is the purpose of serious, dedicated scientific research if, in the end, the answer is given in advance: if global warming were true, we should take mitigation measures, but if it is not proven (as it is not), we should take *exactly the same measures*, just as a precaution?

What future remains for climate science if it is no longer heard, since all decisions in its name have already been made? Not to mention the idea of consensus, since everyone has supposedly admitted that man causes “global warming,” also conflated with “climate change.” In these statements alone we can see how utterly contradictory they are.

Not to mention that they also say the debates are over. How can the discussions be closed if they never took place?

They want to overhaul the entire routine of human activity on the basis of lies?! This is completely absurd! The racket has taken on a life of its own. It is high time it was duly neutralized.

Spending funds on the Brazilian Panel on Climate Change (PBMC) will be a fabulous way of making public money disappear, money that could be put to far better use against a real problem: basic sanitation in Brazil!

As for the newspaper's impartiality, it left much to be desired.

Ricardo Augusto Felicio holds a degree in Atmospheric Sciences – Meteorology from USP, a master's in Meteorology from the Instituto Nacional de Pesquisas Espaciais, and a doctorate in Geography (Physical Geography) from USP.

Climate-Change Denial and Its Folly versus the Objectivity and Measured Attitude of the Scientific Community

by Alexandre Araújo Costa on Wednesday, June 6, 2012 at 12:56pm (posted on Facebook)

I have received interesting responses to my earlier posts discussing climate-change denial. Some readers who agree or sympathize with the deniers' view shared links that, democratically, remain on my page, further exposing the crude, elementary errors of denial. As I show, none of the pseudo-arguments presented holds up; I have refuted them myself or pointed to links that debunk them all. In some cases, pointing to material from reliable sites such as http://www.realclimate.org or http://www.skepticalscience.com is easier and more practical, since one trait of the tireless denialist hydra is its recycling of material (in that respect, at least, it is eco-friendly…). Something debunked once can resurface at another time and/or in another country as a "new discovery" showing that "global warming is a hoax", with the whole denialist litany repeated ad nauseam, in the effort to repeat a lie until it sounds true. Unfortunately, as I showed in http://www.facebook.com/notes/alexandre-ara%C3%BAjo-costa/a-nega%C3%A7%C3%A3o-das-mudan%C3%A7as-clim%C3%A1ticas-o-bode-e-os-gamb%C3%A1s-o-que-%C3%A9-uma-opini%C3%A3o-pondera/393911987317366, some of these lies really do show a strong tendency to be perpetuated, such as the grotesque accusations against Michael Mann and other scientists.

Obviously there are also comments from friends who share my views (or at least a good part of them) on what is at stake in climate-change denial. Many of those comments have served as prompts to continue the discussion. Just as it was with the goat, inspired by one of those friends, so it is with the discussion of the notions of a "measured position" or "radical position" raised by another, who asks whether, if "the measured position on global warming is the IPCC's", (…) "it would lie between the radical minimizing/denialist hypotheses and supposed radical alarmist/apocalyptic hypotheses".

The "goat text" hints that one thing does not necessarily lead to the other, giving examples in which intermediate points of view would in fact be impossible, or at best Frankensteinian oddities of thought. That supposition carries a simplification, as if we could plot points of view as geometric points on a line segment, with the position that best approximates reality being a kind of (arithmetic or "weighted") average of the extremes. Clearly, in a debate marked by intellectual honesty, in which the actors are driven by a common interest, the little scheme in the figure below might have some validity. Otherwise, it must be abandoned as an adequate pictorial representation.

It is true that I have not seen any scientist speak of a "runaway greenhouse effect" (which probably happened on Venus) as something that could occur on Earth in any tangible future, or any other "radical" theory in that vein. But when I say that the IPCC tries to express the consensus of the scientific community, I mean that it tries to find quantitative estimates of a given phenomenon, effect, or process (along with the uncertainty around that estimate), based on what is available in the scientific literature and/or in recognized, validated databases.

For processes, phenomena, and components of the climate system whose effect is well known, the "error bar" (uncertainty) is generally small, indicating that the values estimated by different methodologies and different research groups, reported in different papers in the literature, are all very close. Processes for which quantitative knowledge is less precise generally show a larger error bar, expressing exactly that lower degree of understanding as a greater spread among the individual estimates available in the literature.

One of the most important such estimates is of the quantity known as "radiative forcing", that is, the contribution of a given component of the climate system (a greenhouse gas, a type of aerosol, solar variability, etc.) to altering the system's energy balance over a given period. Radiative forcing is the energy gained or lost (positive or negative, respectively) by the climate system due to changes in that component, per unit area per unit time (measured in watts per square meter, W/m2). For example, since several greenhouse gases have seen their concentrations rise since the pre-industrial period, they exert a positive radiative forcing from then to the present, because more energy is now retained in the climate system instead of leaving the planet as infrared radiation. On the other hand, an atmosphere with more aerosols (suspended liquid or solid particles) reflects more light; since their concentration has also risen with industrialization, aerosols contribute a negative radiative forcing, because less energy enters the system as sunlight. A component whose behavior has not changed significantly over the last few centuries therefore exerts a radiative forcing close to zero.

Among the well-studied components with well-consolidated estimates are CO2 and the other long-lived greenhouse gases (methane, nitrous oxide, and halocarbons). In 2005, the estimate was that the anthropogenic increase in atmospheric CO2 (whose concentration was then 379 parts per million and has now passed 390) contributed a forcing of +1.66 W/m2 (with a small uncertainty either way), a very significant value for the warming of Earth's climate system. The other greenhouse gases add something very close to another 1 W/m2.

Even with relatively larger uncertainties, the case of the Sun is an excellent example of how the IPCC reaches a "consensus" estimate. At http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch2s2-7-1-2.html there is a survey of the estimates available in the scientific literature of how much solar activity has varied since the period known as the "Maunder minimum", in the 17th century, up to the present day, something deniers love to talk about. I should note that the estimates come not from papers by climate researchers but from studies by specialists… in the Sun. The estimates are that changes in solar activity since then have contributed from nearly zero (that's right!) up to, at the high end, +0.68 W/m2, as shown at that link. Note that, relying on the specialists who study the Sun, the IPCC could do nothing but attribute to variations in solar activity a rather modest warming radiative forcing of a few tenths of a W/m2. There would be no reason to conclude that the Sun has not varied at all when only two studies out of ten point that way, but it would make no sense either to adopt an estimate near the other extreme. And let this be clear: even if we cherry-picked that highest value, we would still reach the inescapable conclusion that the contribution of solar variability to climate since the 17th century is several times smaller than that of greenhouse gases! Considering estimates starting from the 18th century (rather than the 17th), it becomes even clearer that the Sun's role is still less significant (an estimated radiative forcing of only +0.12 W/m2).
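The consensus procedure described above amounts to pooling the published estimates and reporting their center and spread. A minimal sketch, using hypothetical study values spanning the range quoted in the text (near zero up to +0.68 W/m2; these are illustrative numbers, not the actual AR4 survey figures):

```python
import statistics

# Hypothetical solar-forcing estimates (W/m^2) from ten studies,
# spanning the range quoted in the text (near zero to +0.68).
# Illustrative values only, not the real AR4 survey numbers.
estimates = [0.00, 0.05, 0.10, 0.12, 0.15, 0.20, 0.25, 0.30, 0.45, 0.68]

best = statistics.mean(estimates)    # "best assessment": the center
spread = statistics.stdev(estimates) # uncertainty: the spread

print(f"consensus estimate: {best:+.2f} +/- {spread:.2f} W/m^2")

# The pooled value is a few tenths of a W/m^2, modest compared with
# the roughly +2.7 W/m^2 from long-lived greenhouse gases.
assert best < 0.5
```

Neither extreme of the list dominates the result; outliers widen the error bar instead of shifting the center much, which is the point of averaging over the literature.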

Something similar is done with sulfate aerosols, as in http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch2s2-4-4-1.html, showing that, since their atmospheric concentration has also risen with industrialization, these aerosols exert a cooling effect, offsetting part of the warming effect of greenhouse gases, with radiative-forcing estimates ranging between -0.12 and -0.96 W/m2.

Are there discrepant values among the estimates? There are! But hey, this is not an auction where a denier can walk in and guess any number, much less declare, with no grounding whatsoever, that "the warming is natural" or "it's the Sun that is warming the Earth". One measures, calculates, and submits to peer review. Then, yes, one earns a voice in the scientific debate. Those who abandon the rigor of the method truly have no commitment to approaching scientific truth.

Therefore, the measured opinion does not lie "in the middle" between scientifically grounded opinions and ravings driven by economic or political-ideological agendas or by vanity! It cannot! It lies "in the middle" of what has scientific value! The IPCC's figures are averages of genuine, documented, published measurements and estimates, to which an uncertainty bar is attached. They represent the good sense of not taking as absolute truth any single value measured or estimated by different researchers using different methods (satellite, surface observation, modeling, etc.). Nor do they cling to an extreme value on either side. These estimates are the starting point for obtaining both the best assessment (roughly, the average) and the uncertainty, which depends on the spread of the various estimates, arriving at something like http://www.ipcc.ch/publications_and_data/ar4/wg1/en/fig/figure2-20-l.png, reproduced below.

That figure leaves no room for doubt, and that is why the IPCC spoke out. When the effects are summed, the result is a positive radiative forcing, which can yield nothing other than warming. Where does most of that signal come from? From CO2 and the greenhouse gases. Silence would have been an attitude of extreme irresponsibility and cowardice in a situation where, even taking the lowest estimate of warming by these gases and the highest estimate of cooling by aerosols (the main factor acting in the opposite direction), one still obtains a number indicating that the planet is indeed warming, in a way intrinsically tied to human activities. This position (which, far beyond the IPCC as an institution, is held by more than 97 percent of active researchers in the field; see http://www.skepticalscience.com/global-warming-scientific-consensus-intermediate.htm) is therefore measured, serious, realistic. There is nothing radical about it. If science warns of risks and implies changes in our society that may seem inconvenient, and people prefer to ignore the warning, that is another story! I hope it is clear, then, why not a single millimeter can be conceded to the deniers, in their disservice to public opinion, in their deliberate disinformation, in their openly anti-science campaign.
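The bookkeeping behind that sum can be sketched numerically. The figures below are the AR4-era values quoted in this text (a simplification, not the full IPCC table); the script shows that even the most conservative combination, pairing the warming terms with the strongest aerosol cooling, still yields a positive net forcing:

```python
# Radiative-forcing bookkeeping, using the AR4-era values quoted above.
# All numbers are W/m^2 relative to the pre-industrial period; this is
# an illustrative simplification, not the full IPCC forcing table.
warming = {
    "CO2 (2005 estimate)": 1.66,
    "other long-lived GHGs (approx.)": 1.0,
    "solar variability (since 18th c.)": 0.12,
}
sulfate_cooling = (-0.96, -0.12)  # (strongest, weakest) estimates

total_warming = sum(warming.values())
net_midpoint = total_warming + sum(sulfate_cooling) / 2
net_worst = total_warming + min(sulfate_cooling)  # strongest cooling

print(f"net forcing (midpoint aerosol estimate): {net_midpoint:+.2f} W/m^2")  # +2.24
print(f"net forcing (strongest aerosol cooling): {net_worst:+.2f} W/m^2")     # +1.82

# Even the most conservative combination is positive: net warming.
assert net_worst > 0
```

The aerosol term trims the total but cannot flip its sign: the greenhouse-gas contribution is simply too large relative to the entire cooling range.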

But if you want to know, there does seem to be someone radicalizing in the direction opposite the deniers! Someone who has been showing that some IPCC projections, far from measured, are underestimates, conservative… Ever heard of a certain "Nature"? Well then… But that will be the subject of another article…

Interactive festival lets visitors experience environmental-disaster scenarios (Agência Brasil)

01/6/2012 – 10h42

by Thais Leitão, Agência Brasil

Rio de Janeiro – A forest that catches fire, endangering the animals and vegetation; an intact glacier that suddenly begins to melt; a house that floods. All of these situations, caused by environmental imbalance, can be experienced by the public during Green Nation Fest, an interactive, sensory festival that began today (31) at Quinta da Boa Vista, in Rio de Janeiro's north zone, and runs until June 7.

According to Marcos Didonet, director of the non-governmental organization (NGO) Centro de Cultura, Informação e Meio Ambiente (Cima), which organizes the event, the goal is to bring hands-on experiences to visitors and encourage the public to act more sustainably. For more than 20 years, Cima has run programs in partnership with private, governmental, and multilateral institutions.

"The goal is to reach the general public, who are not used to engaging with environmental issues, by presenting the subject in a more interesting, pleasant, and practical way. To that end, our artists and scientists devised these installations, capable of producing sensations that will become ever more frequent if we do not change our consumption patterns and everyday behavior," he said.

The site also has tents hosting playful, educational workshops. In one of them, set up by the Instituto Estadual do Ambiente (Inea), a group of 30 students from Rio's municipal school system learned today how to make wallets from milk cartons and fabric scraps.

For 14-year-old student Ana Beatriz Leão, the idea is creative and could be used to make gifts for friends. "It's cool because we usually just throw it in the trash, and now we know you can make other things with the carton. The one I made, I'm going to give to a friend who I'm sure will love it," the teenager said.

In the same tent, visitors can see other products made from reused material, such as a small drum kit made from soda cans, children's books made with fabric scraps, and dolls made from shoeboxes.

Among the boys, one of the favorite activities is "Gol de Bicicleta" (Bicycle Goal), in which participants pedal to generate power for their team. For each watt generated, a goal is scored for the team of their choice. In addition, a battery is charged, supplying power to another of the festival's installations.

Friends Gustavo Fonseca and Roberto Damião, both 11 and also students in Rio's municipal schools, said the experience was "very intense".

"It was really cool because we learned another way to generate energy and even scored a goal for Mengão," said Roberto, a Flamengo fan.

The event, which is free to attend, also offers an International Film Showcase, with 12 feature films, and seminars with Brazilian and international guests on the green and creative economy, open to debate. The full program is available at www.greennationfest.com.br.

* Originally published on the Agência Brasil website.

 

North-South Divide Hangs Over Rio+20 (IPS)

01/6/2012 – 10h05

by Thalif Deen, IPS



New York, United States, 1/6/2012 – The 1992 Earth Summit in Rio de Janeiro was largely derailed by the North-South divide: a battle between a coalition of rich industrialized nations and the Group of 77 (G-77), now made up of 134 developing countries.

In some ways, today's divisions are deeper than at the time of the Conference on the Human Environment held in Stockholm in 1972, the first environmental summit, and of the Earth Summit two decades later, said Branislav Gosovic, a former member of the Brundtland Commission on Environment who was part of the South Centre delegation to the 1992 conference.

"The divide will affect both the process and the outcome of Rio+20," said Gosovic, referring to the United Nations Conference on Sustainable Development (Rio+20) in Rio de Janeiro, June 20-22. Gosovic is the author of The Quest for World Environmental Cooperation: The Case of the U.N. Global Environment Monitoring System.

IPS: As a participant in the 1992 Earth Summit, are you confident or skeptical about the outcome of Rio+20?

Branislav Gosovic: I am not optimistic about successes or major breakthroughs. The meeting comes at a difficult moment for the global and national economies, and after 20 years of dominant neoliberal globalization. In the first case, heads of state will be preoccupied with responding to the current crisis, which they do not know how to manage or overcome. In the second, that globalization has undermined the sustainable-development agenda and stalled, or rolled back, some of the policy and conceptual advances made in the earlier period leading up to (and at) the Rio summit.

IPS: What is your view of the Rio+20 document under negotiation?

BG: It keeps many ideas and goals alive. However, weeks before the meeting, bracketed paragraphs (indicating disagreement) and ambiguous wording on very important issues show the lack of consensus and suggest that the international community is heading into a dry spell. Still, I dare to be optimistic about the long term: after a period of neoliberal globalization, given the maturing of many issues and the worsening of the global problems identified in Stockholm 40 years ago, Rio+20 may mark the beginning of 20 more promising years of international cooperation on the way to "Stockholm+60", that is, Rio+40.

IPS: What is the best way to achieve that?

BG: It will take a great deal of work, commitment, and leadership from those countries in a position to offer it, along with the participation of social forces in a genuine global movement. Most importantly, it will entail major structural and paradigm changes in how society is organized, nationally and globally, a key that will open the door to meeting many of today's elusive or unreachable goals. It is no surprise that such changes are resisted and fought tooth and nail, by every means available, by those who oppose them.

IPS: Do you expect a repeat of the North-South divide of 1991 in the current negotiations on the Rio+20 action plan, titled The Future We Want?

BG: The North-South divide has existed for more than 60 years, since the earliest days of the United Nations. It affected and determined the outcome of the Stockholm Conference and the way the environmental agenda was framed, as a sustainable-development agenda. It was present in the report and the first meeting of the Brundtland Commission, that is, the Commission on Environment and Development, and then at Rio 92 and Johannesburg 2002. And, as the current draft of the final document shows, it will play a central role at Rio+20. One could argue that environmental issues have benefited from the international development agenda, and vice versa. Global environmental problems cannot be addressed or solved without the participation of the South and the developing countries as equal partners in the enterprise. The twin summits on environment and development, the United Nations Conference on the Human Environment and the so-called Earth Summit, cannot be made to disappear, as some industrialized countries try to do by seeking out divisions and differences within the South. They will keep doing so until the North changes its policy, takes a position of solidarity, and genuinely adheres to the Rio principle of "common but differentiated responsibilities". Meanwhile, one can observe efforts to turn the environmental agenda into a big business and job-creation opportunity, to cast certain major developing countries as the main threat to the global environment, and to pit small groups of vulnerable developing states against the rest in the climate-change negotiations, in a never-ending effort to divide the Group of 77. In short, the North-South conflict is alive and moving; it will be present at Rio+20 and will remain so for the immediate future.

IPS: How do Agenda 21 and the Rio+20 document compare with the landmark 1987 Brundtland Commission report? Has there been substantial progress since then, and since the Stockholm Conference?

BG: The Rio+20 document is the product of a negotiation process. In that sense, it cannot be compared with the Brundtland Report or the Earth Summit documents, both drafted by teams dedicated to the task over a long period. On the other hand, most of the themes present in the Brundtland Report and in Agenda 21 can be found in the Rio+20 document, albeit worded in a way that reveals the lack of consensus and of commitment to act. There has been progress in many areas, but on the crucial issues and underlying conflicts there has been almost no movement. These will remain of interest and will play an important role at Rio+20. One of those conflicts concerns the North-South divide, the international development agenda, and the related question of the existing global political order, which is being challenged. The other, less visible conflict concerns the nature of the dominant socioeconomic order, or paradigm, which is being questioned as neither socially nor environmentally sustainable. This conflict will be present in both North and South. There has been little practical progress on fundamental questions of this kind. Envolverde/IPS

(IPS)