Tag Archive: Forecast

Climate Change Research Is Globally Skewed (Science Daily)

Jan. 22, 2014 — The supply of climate change knowledge is biased towards richer countries — those that pollute the most and are least vulnerable to climate change — and skewed away from the poorer, fragile and more vulnerable regions of the world. That creates a global imbalance between the countries in need of knowledge and those that build it. This could have implications for the quality of the political decisions countries and regions make to prevent and adapt to climate change, warn the researchers behind the study from the University of Copenhagen.

Climate change research, shown here by number of publications, primarily concerns countries that are less vulnerable to climate change and have a higher emission of CO2. The countries are also politically stable, less corrupt, and have a higher investment in education and research. (Credit: Image courtesy of University of Copenhagen)

“80% of all the climate articles we examined were published by researchers from developed countries, although these countries only account for 18% of the world’s population. That is of concern because the need for climate research is most acute in developing countries. It could have political and societal consequences if there are regional shortages of climate scientists and research to support and provide contextually relevant advice for policy makers in developing countries,” says Professor Niels Strange from the Center for Macroecology, Evolution and Climate, University of Copenhagen, which is supported by the Danish National Research Foundation.


Together with PhD student Maya Pasgaard from the Department of Food and Resource Economics at the University of Copenhagen, Niels Strange analysed over 15,000 scientific papers on climate research from 197 countries. The analysis clearly shows that the research is biased towards countries that are wealthier, better educated, more stable and less corrupt, emit the most carbon, and are less vulnerable to climate change.

As an example, the study shows that almost 30% of the total number of publications concern the United States, Canada and China, while India is the only highly vulnerable country in the top 10 list. However, Greenland and small island states like the Seychelles and the Maldives, which are generally considered vulnerable, also make the top 10 list when publications are counted per capita.
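
A quick sketch shows why per-capita normalisation reshuffles such rankings. The publication counts and populations below are invented purely for illustration; they are not figures from the study’s dataset:

```python
# Invented publication counts and populations, purely for illustration;
# not figures from the Pasgaard & Strange dataset.
pubs = {"USA": 4200, "China": 1800, "Maldives": 12}
population = {"USA": 316e6, "China": 1357e6, "Maldives": 0.35e6}

per_capita = {c: pubs[c] / population[c] for c in pubs}

by_total = sorted(pubs, key=pubs.get, reverse=True)
by_capita = sorted(per_capita, key=per_capita.get, reverse=True)

print("by total count:", by_total)    # big producers dominate
print("per capita:    ", by_capita)   # small island states jump up the list
```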

The content of climate studies is also skewed

The study shows that not only the authorship, but also the choice of topic in climate research, is geographically skewed:

Articles from Europe and North America are more often oriented towards climate change mitigation, such as emission reductions, compared with articles from the southern hemisphere. In contrast, climate research from Africa and Latin America deals more with climate change adaptation and impacts, such as droughts and diseases.

“The tendency is a geographical bias where climate knowledge is produced mainly in the northern hemisphere, while the most vulnerable countries are found in the southern hemisphere. The challenge for the scientific community is to improve cooperation and knowledge sharing across geographical and cultural barriers, but also between practitioners and academics. Ultimately, it will require financial support and political will, if we as a society are to address this imbalance in the fight against climate change,” says Maya Pasgaard. The study was recently published online in the journal Global Environmental Change.

Journal Reference:

  1. M. Pasgaard, N. Strange. A quantitative analysis of the causes of the global climate change research distribution. Global Environmental Change, 2013; 23 (6): 1684. DOI: 10.1016/j.gloenvcha.2013.08.013

An insider’s story of the global attack on climate science (The Conversation)

23 January 2014, 6.40am AEST

Stormy weather hits New Zealand’s capital, Wellington. Flickr.com/wiifm69 (Sean Hamlin)

A recent headline – Failed doubters trust leaves taxpayers six-figure loss – marked the end of a four-year epic saga of secretly-funded climate denial, harassment of scientists and tying-up of valuable government resources in New Zealand.

It’s likely to be a familiar story to my scientist colleagues in Australia, the UK, USA and elsewhere around the world.

But if you’re not a scientist, and are genuinely trying to work out who to believe when it comes to climate change, then it’s a story you need to hear too. Because while the New Zealand fight over climate data appears finally to be over, it’s part of a much larger, ongoing war against evidence-based science.

From number crunching to controversy

In 1981, as part of my PhD work, I produced a seven-station New Zealand temperature series, known as 7SS, to monitor historic temperature trends and variations from Auckland to as far south as Dunedin in southern New Zealand.

A decade later, in 1991-92 while at the NZ Meteorological Service, I revised the 7SS using a new homogenisation approach to make New Zealand’s temperature records more accurate, such as adjusting for when temperature gauges were moved to new sites.

The Kelburn Cable Car trundles up into the hills of Wellington. Shutterstock/amorfati.art

For example, in 1928 Wellington’s temperature gauge was relocated from an inner suburb near sea level up into the hills at Kelburn, where – due to its higher, cooler location – it recorded much cooler temperatures for the city than before.

With statistical analysis, we could work out how much Wellington’s temperature has really gone up or down since the city’s temperature records began back in 1862, and how much of that change was simply due to the gauge being moved uphill. (You can read more about re-examining NZ temperatures here.) So far, so uncontroversial.

But then in 2008, while working for a NZ government-owned research organisation, the National Institute of Water and Atmospheric Research (NIWA), we updated the 7SS. And we found that at those seven stations across the country, from Auckland down to Dunedin, between 1909 and 2008 there was a warming trend of 0.91°C.

Soon after that, things started to get heated.

The New Zealand Climate Science Coalition, linked to a global climate change denial group, the International Climate Science Coalition, began to question the adjustments I had made to the 7SS. And rather than ever contacting me to ask for an explanation of the science, as I’ve tried to briefly cover above, the Coalition appeared determined to find a conspiracy.
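
To make the science concrete: the core of a site-change adjustment like the Kelburn one can be sketched in a few lines. This is a minimal illustration with invented numbers, not NIWA’s actual homogenisation method:

```python
import statistics

# Invented monthly means (deg C) around a station relocation;
# not real Wellington/Kelburn data.
old_site = [16.2, 15.9, 16.4, 16.1]   # low-altitude site, before the move
new_site = [15.4, 15.1, 15.6, 15.3]   # higher, cooler site, after the move

# Estimate the artificial step introduced by the relocation...
offset = statistics.mean(old_site) - statistics.mean(new_site)

# ...then add it back to the post-move readings, so the combined series
# reflects real climate change rather than the change of location.
adjusted = [t + offset for t in new_site]
print(f"estimated site offset: {offset:.2f} deg C")
```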

“Shonky” claims

The attack on the science was led by then MP for the free market ACT New Zealand party, Rodney Hide, who claimed in the NZ Parliament in February 2010 that:

NIWA’s raw data for their official temperature graph shows no warming. But NIWA shifted the bulk of the temperature record pre-1950 downwards and the bulk of the data post-1950 upwards to produce a sharply rising trend… NIWA’s entire argument for warming was a result of adjustments to data which can’t be justified or checked. It’s shonky.

Mr Hide’s attack continued for 18 months, with more than 80 parliamentary questions being put to NIWA between February 2010 and July 2011, all of which required NIWA input for the answers.

The science minister asked NIWA to re-examine the temperature records, which required several months of science time. In December 2010, the results were in. After the methodology was reviewed and endorsed by the Australian Bureau of Meteorology, it was found that at the seven stations from Auckland to Dunedin, between 1909 and 2008 there was a warming trend of 0.91°C.

That is, the same result as before.

But in the meantime, before NIWA had even had time to produce that report, a new line of attack had been launched.
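
As an aside on the numbers: a centennial warming figure like 0.91°C is what a linear trend fit to the adjusted series produces. A minimal sketch, with synthetic data standing in for the real 7SS:

```python
import numpy as np

# Synthetic annual means standing in for the adjusted 7SS series:
# a 0.9 deg C/century trend plus year-to-year noise.
rng = np.random.default_rng(0)
years = np.arange(1909, 2009)
temps = 12.0 + 0.009 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

slope, intercept = np.polyfit(years, temps, 1)
print(f"fitted trend: {slope * 100:.2f} deg C per century")
```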

Off to court

In July 2010, a statement of claim against NIWA was filed in the High Court of New Zealand, under the guise of a new charitable trust: the New Zealand Climate Science Education Trust (NZCSET). Its trustees were all members of the NZ Climate Science Coalition.

The NZCSET challenged the decision of NIWA to publish the adjusted 7SS, claiming that the “unscientific” methods used created an unrealistic indication of climate warming.

The Trust ignored the evidence in the Meteorological Service report I first authored, which stated that a particular adjustment methodology had been used. The Trust incorrectly claimed this methodology should have been used but wasn’t.

In July 2011, the Trust produced a document that attempted to reproduce the Meteorological Service adjustments, but failed, instead making numerous errors.

On September 7 2012, High Court Justice Geoffrey Venning delivered a 49-page ruling, finding that the NZCSET had not succeeded in any of its challenges against NIWA.

The NZ weather wars in the news. The New Zealand Herald

The judge was particularly critical of retired journalist and NZCSET trustee Terry Dunleavy’s lack of scientific expertise.

Justice Venning described some of the Trust’s evidence as tediously lengthy and said “it is particularly unsuited to a satisfactory resolution of a difference of opinion on scientific matters”.

Taxpayers left to foot the bill

After an appeal that was withdrawn at the last minute, late last year the NZCSET was ordered to pay NIWA NZ$89,000 in costs from the original case, plus further costs from the appeal.

But just this month, we have learned that the people behind the NZCSET have sent it into liquidation as they cannot afford the fees, leaving the New Zealand taxpayer at a substantial, six-figure loss.

Commenting on the lost time and money involved with the case, NIWA’s chief executive John Morgan has said that:

On the surface it looks like the trust was purely for the purpose of taking action, which is not what one would consider the normal use of a charitable trust.

This has been an insidious saga. The Trust aggressively attacked the scientists instead of engaging with them to understand the technical issues; it ignored evidence that didn’t suit its case; and it regularly misrepresented NIWA statements by taking them out of context.

Yet the attack has now been repeatedly rejected in Parliament, by scientists, and by the courts.

The end result of the antics by a few individuals and this Trust is probably going to be a six-figure bill for New Zealanders to pay.

My former colleagues have had valuable weeks tied up defending these manufactured allegations. That’s time that could have profitably been used investigating further what is happening with our climate.

But there is a bigger picture here too.

Merchants of doubt

Doubt-mongering is an old strategy. It was pursued to combat the idea that cigarette smoking is harmful to your health, and it has been assiduously followed by climate deniers for the past 20 years.

One of the best-known international proponents of such strategies is the US think tank the Heartland Institute.

The first in a planned series of anti-global warming billboards in the US, comparing “climate alarmists” with terrorists and mass murderers. The campaign was canned after a backlash. The Heartland Institute

Just to be clear: there is no evidence that the Heartland Institute helped fund the NZ court challenge. In 2012, one of the Trustees who brought the action against NIWA said Heartland had not donated anything to the case.

However, Heartland is known to have been active in NZ in the past, providing funding to the NZ Climate Science Coalition and a related International Coalition, as well as financially backing prominent climate “sceptic” campaigns in Australia.

An extract from a 1999 letter from the Heartland Institute to tobacco company Philip Morris. University of California, San Francisco, Legacy Tobacco Documents Library

The Heartland Institute also has a long record of working with tobacco companies, as the letter on the right illustrates. (You can read that letter and other industry documents in full here. Meanwhile, Heartland’s reply to critics of its tobacco and fossil fuel campaigns is here.)

Earlier this month, the news broke that major tobacco companies will finally admit they “deliberately deceived the American public”, in “corrective statements” that would run on prime-time TV, in newspapers and even on cigarette packs.

It’s taken a 15-year court battle with the US government to reach this point, and it shows that evidence can trump doubt-mongering in the long run.

A similar day may come for those who actively work to cast doubt on climate science.

Industry Awakens to Threat of Climate Change (New York Times)

A Coke bottling plant in Winona, Minn. The company has been affected by global droughts. Andrew Link/Winona Daily News, via Associated Press

By CORAL DAVENPORT

JAN. 23, 2014

WASHINGTON — Coca-Cola has always been more focused on its economic bottom line than on global warming, but when the company lost a lucrative operating license in India because of a serious water shortage there in 2004, things began to change.

Today, after a decade of increasing damage to Coke’s balance sheet as global droughts dried up the water needed to produce its soda, the company has embraced the idea of climate change as an economically disruptive force.

“Increased droughts, more unpredictable variability, 100-year floods every two years,” said Jeffrey Seabright, Coke’s vice president for environment and water resources, listing the problems that he said were also disrupting the company’s supply of sugar cane and sugar beets, as well as citrus for its fruit juices. “When we look at our most essential ingredients, we see those events as threats.”

Coke reflects a growing view among American business leaders and mainstream economists who see global warming as a force that contributes to lower gross domestic products, higher food and commodity costs, broken supply chains and increased financial risk. Their position is at striking odds with the longstanding argument, advanced by the coal industry and others, that policies to curb carbon emissions are more economically harmful than the impact of climate change.

“The bottom line is that the policies will increase the cost of carbon and electricity,” said Roger Bezdek, an economist who produced a report for the coal lobby that was released this week. “Even the most conservative estimates peg the social benefit of carbon-based fuels as 50 times greater than its supposed social cost.”

Some tycoons are no longer listening.

At the Swiss resort of Davos, corporate leaders and politicians gathered for the annual four-day World Economic Forum will devote all of Friday to panels and talks on the threat of climate change. The emphasis will be less about saving polar bears and more about promoting economic self-interest.

In Philadelphia this month, the American Economic Association inaugurated its new president, William D. Nordhaus, a Yale economist and one of the world’s foremost experts on the economics of climate change.

“There is clearly a growing recognition of this in the broader academic economic community,” said Mr. Nordhaus, who has spent decades researching the economic impacts of both climate change and of policies intended to mitigate climate change.

In Washington, the World Bank president, Jim Yong Kim, has put climate change at the center of the bank’s mission, citing global warming as the chief contributor to rising global poverty rates and falling G.D.P.’s in developing nations. In Europe, the Organization for Economic Cooperation and Development, the Paris-based club of 34 industrialized nations, has begun to warn of the steep costs of increased carbon pollution.

Nike, which has more than 700 factories in 49 countries, many in Southeast Asia, is also speaking out because of extreme weather that is disrupting its supply chain. In 2008, floods temporarily shut down four Nike factories in Thailand, and the company remains concerned about rising droughts in regions that produce cotton, which the company uses in its athletic clothes.

“That puts less cotton on the market, the price goes up, and you have market volatility,” said Hannah Jones, the company’s vice president for sustainability and innovation. Nike has already reported the impact of climate change on water supplies on its financial risk disclosure forms to the Securities and Exchange Commission.

Both Nike and Coke are responding internally: Coke uses water-conservation technologies and Nike is using more synthetic material that is less dependent on weather conditions. At Davos and in global capitals, the companies are also lobbying governments to enact environmentally friendly policies.

But the ideas are a tough sell in countries like China and India, where cheap coal-powered energy is lifting the economies and helping to raise millions of people out of poverty. Even in Europe, officials have begun to balk at the cost of environmental policies: On Wednesday, the European Union scaled back its climate change and renewable energy commitments, as high energy costs, declining industrial competitiveness and a recognition that the economy is unlikely to rebound soon caused European policy makers to question the short-term economic trade-offs of climate policy.

In the United States, the rich can afford to weigh in. The California hedge-fund billionaire Thomas F. Steyer, who has used millions from his own fortune to support political candidates who favor climate policy, is working with Michael R. Bloomberg, the former New York mayor, and Henry M. Paulson Jr., a former Treasury secretary in the George W. Bush administration, to commission an economic study on the financial risks associated with climate change. The study, titled “Risky Business,” aims to assess the potential impacts of climate change by region and by sector across the American economy.

“This study is about one thing, the economics,” Mr. Paulson said in an interview, adding that “business leaders are not adequately focused on the economic impact of climate change.”

Also consulting on the “Risky Business” report is Robert E. Rubin, a former Treasury secretary in the Clinton administration. “There are a lot of really significant, monumental issues facing the global economy, but this supersedes all else,” Mr. Rubin said in an interview. “To make meaningful headway in the economics community and the business community, you’ve got to make it concrete.”

Last fall, the governments of seven countries — Colombia, Ethiopia, Indonesia, South Korea, Norway, Sweden and Britain — created the Global Commission on the Economy and Climate and jointly began another study on how governments and businesses can address climate risks to better achieve economic growth. That study and the one commissioned by Mr. Steyer and others are being published this fall, just before a major United Nations meeting on climate change.

Although many Republicans oppose the idea of a price or tax on carbon pollution, some conservative economists endorse the idea. Among them are Arthur B. Laffer, senior economic adviser to President Ronald Reagan; the Harvard economist N. Gregory Mankiw, who was economic adviser to Mitt Romney’s presidential campaign; and Douglas Holtz-Eakin, the head of the American Action Forum, a conservative think tank, and an economic adviser to the 2008 presidential campaign of Senator John McCain, the Arizona Republican.

“There’s no question that if we get substantial changes in atmospheric temperatures, as all the evidence suggests, that it’s going to contribute to sea-level rise,” Mr. Holtz-Eakin said. “There will be agriculture and economic effects — it’s inescapable.” He added, “I’d be shocked if people supported anything other than a carbon tax — that’s how economists think about it.”

Soap Bubbles for Predicting Cyclone Intensity? (Science Daily)

Jan. 8, 2014 — Could soap bubbles be used to predict the strength of hurricanes and typhoons? However unexpected it may sound, this question prompted physicists at the Laboratoire Ondes et Matière d’Aquitaine (CNRS/université de Bordeaux) to perform a highly novel experiment: they used soap bubbles to model atmospheric flow. A detailed study of the rotation rates of the bubble vortices enabled the scientists to obtain a relationship that accurately describes the evolution of their intensity, and propose a simple model to predict that of tropical cyclones.

Vortices in a soap bubble. (Credit: © Hamid Kellay)

The work, carried out in collaboration with researchers from the Institut de Mathématiques de Bordeaux (CNRS/université de Bordeaux/Institut Polytechnique de Bordeaux) and a team from Université de la Réunion, has just been published in the journal Scientific Reports.

Predicting wind intensity or strength in tropical cyclones, typhoons and hurricanes is a key objective in meteorology: the lives of hundreds of thousands of people may depend on it. However, despite recent progress, such forecasts remain difficult since they involve many factors related to the complexity of these giant vortices and their interaction with the environment. A new research avenue has now been opened up by physicists at the Laboratoire Ondes et Matière d’Aquitaine (CNRS/Université Bordeaux 1), who have performed a highly novel experiment using, of all things, soap bubbles.

The researchers carried out simulations of flow on soap bubbles, reproducing the curvature of the atmosphere and approximating as closely as possible a simple model of atmospheric flow. The experiment allowed them to obtain vortices that resemble tropical cyclones and whose rotation rate and intensity exhibit astonishing dynamics: weak initially, or just after the birth of the vortex, and increasing significantly over time. Following this intensification phase, the vortex attains its maximum intensity before entering a phase of decline.

A detailed study of the rotation rate of the vortices enabled the researchers to obtain a simple relationship that accurately describes the evolution of their intensity. For instance, the relationship can be used to determine the maximum intensity of the vortex and the time it takes to reach it, on the basis of its initial evolution. This prediction can begin around fifty hours after the formation of the vortex, a period corresponding to approximately one quarter of its lifetime and during which wind speeds intensify. The team then set out to verify that these results could be applied to real tropical cyclones. By applying the same analysis to approximately 150 tropical cyclones in the Pacific and Atlantic oceans, they showed that the relationship held true for such low-pressure systems. This study therefore provides a simple model that could help meteorologists to better predict the strength of tropical cyclones in the future.
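
The article does not give the authors’ actual relationship, but the prediction step can be illustrated by fitting a generic rise-then-decay curve to early intensity measurements and reading off the implied peak. The functional form below is an assumption made purely for this sketch, with invented data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed illustrative form, NOT the published relationship:
# I(t) = A * t**a * exp(-t/tau), which rises, peaks and decays.
def intensity(t, A, a, tau):
    return A * t**a * np.exp(-t / tau)

# Invented early observations of a vortex's rotation rate.
t_obs = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])   # hours
i_obs = np.array([2.1, 4.0, 5.3, 6.2, 6.7, 7.0])         # arbitrary units

(A, a, tau), _ = curve_fit(intensity, t_obs, i_obs, p0=(1.0, 1.0, 50.0))

t_peak = a * tau   # analytic maximum of t**a * exp(-t/tau)
print(f"predicted peak: ~{t_peak:.0f} h at intensity {intensity(t_peak, A, a, tau):.1f}")
```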

Journal Reference:

  1. T. Meuel, Y. L. Xiong, P. Fischer, C. H. Bruneau, M. Bessafi, H. Kellay. Intensity of vortices: from soap bubbles to hurricanes. Scientific Reports, 2013; 3. DOI: 10.1038/srep03455

Our singularity future: should we hack the climate? (Singularity Hub)

Posted: 01/8/14 8:31 AM


Even the most adamant techno-optimists among us must admit that new technologies can introduce hidden dangers: Fire, as the adage goes, can cook the dinner, but it can also burn the village down.

The most powerful example of unforeseen disadvantages stemming from technology is climate change. Should we attempt to fix a problem caused by technology by using still newer technology to hack the climate? The question has spurred heated debate.

Those in favor point to failed efforts to curb carbon dioxide emissions and insist we need other options. What if a poorly understood climatic tipping point tips and the weather becomes dangerous overnight? How would slowing emissions help us then?

“If you look at the projections for how much the Earth’s air temperature is supposed to warm over the next century, it is frightening. We should at least know the options,” said Rob Wood, a University of Washington climatologist who edited a recent special issue of the journal Climatic Change devoted to geoengineering.

Wood’s view is gaining support, as the predictions about the effects of climate change continue to grow more dire, and the weather plays its part to a tee.

But big, important questions need answers before geoengineering projects take off. Critics point to science’s flimsy understanding of the complex systems that drive the weather. And even supporters lament the lack of any experimental framework to contain disparate experiments on how to affect it.

“Proposed projects have been protested or canceled, and calls for a governance framework abound,” Lisa Dilling and Rachel Hauser wrote in a paper that appears in the special issue. “Some have argued, even, that it is difficult if not impossible to answer some research questions in geoengineering at the necessary scale without actually implementing geoengineering itself.”

Most proposed methods of geoengineering derive from pretty basic science, but questions surround how to deploy them at a planetary scale and how to measure desired and undesired effects on complex weather and ocean cycles. Research projects that would shed light on those questions would be big enough themselves potentially to affect neighboring populations, raising ethical questions as well.

Earlier efforts to test fertilizing the ocean with iron, to feed algae that would suck carbon dioxide from the air, and to spray the pollutant sulfur dioxide, which reflects solar radiation, into the atmosphere were mired in controversy. A reputable UK project abandoned its plans to test its findings in the field.

But refinements on those earlier approaches are percolating. They include efforts both to remove previously emitted carbon dioxide from the atmosphere and to reduce the portion of the sun’s radiation that enters the atmosphere.

One method of carbon dioxide removal (or CDR) would expose large quantities of carbon-reactive minerals to the air and then store the resulting compounds underground; another would use large CO2 vacuums to suck the greenhouse gas directly from the air into underground storage.

Solar radiation management (or SRM) methods include everything from painting roofs white, to seeding clouds with salt crystals to make them more reflective, to mimicking the climate-cooling effects of volcanic eruptions by spraying sulfur compounds into the atmosphere.

The inevitable impact of geoengineering research on the wider population has led many scientists to compare geoengineering to genetic research. The comparison also hints at the huge benefits geoengineering could have if it successfully wards off the most savage effects of climate change.

As with genetic research, principles have been developed to shape the ethics of the research. Still, the principles remain vague, according to a 2012 Nature editorial, and flawed, according to a philosophy-of-science take in the recent journal issue. Neither the U.S. government nor international treaties have addressed geoengineering per se, though many treaties would influence its testing and implementation.

The hottest research now explores how long climate-hacks would take to work, lining up their timelines with the slow easing of global warming that would result from dramatically lowered carbon dioxide emissions, and how to weigh the costs of geoengineering projects and accommodate public debate.

Proceeding with caution won’t get fast answers, but it seems a wise way to address an issue as thorny as readjusting the global thermostat.

Funding Problems Threaten U. S. Disaster Preparedness (Science Daily)

Jan. 9, 2014 — The Sept. 11, 2001 attacks in New York City prompted large increases in government funding to help communities respond and recover after human-made and natural disasters. But, this funding has fallen considerably since the economic crisis in 2008. Furthermore, disaster funding distribution is deeply inefficient: huge cash infusions are disbursed right after a disaster, only to fall abruptly after interest wanes. These issues have exposed significant problems with our nation’s preparedness for public health emergencies.

In a report published by the Institute of Medicine, authors Jesse Pines, M.D., director of the Office of Clinical Practice Innovation at the George Washington University (GW) School of Medicine and Health Sciences (SMHS); Seth Seabury, Ph.D., associate professor of emergency medicine at the Keck School of Medicine of the University of Southern California (USC); and William Pilkington, DPA, of the Cabarrus Health Alliance, make seven recommendations to provide a road map to enhance the sustainability of preparedness efforts in the United States.

“With more limited government funding in the foreseeable future, the government needs to be smarter about how it spends its money on emergency preparedness in this country,” said Seabury, who is also with the Leonard D. Schaeffer Center for Health Policy & Economics at USC. “We need to know which communities are prepared and which aren’t, when money is spent, and whether it’s really making these communities better off in handling a disaster.”

The authors make the following recommendations:

1. The federal government should develop and assess measures of emergency preparedness both at the community-level and across communities in the U.S.

2. Measures developed by the federal government should be used to conduct a nationwide gap analysis of community preparedness.

3. Alternative ways of distributing funding should be considered to ensure all communities have the ability to build and sustain local coalitions to support sufficient infrastructure.

4. When monies are released for projects, there should be clear metrics of grant effectiveness.

5. There should be better coordination at the federal level, including funding and grant guidance.

6. Local communities should build coalitions or use existing coalitions to build public-private partnerships with local hospitals and other businesses with a stake in preparedness.

7. Communities should be encouraged to pursue ways of financing local preparedness efforts.

“A lot of communities out there have found creative ways to get local businesses to invest in preparedness. The more locals buying into the importance of preparedness, the more resilient a community is,” said Pines, who is also a professor of emergency medicine at GW SMHS and professor of health policy at the GW School of Public Health and Health Services. “How Boston responded and recovered so effectively after the marathon bombings is a great example of a prepared community.”

The study, titled “Value-Based Models for Sustaining Emergency Preparedness Capacity and Capability in the United States,” was published by The Institute of Medicine Preparedness Forum.

Peak Oil Is Dead. Long Live Peak Oil! (The Nation)

The eulogies for peak oil came too soon. 

Michael T. Klare

January 9, 2014

A drilling rig near Kennedy, Texas. (AP Photo/Eric Gay)

Among the big energy stories of 2013, “peak oil”—the once-popular notion that worldwide oil production would soon reach a maximum level and begin an irreversible decline—was thoroughly discredited. The explosive development of shale oil and other unconventional fuels in the United States helped put it in its grave.

As the year went on, the eulogies came in fast and furious. “Today, it is probably safe to say we have slayed ‘peak oil’ once and for all, thanks to the combination of new shale oil and gas production techniques,” declared Rob Wile, an energy and economics reporter for Business Insider. Similar comments from energy experts were commonplace, prompting an R.I.P. headline at Time.com announcing, “Peak Oil is Dead.”

Not so fast, though. The present round of eulogies brings to mind Mark Twain’s famous line: “The reports of my death have been greatly exaggerated.” Before obits for peak oil theory pile up too high, let’s take a careful look at these assertions. Fortunately, the International Energy Agency (IEA), the Paris-based research arm of the major industrialized powers, recently did just that—and the results were unexpected. While not exactly reinstalling peak oil on its throne, it did make clear that much of the talk of a perpetual gusher of American shale oil is greatly exaggerated. The exploitation of those shale reserves may delay the onset of peak oil for a year or so, the agency’s experts noted, but the long-term picture “has not changed much with the arrival of [shale oil].”

The IEA’s take on this subject is especially noteworthy because its assertion only a year earlier that the US would overtake Saudi Arabia as the world’s number one oil producer sparked the “peak oil is dead” deluge in the first place. Writing in the 2012 edition of its World Energy Outlook, the agency claimed not only that “the United States is projected to become the largest global oil producer” by around 2020, but also that with US shale production and Canadian tar sands coming online, “North America becomes a net oil exporter around 2030.”

That November 2012 report highlighted the use of advanced production technologies—notably horizontal drilling and hydraulic fracturing (“fracking”)—to extract oil and natural gas from once inaccessible rock, especially shale. It also covered the accelerating exploitation of Canada’s bitumen (tar sands or oil sands), another resource previously considered too forbidding to be economical to develop. With the output of these and other “unconventional” fuels set to explode in the years ahead, the report then suggested, the long-awaited peak of world oil production could be pushed far into the future.

The release of the 2012 edition of World Energy Outlook triggered a global frenzy of speculative reporting, much of it announcing a new era of American energy abundance. “Saudi America” was the headline over one such hosanna in the Wall Street Journal. Citing the new IEA study, that paper heralded a coming “US energy boom” driven by “technological innovation and risk-taking funded by private capital.” From then on, American energy analysts spoke rapturously of the capabilities of a set of new extractive technologies, especially fracking, to unlock oil and natural gas from hitherto inaccessible shale formations. “This is a real energy revolution,” the Journal crowed.

But that was then. The most recent edition of World Energy Outlook, published this past November, was a lot more circumspect. Yes, shale oil, tar sands, and other unconventional fuels will add to global supplies in the years ahead, and, yes, technology will help prolong the life of petroleum. Nonetheless, it’s easy to forget that we are also witnessing the wholesale depletion of the world’s existing oil fields and so all these increases in shale output must be balanced against declines in conventional production. Under ideal circumstances—high levels of investment, continuing technological progress, adequate demand and prices—it might be possible to avert an imminent peak in worldwide production, but as the latest IEA report makes clear, there is no guarantee whatsoever that this will occur.

Inching Toward the Peak

Before plunging deeper into the IEA’s assessment, let’s take a quick look at peak oil theory itself.

As developed in the 1950s by petroleum geologist M. King Hubbert, peak oil theory holds that any individual oil field (or oil-producing country) will experience a high rate of production growth during initial development, when drills are first inserted into an oil-bearing reservoir. Later, growth will slow, as the most readily accessible resources have been drained and a greater reliance has to be placed on less productive deposits. At this point—usually when about half the resources in the reservoir (or country) have been extracted—daily output reaches a maximum, or “peak,” level and then begins to subside. Of course, the field or fields will continue to produce even after peaking, but ever more effort and expense will be required to extract what remains. Eventually, the cost of production will exceed the proceeds from sales, and extraction will be terminated.
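
Hubbert modelled this with a logistic curve: cumulative production follows an S-shape, so the production rate is bell-shaped and peaks when roughly half the ultimately recoverable resource (URR) has been extracted. A minimal sketch of that relationship, with parameter values invented for illustration:

```python
import numpy as np

URR = 2000.0           # ultimately recoverable resource, billion barrels (invented)
k, t_mid = 0.05, 2005  # logistic steepness and midpoint year (invented)

t = np.arange(1900, 2101)
Q = URR / (1.0 + np.exp(-k * (t - t_mid)))   # cumulative production (S-curve)
rate = np.gradient(Q, t)                     # annual production (bell curve)

i = np.argmax(rate)
print(f"production peaks in {t[i]}, with {Q[i]:.0f} of {URR:.0f} "
      "billion barrels extracted (about half)")
```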

For Hubbert and his followers, the rise and decline of oil fields is an inevitable consequence of natural forces: oil exists in pressurized underground reservoirs and so will be forced up to the surface when a drill is inserted into the ground. However, once a significant share of the resources in that reservoir has been extracted, the field’s pressure will drop and artificial means—water, gas, or chemical insertion—will be needed to restore pressure and sustain production. Sooner or later, such means become prohibitively expensive.

Peak oil theory also holds that what is true of an individual field or set of fields is true of the world as a whole. Until about 2005, it did indeed appear that the globe was edging ever closer to a peak in daily oil output, as Hubbert’s followers had long predicted. (He died in 1989.) Several recent developments have, however, raised questions about the accuracy of the theory. In particular, major private oil companies have taken to employing advanced technologies to increase the output of the reservoirs under their control, extending the lifetime of existing fields through the use of what’s called “enhanced oil recovery,” or EOR. They’ve also used new methods to exploit fields once considered inaccessible in places like the Arctic and deep oceanic waters, thereby opening up the possibility of a most un-Hubbertian future.

In developing these new technologies, the privately owned “international oil companies” (IOCs) were seeking to overcome their principal handicap: most of the world’s “easy oil”—the stuff Hubbert focused on that comes gushing out of the ground whenever a drill is inserted—has already been consumed or is controlled by state-owned “national oil companies” (NOCs), including Saudi Aramco, the National Iranian Oil Company, and the Kuwait National Petroleum Company, among others. According to the IEA, such state companies control about 80 percent of the world’s known petroleum reserves, leaving relatively little for the IOCs to exploit.

To increase output from the limited reserves still under their control—mostly located in North America, the Arctic, and adjacent waters—the private firms have been working hard to develop techniques to exploit “tough oil.” In this, they have largely succeeded: they are now bringing new petroleum streams into the marketplace and, in doing so, have shaken the foundations of peak oil theory.

Those who say that “peak oil is dead” cite just this combination of factors. By extending the lifetime of existing fields through EOR and adding entirely new sources of oil, they argue, the global supply can be expanded indefinitely. As a result, they claim, the world possesses a “relatively boundless supply” of oil (and natural gas). This, for instance, was the way Barry Smitherman of the Texas Railroad Commission (which regulates that state’s oil industry) described the global situation at a recent meeting of the Society of Exploration Geophysicists.

Peak Technology

In place of peak oil, then, we have a new theory that as yet has no name but might be called techno-dynamism. There is, this theory holds, no physical limit to the global supply of oil so long as the energy industry is prepared to, and allowed to, apply its technological wizardry to the task of finding and producing more of it. Daniel Yergin, author of the industry classics, The Prize and The Quest, is a key proponent of this theory. He recently summed up the situation this way: “Advances in technology take resources that were not physically accessible and turn them into recoverable reserves.” As a result, he added, “estimates of the total global stock of oil keep growing.”

From this perspective, the world supply of petroleum is essentially boundless. In addition to “conventional” oil—the sort that comes gushing out of the ground—the IEA identifies six other potential streams of petroleum liquids: natural gas liquids; tar sands and extra-heavy oil; kerogen oil (petroleum solids derived from shale that must be melted to become usable); shale oil; coal-to-liquids (CTL); and gas-to-liquids (GTL). Together, these “unconventional” streams could theoretically add several trillion barrels of potentially recoverable petroleum to the global supply, conceivably extending the Oil Age hundreds of years into the future (and in the process, via climate change, turning the planet into an uninhabitable desert).

But just as peak oil had serious limitations, so, too, does techno-dynamism. At its core is a belief that rising world oil demand will continue to drive the increasingly costly investments in new technologies required to exploit the remaining hard-to-get petroleum resources. As suggested in the 2013 edition of the IEA’s World Energy Outlook, however, this belief should be treated with considerable skepticism.

Among the principal challenges to the theory are these:

1. Increasing Technology Costs: While the costs of developing a resource normally decline over time as industry gains experience with the technologies involved, Hubbert’s law of depletion doesn’t go away. In other words, oil firms invariably develop the easiest “tough oil” resources first, leaving the toughest (and most costly) for later. For example, the exploitation of Canada’s tar sands began with the strip-mining of deposits close to the surface. Because those are becoming exhausted, however, energy firms are now going after deep-underground reserves using far costlier technologies. Likewise, many of the most abundant shale oil deposits in North Dakota have now been depleted, requiring an increasing pace of drilling to maintain production levels. As a result, the IEA reports, the cost of developing new petroleum resources will continually increase: up to $80 per barrel for oil obtained using advanced EOR techniques, $90 per barrel for tar sands and extra-heavy oil, $100 or more for kerogen and Arctic oil, and $110 for CTL and GTL. The market may not, however, be able to sustain levels this high, putting such investments in doubt.

2. Growing Political and Environmental Risk: By definition, tough oil reserves are located in problematic areas. For example, an estimated 13 percent of the world’s undiscovered oil lies in the Arctic, along with 30 percent of its untapped natural gas. The environmental risks associated with their exploitation under the worst of weather conditions imaginable will quickly become more evident—and so, faced with the rising potential for catastrophic spills in a melting Arctic, expect a commensurate increase in political opposition to such drilling. In fact, recent drilling activity has sparked protests in both Alaska and Russia, including the much-publicized September 2013 attempt by activists from Greenpeace to scale a Russian offshore oil platform—an action that led to their seizure and arrest by Russian commandos. Similarly, expanded fracking operations have provoked a steady increase in anti-fracking activism. In response to such protests and other factors, oil firms are being forced to adopt increasingly stringent environmental protections, pumping up the cost of production further.

3. Climate-Related Demand Reduction: The techno-optimist outlook assumes that oil demand will keep rising, prompting investors to provide the added funds needed to develop the technologies required. However, as the effects of rampant climate change accelerate, more and more polities are likely to try to impose curbs of one sort or another on oil consumption, suppressing demand—and so discouraging investment. This is already happening in the United States, where mandated increases in vehicle fuel-efficiency standards are expected to significantly reduce oil consumption. Future “demand destruction” of this sort is bound to impose a downward pressure on oil prices, diminishing the inclination of investors to finance costly new development projects.

Combine these three factors, and it is possible to conceive of a “technology peak” not unlike the peak in oil output originally envisioned by M. King Hubbert. Such a techno-peak is likely to occur when the “easy” sources of “tough” oil have been depleted, opponents of fracking and other objectionable forms of production have imposed strict (and costly) environmental regulations on drilling operations, and global demand has dropped below a level sufficient to justify investment in costly extractive operations. At that point, global oil production will decline even if supplies are “boundless” and technology is still capable of unlocking more oil every year.

Peak Oil Reconsidered

Peak oil theory, as originally conceived by Hubbert and his followers, was largely governed by natural forces. As we have seen, however, these can be overpowered by the application of increasingly sophisticated technology. Reservoirs of energy once considered inaccessible can be brought into production, and others once deemed exhausted can be returned to production; rather than being finite, the world’s petroleum base now appears virtually inexhaustible.

Does this mean that global oil output will continue rising, year after year, without ever reaching a peak? That appears unlikely. What seems far more probable is that we will see a slow tapering of output over the next decade or two as costs of production rise and climate change—along with opposition to the path chosen by the energy giants—gains momentum. Eventually, the forces tending to reduce supply will overpower those favoring higher output, and a peak in production will indeed result, even if not due to natural forces alone.

Such an outcome is, in fact, envisioned in one of three possible energy scenarios the IEA’s mainstream experts lay out in the latest edition of World Energy Outlook. The first assumes no change in government policies over the next 25 years and sees world oil supply rising from 87 to 110 million barrels per day by 2035; the second assumes some effort to curb carbon emissions and so projects output reaching “only” 101 million barrels per day by the end of the survey period.

It’s the third trajectory, the “450 Scenario,” that should raise eyebrows. It assumes that momentum develops for a global drive to keep greenhouse gas emissions below 450 parts per million—the maximum level at which it might be possible to prevent global average temperatures from rising above two degrees Celsius (and so cause catastrophic climate effects). As a result, it foresees a peak in global oil output occurring around 2020 at about 91 million barrels per day, with a decline to 78 million barrels by 2035.

It would be premature to suggest that the “450 Scenario” will be the immediate roadmap for humanity, since it’s clear enough that, for the moment, we are on a highway to hell that combines the IEA’s first two scenarios. Bear in mind, moreover, that many scientists believe a global temperature increase of even two degrees Celsius would be enough to produce catastrophic climate effects. But as the effects of climate change become more pronounced in our lives, count on one thing: the clamor for government action will grow more intense, and so eventually we’re likely to see some variation of the 450 Scenario take shape. In the process, the world’s demand for oil will be sharply constricted, eliminating the incentive to invest in costly new production schemes.

The bottom line: global peak oil remains in our future, even if not purely for the reasons given by Hubbert and his followers. With the gradual disappearance of “easy” oil, the major private firms are being forced to exploit increasingly tough, hard-to-reach reserves, thereby driving up the cost of production and potentially discouraging new investment at a time when climate change and environmental activism are on the rise.

Peak oil is dead! Long live peak oil!

Brazil 2014: More than just the World Cup (The Christian Science Monitor)

From elections to transportation fare increases and potentially renewed protests, 2014 promises big stories to watch across Brazil.

By Rachel Glickhouse, Guest blogger / January 2, 2014

People watch fireworks exploding over Copacabana beach during New Year celebrations at the Pavao Pavaozinho slum in Rio de Janeiro, January 1, 2014. Pilar Olivares/Reuters

While 2013 [was] an incredibly interesting year for Brazil, 2014 promises to be even more fascinating. Beyond the World Cup, which is set to occupy much of the year’s headlines, here are some of the big issues to watch.

Transportation fare increases: Governments throughout Brazil backed down on raising bus and subway fares in 2013 after those increases helped spur some of the largest protests seen since redemocratization. Nevertheless, a fare increase could be coming in Rio as early as January.

Inflation and cost of living: In 2013, food prices rose over 9 percent and were the major cause of inflation. Overall, inflation for 2013 is estimated at under 6 percent, while some estimates put 2014’s inflation at a little over 6 percent. São Paulo and Rio in particular continue to see a rising cost of living.

Consumer debt: With more Brazilians gaining access to the banking system and credit, consumer debt has been a growing problem to keep an eye on. Over the past 12 months, the number of Brazilian families in debt has fluctuated between 60 and 65 percent. Around 20 percent of Brazilians are behind on their bills. Over three-quarters of Brazilians in debt point to credit cards as the source of their debt; credit card interest rates in Brazil continue to be sky-high, reaching up to 500 percent a year.

Security: While in the past decade the overall trend for homicides has been an increase in the Northeast and a drop in the Southeast, crimes like robberies and muggings are rising in cities like Rio and São Paulo. Rio in particular has faced problems with crime this year after a period of seeming improvements.

Pacification in Rio: Though initial results were promising, this year has seen some cracks in Rio’s pacification strategy, such as outbreaks of violence in pacified favelas and revelations of police abuses, the most serious being the torture and murder of favela residents. One of the most important revelations this year was statistics showing disappearances in pacified favelas rising as murders fall. We’ll see what happens with this trend next year. Fundamentally, the biggest problem with the strategy is the police force itself, as some police have traditionally been criminals themselves, either working directly with drug traffickers or operating in militias when off-duty. Without major police reform, the strategy could see similar challenges next year.

Health and education policies: One of the major complaints of protesters [last] year was that the government is investing in the World Cup but not enough in hospitals and schools. In 2013, the government began importing Cuban doctors in a bid to bring medical services to underserved areas, which initially was met with controversy that has petered out a bit. Much more remains to be done though, so [this year] it will be interesting to see how the program goes. There were also big teachers’ strikes this year which could potentially happen again in 2014.

Corruption scandals: One of the most important things that happened in 2013 was when a group of defendants in the country’s biggest corruption case went to jail. Parts of the trial are going to drag on next year as some defendants get appeals, but a new corruption scandal would feed another one of the protesters’ complaints.

Protests: While it seems likely that there will be some demonstrations around the World Cup, it remains to be seen whether there will be a repeat of the 2013 protests. That will depend on all of the factors above.

Elections: Brazil will hold presidential and legislative elections in October, which means that federal policies will potentially be designed to appease voters as President Dilma Rousseff seeks reelection. It may not be a year to experiment with reforms or to raise taxes, but it could be a year of bread and circuses.

Infrastructure: While a lot of focus will be on finishing stadiums in time for the games, it remains to be seen how many transportation infrastructure projects, ranging from new highways to airport renovations, will be completed before June. In addition, it will be important to see which major infrastructure projects are moving in an election year, like the Belo Monte dam or the São Francisco water project.

– Rachel Glickhouse is the author of the blog Riogringa.com.

Major Reductions in Seafloor Marine Life from Climate Change by 2100 (Science Daily)

Dec. 31, 2013 — A new study quantifies for the first time future losses in deep-sea marine life, using advanced climate models. Results show that even the most remote deep-sea ecosystems are not safe from the impacts of climate change. 

Large animals (megafauna), such as this hydroid Corymorpha glacialis, are projected to suffer major declines under the latest climate change predictions. (Credit: National Oceanography Centre)

An international team of scientists predict seafloor dwelling marine life will decline by up to 38 per cent in the North Atlantic and over five per cent globally over the next century. These changes will be driven by a reduction in the plants and animals that live at the surface of the oceans that feed deep-sea communities. As a result, ecosystem services such as fishing will be threatened.

In the study, led by the National Oceanography Centre, the team used the latest suite of climate models to predict changes in food supply throughout the world oceans. They then applied a relationship between food supply and biomass calculated from a huge global database of marine life.
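The article does not give the fitted relationship’s form, but food-supply-to-biomass relationships of this kind are commonly fitted as power laws in log-log space. A hypothetical sketch of that step, with invented data:

```python
import numpy as np

# Invented observations; NOT the study's global database.
flux = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # seafloor food supply
biomass = np.array([0.8, 1.5, 2.6, 4.9, 9.2])   # benthic biomass

# Assume biomass = c * flux**b and fit in log-log space.
b, log_c = np.polyfit(np.log(flux), np.log(biomass), 1)
c = np.exp(log_c)

# Project the biomass change implied by a 20% drop in food supply.
drop = 0.20
change = (1.0 - drop) ** b - 1.0
print(f"fitted law: biomass = {c:.2f} * flux^{b:.2f}; "
      f"20% less food -> {change:.0%} biomass")
```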

The results of the study are published this week in the scientific journal Global Change Biology.

These changes in seafloor communities are expected even though these communities live, on average, four kilometres below the surface of the ocean. This is because their food source, the remains of surface ocean marine life that sink to the seafloor, will dwindle because of a decline in nutrient availability. Nutrient supplies will suffer because of climate impacts such as a slowing of the global ocean circulation, as well as increased separation between water masses — known as ‘stratification’ — as a result of warmer and rainier weather.

Lead author Dr Daniel Jones says: “There has been some speculation about climate change impacts on the seafloor, but we wanted to try and make numerical projections for these changes and estimate specifically where they would occur.

“We were expecting some negative changes around the world, but the extent of changes, particularly in the North Atlantic, were staggering. Globally we are talking about losses of marine life weighing more than every person on the planet put together.”

The projected changes in marine life are not consistent across the world, but most areas will experience negative change. Over 80 per cent of all identified key habitats — such as cold-water coral reefs, seamounts and canyons — will suffer losses in total biomass. The analysis also predicts that animals will get smaller. Smaller animals tend to use energy less efficiently, thereby impacting seabed fisheries and exacerbating the effects of the overall declines in available food.

The study was funded by the Natural Environment Research Council (NERC) as part of the Marine Environmental Mapping Programme (MAREMAP), and involved researchers from the National Oceanography Centre, the Memorial University of Newfoundland, Canada, the University of Tasmania, and the Laboratoire des Sciences du Climat et de l’Environnement, France.

Journal Reference:

  1. Daniel O. B. Jones, Andrew Yool, Chih-Lin Wei, Stephanie A. Henson, Henry A. Ruhl, Reg A. Watson, Marion Gehlen. Global reductions in seafloor biomass in response to climate change. Global Change Biology, 2013; DOI: 10.1111/gcb.12480

Solution to Cloud Riddle Reveals Hotter Future: Global Temperatures to Rise at Least 4 Degrees C by 2100 (Science Daily)

Dec. 31, 2013 — Global average temperatures will rise at least 4°C by 2100 and potentially more than 8°C by 2200 if carbon dioxide emissions are not reduced, according to new research published in Nature. Scientists found global climate is more sensitive to carbon dioxide than most previous estimates suggested.

Scientists have revealed the impact of clouds on climate sensitivity. Global average temperatures will rise at least 4 degrees C by 2100 and potentially more than 8 degrees C by 2200 if carbon dioxide emissions are not reduced, according to new research. (Credit: © Maksim Shebeko / Fotolia)

The research also appears to solve one of the great unknowns of climate sensitivity, the role of cloud formation and whether this will have a positive or negative effect on global warming.

“Our research has shown climate models indicating a low temperature response to a doubling of carbon dioxide from preindustrial times are not reproducing the correct processes that lead to cloud formation,” said lead author Prof Steven Sherwood, from the University of New South Wales’ Centre of Excellence for Climate System Science.

“When the processes are correct in the climate models the level of climate sensitivity is far higher. Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C. This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide.”
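To make the arithmetic behind "climate sensitivity" concrete: because the radiative forcing from CO2 grows roughly logarithmically with concentration, equilibrium warming is often written as a sensitivity per doubling multiplied by the number of doublings. A minimal sketch assuming that standard logarithmic relationship (the function name is ours, not the paper's):

```python
import math

def equilibrium_warming(c_ratio: float, sensitivity_per_doubling: float) -> float:
    """Warming (deg C) for a CO2 concentration ratio, assuming the
    standard logarithmic forcing relationship, so that warming scales
    with the number of doublings: dT = S * log2(C / C0)."""
    return sensitivity_per_doubling * math.log2(c_ratio)

# One doubling of CO2 under the old lower bound (1.5 C) versus the
# narrowed range reported here (3-5 C per doubling).
for s in (1.5, 3.0, 5.0):
    print(f"S = {s} C/doubling -> dT at 2xCO2 = {equilibrium_warming(2.0, s):.1f} C")
```

At exactly one doubling the answer is simply the sensitivity itself, which is why removing the low end of the range matters so much: the plausible floor for warming at doubled CO2 moves from 1.5°C to 3°C.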

The key to this narrower but much higher estimate can be found in the real world observations around the role of water vapour in cloud formation.

Observations show when water vapour is taken up by the atmosphere through evaporation, the updraughts can either rise to 15 km to form clouds that produce heavy rains or rise just a few kilometres before returning to the surface without forming rain clouds.

When updraughts rise only a few kilometres they reduce total cloud cover because they pull more vapour away from the higher cloud forming regions.

However, water vapour is not pulled away from cloud-forming regions when only deep 15 km updraughts are present.

The researchers found climate models that show a low global temperature response to carbon dioxide do not include enough of this lower-level water vapour process. Instead they simulate nearly all updraughts as rising to 15 km and forming clouds.

When only the deeper updraughts are present in climate models, more clouds form and there is an increased reflection of sunlight. Consequently the global climate in these models becomes less sensitive in its response to atmospheric carbon dioxide.

However, real world observations show this behaviour is wrong.

When the processes in climate models are corrected to match the observations in the real world, the models produce cycles that take water vapour to a wider range of heights in the atmosphere, causing fewer clouds to form as the climate warms.

This increases the amount of sunlight and heat entering the atmosphere and, as a result, increases the sensitivity of our climate to carbon dioxide or any other perturbation.

The result is that, when water vapour processes are correctly represented, the sensitivity of the climate to a doubling of carbon dioxide — expected to occur within the next 50 years — means we can expect a temperature increase of at least 4°C by 2100.

“Climate sceptics like to criticize climate models for getting things wrong, and we are the first to admit they are not perfect, but what we are finding is that the mistakes are being made by those models which predict less warming, not those that predict more,” said Prof. Sherwood.

“Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don’t urgently start to curb our emissions.”

Journal Reference:

  1. Steven C. Sherwood, Sandrine Bony, Jean-Louis Dufresne. Spread in model climate sensitivity traced to atmospheric convective mixing. Nature, 2014; 505 (7481): 37. DOI: 10.1038/nature12829

Ancient Traditions: Why We Make New Year Resolutions (Science Daily)

Dec. 30, 2013 — As many of us start to think about our New Year’s resolutions (or breaking them), we may not realize that the tradition of making promises on the first day of the year is a custom started by our Roman ancestors.

Janus, the Roman god of new beginnings, was frequently shown with two faces, referring to the fact that he looks both backwards and forwards. The Romans named the first month of the Julian calendar, Januarius, in his honour. (Credit: Royal Holloway University)

“Rome’s highest officials made a resolution to remain loyal to the republic and swore oaths to the Emperor on 1st January,” said Professor Richard Alston, from the Department of Classics at Royal Holloway University.

“A grand ceremony marked the occasion, where the Roman legions would parade and sacrifices were made on the Capitoline Hill. This annual event renewed the bonds between citizens, the state and the gods.”

New Year’s Day offered all Roman citizens an opportunity to reflect on the past and look to the year ahead. People would exchange sweet fruits and honey and greet each other with blessings for the coming year, and the courts worked only in the mornings, so everyone had a half-day holiday.

“On 1 January, our Roman ancestors celebrated Janus, the god of new beginnings who had two faces — one looking into the past and another looking to the future,” Professor Alston added. “Janus represented doors and thresholds and the Romans named the month of January in his honour.

“Janus also symbolized the values of home, family, friendship and civilization, and the doors of his temple were closed when Rome was at peace and thrown open in times of war, as if the god was no longer present. We also know that, just as we do today, the Romans celebrated a mid-winter festival in which they met with friends, exchanged gifts and had a good time before the start of the year ahead.”

Global Map Predicts Locations for Giant Earthquakes (Science Daily)

Dec. 12, 2013 — A team of international researchers, led by Monash University’s Associate Professor Wouter Schellart, have developed a new global map of subduction zones, illustrating which ones are predicted to be capable of generating giant earthquakes and which ones are not.

Andaman Sea. “For the Australian region subduction zones of particular significance are the Sunda subduction zone, running from the Andaman Islands along Sumatra and Java to Sumba, and the Hikurangi subduction segment offshore the east coast of the North Island of New Zealand. Our research predicts that these zones are capable of producing giant earthquakes,” Dr Schellart said. (Credit: © vichie81 / Fotolia)

The new research, published in the journal Physics of the Earth and Planetary Interiors, comes nine years after the giant earthquake and tsunami in Sumatra in December 2004, which devastated the region and many other areas surrounding the Indian Ocean, and killed more than 200,000 people.

Since then two other giant earthquakes have occurred at subduction zones, one in Chile in February 2010 and one in Japan in March 2011, which both caused massive destruction, killed many thousands of people and resulted in billions of dollars of damage.

Most earthquakes occur at the boundaries between tectonic plates that cover the Earth’s surface. The largest earthquakes on Earth only occur at subduction zones, plate boundaries where one plate sinks (subducts) below the other into the Earth’s interior. So far, seismologists have recorded giant earthquakes for only a limited number of subduction zone segments. But accurate seismological records go back to only ~1900, and the recurrence time of giant earthquakes can be many hundreds of years.
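A back-of-the-envelope calculation shows why the short instrumental record is such a problem. Assuming, purely for illustration, a segment that produces giant earthquakes on average once every 300 years and behaves roughly like a Poisson process:

```python
import math

# If a subduction segment produces giant earthquakes on average once
# every ~300 years (an assumed, illustrative recurrence time) and
# behaves roughly like a Poisson process, what is the chance that our
# ~113-year instrumental record (1900-2013) contains none of them?
record_years = 2013 - 1900
recurrence_years = 300.0  # assumed, for illustration only

p_none = math.exp(-record_years / recurrence_years)
print(f"P(no giant quake observed in {record_years} yr) = {p_none:.0%}")
# Roughly two-thirds of such segments would show no giant event in the
# record, so the absence of one is weak evidence of incapability.
```

On those assumptions, most capable segments would show no giant earthquake since 1900, which is why the catalogue alone cannot settle the question the researchers pose next.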

“The main question is: are all subduction segments capable of generating giant earthquakes, or only some of them? And if only a limited number of them, then how can we identify these?” Dr Schellart said.

Dr Schellart, of the School of Geosciences, and Professor Nick Rawlinson from the University of Aberdeen in Scotland used earthquake data going back to 1900 and data from subduction zones to map the main characteristics of all active subduction zones on Earth. They investigated whether those subduction segments that have experienced a giant earthquake share commonalities in their physical, geometrical and geological properties.

They found that the main indicators include the style of deformation in the plate overlying the subduction zone, the level of stress at the subduction zone, the dip angle of the subduction zone, as well as the curvature of the subduction zone plate boundary and the rate at which it moves.

Through these findings Dr Schellart has identified several subduction zone regions capable of generating giant earthquakes, including the Lesser Antilles, Mexico-Central America, Greece, the Makran, Sunda, North Sulawesi and Hikurangi.

“For the Australian region subduction zones of particular significance are the Sunda subduction zone, running from the Andaman Islands along Sumatra and Java to Sumba, and the Hikurangi subduction segment offshore the east coast of the North Island of New Zealand. Our research predicts that these zones are capable of producing giant earthquakes,” Dr Schellart said.

“Our work also predicts that several other subduction segments that surround eastern Australia (New Britain, San Cristobal, New Hebrides, Tonga, Puysegur), are not capable of producing giant earthquakes.”

Journal Reference:

  1. W. P. Schellart, N. Rawlinson. Global correlations between maximum magnitudes of subduction zone interface thrust earthquakes and physical parameters of subduction zones. Physics of the Earth and Planetary Interiors, 2013; 225: 41. DOI: 10.1016/j.pepi.2013.10.001

Assessing the Impact of Climate Change On a Global Scale (Science Daily)

Dec. 16, 2013 — Thirty research teams in 12 different countries have systematically compared state-of-the-art computer simulations of climate change impacts to assess how climate change might influence global drought, water scarcity and river flooding in the future. What they found was:

• The frequency of drought may increase by more than 20 per cent in some regions.

• Without a reduction in global greenhouse-gas emissions, 40 per cent more people are likely to be at risk of absolute water scarcity.

• Increases in river flooding are expected in more than half of the areas investigated.

• Adverse climate change impacts can combine to create global ‘hotspots’ of climate change impacts.

Dr Simon Gosling from the School of Geography at The University of Nottingham co-authored four papers in this unique global collaboration. The results are published this week — Monday 16 December 2013 — in a special feature of the Proceedings of the National Academy of Sciences (PNAS).

For the project — ‘Intersectoral Impact Model Intercomparison Project (ISI-MIP)’ — Dr Gosling contributed simulations of global river flows to help understand how climate change might impact on global droughts, water scarcity and river flooding.

Dr Gosling said: “This research and the feature in PNAS highlights what could happen across several sectors if greenhouse gas emissions aren’t cut soon. It is complementary evidence to a major report I jointly led with the Met Office that estimated the potential impacts of unabated climate change for 23 countries. Those reports helped major economies commit, at the 17th UN Climate Change Conference of the Parties (COP17) in Durban, to take the action on climate change demanded by the science.”

One of the papers reports a likely increase in the global severity of drought by the end of the century, with the frequency of drought increasing by more than 20 per cent in some regions: South America, the Caribbean, and Central and Western Europe.

This in turn has an impact on water scarcity. Another paper co-authored by Dr Gosling shows that without reductions in global greenhouse-gas emissions, 40 per cent more people are likely to be at risk of absolute water scarcity than would be the case without climate change.

Dr Gosling said: “The global-level results are concerning but they hide important regional variations. For example, while some parts of the globe might see substantial increases in available water, such as southern India, western China and parts of Eastern Africa, other parts of the globe see large decreases in available water, including the Mediterranean, Middle East, the southern USA, and southern China.”

Another paper in the PNAS feature found that while river flooding could decrease by the end of the century across about a third of the globe, increases are expected at more than half of the areas investigated, under a high greenhouse gas emissions scenario.

Dr Gosling said: “More water under climate change is not necessarily always a good thing. While it can indeed help alleviate water scarcity assuming you have the infrastructure to store it and distribute it, there is also a risk that any reductions in water scarcity are tempered by an increase in flood hazard.”

The ISI-MIP team describe how adverse climate change impacts on flood hazard, drought, water scarcity, agriculture, ecosystems and malaria can combine to create global ‘hotspots’ of climate change impacts. The study is the first to identify hotspots across these sectors while being based on a comprehensive set of computer simulations both for climate change and for the impacts it is causing. The researchers identified the Amazon region, the Mediterranean and East Africa as regions that might experience severe change in multiple sectors.

The findings of the ISI-MIP are amongst the scientific publications that feed into the Intergovernmental Panel on Climate Change (IPCC) Working Group II report on climate change impacts to be presented in March 2014. The IPCC Working Group I report on physical climate science was published in September 2013.

Dr Gosling’s 23-volume report, Climate: observations, projections and impacts, commissioned by the Department of Energy and Climate Change (DECC) and jointly led with the UK Met Office, addressed an urgent international need for scientific evidence on the impacts of climate change presented in a consistent format for different countries, particularly those lacking an adequate research infrastructure, in order to facilitate valid international comparisons. Since COP17, the research has prompted governments to reconsider their options for adapting to climate change.

He said: “I think the results presented in the PNAS special feature have the potential for similar impact.”

Journal Reference:

  1. F. Piontek, C. Muller, T. A. M. Pugh, D. B. Clark, D. Deryng, J. Elliott, F. d. J. Colon Gonzalez, M. Florke, C. Folberth, W. Franssen, K. Frieler, A. D. Friend, S. N. Gosling, D. Hemming, N. Khabarov, H. Kim, M. R. Lomas, Y. Masaki, M. Mengel, A. Morse, K. Neumann, K. Nishina, S. Ostberg, R. Pavlick, A. C. Ruane, J. Schewe, E. Schmid, T. Stacke, Q. Tang, Z. D. Tessler, A. M. Tompkins, L. Warszawski, D. Wisser, H. J. Schellnhuber. Multisectoral climate impact hotspots in a warming world. Proceedings of the National Academy of Sciences, 2013; DOI: 10.1073/pnas.1222471110

Drought and Climate Change: An Uncertain Future? (Science Daily)

Dec. 16, 2013 — Drought frequency may increase by more than 20% in some regions of the globe by the end of the 21st century, but it is difficult to be more precise as we don’t know yet how changes in climate will impact on the world’s rivers.

The results come from a study, published in Proceedings of the National Academy of Sciences (PNAS), which examined computer simulations from an ensemble of state-of-the-art global hydrological models driven by the latest projections from five global climate models used for the fifth assessment report of the Intergovernmental Panel on Climate Change.

The research was led by Dr Christel Prudhomme from the UK’s Centre for Ecology & Hydrology working with colleagues from the UK, USA, the Netherlands, Germany and Japan.

Increasing concentrations of greenhouse gases in the atmosphere are widely expected to influence global climate over the coming century. The impact on drought is uncertain because of the complexity of the processes but can be estimated using outputs from an ensemble of global hydrological and climate models.

The new study concluded that an increase in global severity of hydrological drought — essentially the proportion of land under drought conditions — is likely by the end of the 21st century, with systematically greater increases if no climate change mitigation policy is implemented.

Under the ‘business as usual’ scenario (an energy-intensive world due to high population growth and a slower rate of technological development), droughts exceeding 40% of the analysed land area were projected by nearly half of the simulations carried out. This increase in drought severity has a strong signal-to-noise ratio at the global scale; this means we are relatively confident that an increase in drought will happen, but we do not know exactly by how much.
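The signal-to-noise language has a simple quantitative reading: the "signal" is the ensemble-mean change and the "noise" is the spread between ensemble members. A toy sketch with invented numbers (the real study uses the full multimodel ensemble):

```python
import numpy as np

# Sketch of the signal-to-noise idea: the "signal" is the ensemble-mean
# change in drought-affected land area, the "noise" is the spread
# between ensemble members. All numbers are invented for illustration.
rng = np.random.default_rng(1)

# Projected change (percentage points of land area under drought) from
# a hypothetical ensemble of hydrological-model x climate-model pairs.
ensemble_changes = rng.normal(loc=12.0, scale=6.0, size=35)

signal = ensemble_changes.mean()
noise = ensemble_changes.std(ddof=1)
print(f"signal = {signal:.1f} pp, noise = {noise:.1f} pp, "
      f"S/N = {signal / noise:.1f}")
# An S/N well above 1, with nearly all members positive, means we can
# be confident about the sign of the change but far less so about its
# magnitude, which is exactly the situation described above.
```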

Dr Prudhomme said, “Our study shows that the different representations of terrestrial water cycle processes in global hydrological models are responsible for a much larger uncertainty in the response of hydrological drought to climate change than previously thought. We don’t know how much changed climate patterns will affect the frequency of low flows in rivers.”

One important source of uncertainty is whether the models allow plants to adapt to a carbon dioxide-enriched atmosphere. If this is accounted for, the increase in droughts due to a warmer climate and changes in precipitation is mitigated by reduced evaporation from plants, because they become more efficient at capturing carbon during photosynthesis. The process of plant adaptation under an enriched carbon dioxide atmosphere is currently absent from the majority of conceptual hydrological models and is considered in only a few land surface and ecology models.

Dr Prudhomme added, “When assessing the impact of climate change on hydrology it is hence critical to consider a diverse range of hydrological models to better capture the uncertainty.”

Journal Reference:

  1. C. Prudhomme, I. Giuntoli, E. L. Robinson, D. B. Clark, N. W. Arnell, R. Dankers, B. M. Fekete, W. Franssen, D. Gerten, S. N. Gosling, S. Hagemann, D. M. Hannah, H. Kim, Y. Masaki, Y. Satoh, T. Stacke, Y. Wada, D. Wisser. Hydrological droughts in the 21st century, hotspots and uncertainties from a global multimodel ensemble experiment. Proceedings of the National Academy of Sciences, 2013; DOI: 10.1073/pnas.1222473110

Climate challenges (O Globo)

JC e-mail 4869, December 5, 2013

Article by Carlos Rittl* published in O Globo. In our case, we need to set aside the discourse that we have already done a lot, and more than others

The 19th Conference of the Parties to the UN Framework Convention on Climate Change, held in Warsaw, ended with weak results. Instead of responses to the climate emergency, countries such as Japan and Australia scaled back their commitments to cut greenhouse gas emissions. The lobby of those who do not want action, such as the fossil fuel sector, won. We all lost the most precious resource we have for solving the problem: time.

Perhaps the only good news from COP19 was the mobilization for COP20, in Lima, in 2014. The government of Peru has promised to restore confidence in the process. Global civil society is mobilizing to hold all governments to account in 2014. COP20 will be crucial: it must be defined who will “pay the bill” for climate change, how they will pay (emission cuts, financing, technology and so on) and when. Support for developing countries, especially the poorest and those most vulnerable to the effects of climate change, must also be guaranteed, along with the form and timescale of that support.

In 2014, every country must present its proposed emission reduction commitments for the post-2020 period. In our case, we need to set aside the discourse that we have already done a lot, and more than others; that our energy mix is clean; that we have reduced deforestation. None of our major development plans, nor our fiscal and tax policy, is tied to an economic logic based on progressive reductions of greenhouse gas emissions.

The 2022 Energy Generation Expansion Plan foresees more than R$ 800 billion of investment in fossil fuels, 72% of the country's total energy investment. Only 1% to 2% of the resources of the annual Agricultural and Livestock Plan are invested in low-carbon agriculture. Land use, energy and agriculture account for more than 90% of our emissions, as shown by the Greenhouse Gas Emissions Estimates System of the Observatório do Clima. With the Amazon deforestation rate jumping 28% in 2013 relative to 2012, the third-largest relative increase ever recorded, we will very likely have every sector of our economy contributing to rising emissions in 2013.

The update of the National Plan on Climate Change, recently the subject of public consultation, lists actions under way, but it is not at all strategic: it lacks well-defined targets, deadlines, a budget, and a monitoring and evaluation system.

To put our economy on the inevitable and strategic long-term path of low carbon emissions, the country must comply with the National Policy on Climate Change and bring the principles, objectives and guidelines of all government policies and programmes into line with those of that Policy. At the end of 2013, we are far from that.

*Carlos Rittl is executive secretary of the Observatório do Clima.

(O Globo)
http://oglobo.globo.com/opiniao/desafios-do-clima-10971024#ixzz2mc121Ii6

*   *   *

JC e-mail 4869, December 5, 2013

Panel says climate change poses short- and long-term risks

A report by the US National Research Council cites the possible collapse of polar sea ice, a potential mass extinction of plant and animal life, and the threat of dead zones in the ocean

Continued global warming poses a risk of rapid and drastic changes in some human and natural systems, a scientific panel warned on Tuesday, citing the possible collapse of polar sea ice, a potential mass extinction of plant and animal life, and the threat of dead zones in the ocean.

At the same time, some of the worst climate fears already embedded in the popular imagination can be dismissed as unlikely, at least over the next century, the panel concluded. These include a sudden surge of methane released from the oceans or the Arctic capable of frying the planet, as well as a shutdown of the heat circulation in the Atlantic Ocean that would cool nearby land areas, the fear that inspired the apocalyptic 2004 film “The Day After Tomorrow.”

The panel was appointed by the National Research Council, a nonprofit group in Washington that oversees studies of major scientific questions. In a report released on Tuesday, the panel called for the creation of a system to give society early warning of changes capable of producing chaos. Unpleasant climate surprises have already occurred, and new surprises seem inevitable, perhaps within a few decades, the panel members warned. But, they said, little has been done to prepare for them.

“The reality is that the climate is changing,” said James W. C. White, a paleoclimatologist at the University of Colorado Boulder, who chaired the committee on the impacts of abrupt climate change. “And it is going to keep changing, and will be part of everyday life for centuries to come. Perhaps even beyond that.”

Most climate scientists believe that man-made releases of greenhouse gases have made enormous changes to the Earth inevitable, but they also expect many of those changes to unfold at a pace slow enough for society to adapt.

The panel document released on Tuesday is the latest in a series of reports to consider the possibility of some changes occurring suddenly, causing social or environmental stress, and even collapse. Like the earlier reports, the new one considers many potential scenarios and dismisses most of them as unlikely, at least in the short term. But some of the risks are real, the panel points out, and in several cases have already materialized.

It cited the beetle outbreak in the American West and in Canada. Without the very cold winter nights that used to kill them, the beetles have destroyed tens of millions of hectares of forest. The damage is so severe it can be seen from space.

Likewise, a drastic decline in summer sea ice has occurred in the Arctic much faster than scientists expected. The panel warned that Arctic sea ice could disappear in summer within several decades, with severe impacts on wildlife and human communities in the region, and unknown effects on the world's weather patterns.

Among the greatest risks for the coming years, the panel foresees a rising rate of extinction of plants and animals, with climate change triggering the sixth mass extinction in Earth's history. Many of the world's coral reefs, vital sources of fish that feed millions of people, already seem doomed to disappear within a few decades.

Another risk, seen as moderately likely over the next century, is that warming of the upper ocean could reduce oxygen in the depths. In the worst case, large zones would form with too little oxygen for sea creatures to survive, with unknown consequences for the global ecology of the ocean, the panel said.

The report considered the possibility that a collapse of the West Antarctic ice sheet, regarded as especially vulnerable to ocean warming, would greatly accelerate the rate of sea level rise. In the short term, this risk is “unknown, but probably low.”

(Justin Gillis, The New York Times/O Globo)
http://oglobo.globo.com/ciencia/revista-amanha/painel-afirma-que-mudancas-climaticas-trazem-risco-curto-longo-prazo-10965038#ixzz2mbqJmetX

News related to the controversy surrounding shale gas

JC e-mail 4870, December 6, 2013

Lengthy public hearing on shale gas exploitation stirs controversy

The meeting was held yesterday by the Environment and Sustainable Development Committee of the Chamber of Deputies

The ongoing debates over shale gas exploitation in Brazil continue to generate controversy. The heart of the matter is the serious impacts on the environment and public health. Contamination of groundwater and excessive water use are the main criticisms of the way the gas is extracted.

Today a technique called hydraulic fracturing is used. In this process, after vertical and horizontal drilling, tonnes of water mixed with chemicals and sand are injected into the rock to extract the gas. The water used returns to the surface already polluted with hydrocarbons, metals and chemical additives.

To debate the issue, the Environment and Sustainable Development Committee of the Chamber of Deputies held another public hearing on Thursday morning (December 5) in Brasília.

Opposed to exploitation, the Green Party wants to block the procedure in Brazil. “We do not have the technological security to exploit this. Why not put the emphasis on renewable energy?” criticized Deputy Sarney Filho (PV/MA), the party's leader in the Chamber of Deputies. He said the PV will propose a five-year moratorium on prospecting for the resource. The parliamentarian said the party will take its cue from European Union countries, such as France.

“Shale exploitation is relatively new. It has been happening quite intensively in countries around the world. Studies need to be carried out cautiously,” the parliamentarian said. “And Brazil, unlike the United States and France, has alternatives. We have clean electricity production, almost all of it hydroelectric, and we have solar and wind energy potential that is being underused,” Sarney Filho explained.

According to Ricardo Baitelo, coordinator of the Energy Campaign at the NGO Greenpeace, shale gas is not indispensable at this moment. “Even if national energy demand more than doubles by 2050, we have enough renewable sources and conventional gas reserves to supply the demand of the industrial and electricity sectors,” he argued.

For Carlos Alberto Bocuhy, of the Instituto Brasileiro de Proteção Ambiental, the country “cannot embark on a technological adventure (shale gas exploitation) that still lacks answers.”

Luiz Fernando Scheibe, a professor at the Universidade Federal de Santa Catarina, warned that exploitation of unconventional, or shale, gas in the country should undergo a strategic environmental assessment before being authorized. The assessment, provided for by law, is a broader instrument than the environmental impact studies normally used to license energy projects.

Jailson de Andrade, also a researcher on the subject and a Councillor of the Sociedade Brasileira para o Progresso da Ciência (SBPC), recalled that most studies on the matter point to the need for prior local studies before exploitation. According to him, there is still a great deal of scientific controversy on the question.

“There is a study by the National Academy of Sciences, in the United States, showing that, in 141 drinking water wells in Pennsylvania, the closer a well was to unconventional gas extraction areas, the greater the amount of methane (toxic and flammable) in the water,” Jailson said. “The controversy in the literature is whether this already existed before or whether it is the result of drilling for gas,” the researcher observed.

He added: “Brazil is in a very comfortable position with regard to energy. Its energy mix is mostly hydro, renewable, and it has a biofuel programme that is the best in the world. So why enter this new era without the slightest energy need to justify it?”

Nomenclature
Although the term used is “shale gas” (gás de xisto), the specialists clarified during the debate that the issue concerns natural gas extracted from folhelho (shale, in English). Folhelho is a clayey rock of sedimentary origin; xisto (schist) is a metamorphic rock, of a different origin. But there is a long and mistaken Brazilian tradition of calling shale “xisto,” hence the widespread talk of “gás de xisto.”

Letter
The Sociedade Brasileira para o Progresso da Ciência (SBPC) and the Academia Brasileira de Ciências (ABC) sent a letter to President Dilma Rousseff requesting the suspension of the licensing round for shale gas exploitation until more conclusive studies on the question are carried out.

In the document, the president of the SBPC, Helena Nader, and the president of the ABC, Jacob Palis, justify their concern on the grounds that the economic exploitation of shale gas has been widely questioned for the irreversible environmental damage it can cause.

For this reason, they ask that, before the licensing round is held, new studies be carried out by public universities and research institutes on the real potential of the hydraulic fracturing method for extracting the product from the rocks and on the environmental harm it may cause.

Government
Otaviano da Cruz Pessoa, general manager of Petrobras's Executive Exploration Office, acknowledged that there are indeed risks in shale gas exploitation. But, he said, they are risks inherent in any energy activity, including conventional gas.

“The only difference between shale gas and traditional gas is that, in the case of shale, the rocks holding the gas contain fewer fluids and, because of that, you have to drill thousands of wells,” Pessoa explained.

According to Luciano Teixeira, a representative of the Agência Nacional de Petróleo, Gás Natural e Biocombustíveis (ANP), the risks inherent in shale gas exploitation are real and need to be better understood and mitigated. But, he said, commercial exploitation of the product will depend on prior authorization, based on criteria the agency is due to publish in January in a new regulation.

“This regulation rests heavily on the presentation of studies and documentation demonstrating that the operator is in a position to carry out the activity and that the environment where it will take place will be protected in the best possible way,” he stated.

Luciano Teixeira added: “And, with that, we count on the presentation of environmental licences, a strategy for the use and disposal of general effluents, and monitoring of the entire region for possible degradation of water resources.”

According to the ANP representative, the current research phase does not depend on prior authorization. That stage can take up to eight years, extendable by another six.

Auction
In an auction held by the ANP on November 28, 72 of the 240 blocks offered with potential for shale gas exploitation were sold. Petrobras will take part in exploration in 70% of the areas, located mainly in Sergipe, Alagoas, Bahia and Paraná. Initially, the companies are authorized only to conduct research to assess the economic, environmental and social safety of exploitation.

(Camila Cotta, with information from Beatriz Bulhões and Agência Câmara)

Archive stories from the Jornal da Ciência:

SBPC and ABC send a letter to President Dilma Rousseff requesting the suspension of the licensing round for shale gas exploitation
http://www.jornaldaciencia.org.br/Detalhe.php?id=88545

Scientists want to postpone shale exploitation
http://www.jornaldaciencia.org.br/Detalhe.php?id=90623

*   *   *

JC e-mail 4870, December 6, 2013

Petrobras says it may return shale blocks if exploitation proves unviable

Petrobras won 70% of the 72 blocks auctioned by the ANP last week

Questioned by civil society representatives at Thursday's public hearing on shale gas exploitation before the Environment and Sustainable Development Committee, Petrobras representative Otaviano da Cruz Pessoa said that, if research points to economic, regulatory or environmental insecurity, the company may return the blocks it won for unconventional gas exploitation. Petrobras won 70% of the 72 blocks auctioned by the Agência Nacional de Petróleo, Gás Natural e Biocombustíveis (ANP) last week.

“Although the areas have potential for unconventional resources, Petrobras prioritizes conventional ones,” he said. “If, at the end of the exploratory (research) phase, the resources prove economically viable and production proves safe and regulated, Petrobras may go ahead. If that is not the case, Petrobras may return areas,” he added.

Also questioned by parliamentarians and civil society representatives about the need for a shale gas auction at this moment, given the country's potential in other energy sources, including renewables, ANP representative Luciano Teixeira said that, in this field, the sooner studies begin, the better.

“We do not have much margin to wait for everything to happen before carrying out studies. Part of the studies involves seeing what we have there and whether it is viable,” Teixeira stressed.

The public hearing on shale gas exploitation has now ended.

(Ana Raquel Macedo/Agência Câmara)

*   *   *

JC e-mail 4870, December 6, 2013

National Water Resources Council wants more research on shale exploitation

Shale gas is stored in rocks underground, generally at depths of more than a thousand metres

The National Water Resources Council is due to vote on December 17 on a motion calling for greater emphasis on research before commercial exploitation of shale gas (technically, gas from folhelho, or shale) is authorized in the country. The information comes from Marcelo Medeiros, of the Water Resources Secretariat of the Ministry of the Environment, who took part in Thursday's public hearing on the subject before the Environment and Sustainable Development Committee.

According to Medeiros, the tender rules for last week's Agência Nacional de Petróleo, Gás Natural e Biocombustíveis (ANP) auction for unconventional or shale gas exploitation provide that the studies must be carried out over a period of five to eight years, extendable by another six. Of the 240 blocks offered at auction, 72 were sold, mainly in Sergipe, Alagoas, Bahia and Paraná.

“We are not against drilling for research; there has to be a certain level of knowledge about the question. Exploration for research should be carried out and, if need be, extended,” Medeiros said.

The Ministry of the Environment representative stressed that, in the short term, commercial shale gas exploitation using the hydraulic fracturing of underground rocks can lead to contamination of groundwater by methane gas (which is toxic and explosive) or by chemical substances, including radioactive ones, used in the process. There is, according to Marcelo Medeiros, concern about the amount of water consumed in the process and no clear definition of safe means of disposing of the possibly contaminated residual fracturing fluid.

Shale gas is stored in rocks underground, generally at depths of more than a thousand metres. To extract it, the rocks are broken, or fractured, by injecting large quantities of water, sand and chemicals.

More assessment
According to Fernando Roberto de Oliveira, Groundwater manager at the Implementation and Projects Superintendence of the Agência Nacional de Águas (ANA), the impacts of obtaining unconventional gas still need to be better assessed before the resources are released for commercial use.

“If we do not have local geological knowledge, our ability to move forward safely is compromised. We have to understand the hydrogeology better,” he explained.

One of the authors of the request for the hearing, Deputy Sarney Filho (PV-MA), warned that the sector lacks regulation. “We do not know what effects exploitation may have on aquifers, the environment and society,” he said.

The public hearing continues in Plenary Room 8.

(Ana Raquel Macedo/ Agência Câmara)

* * *

JC e-mail 4869, December 5, 2013

Committee discusses shale exploitation and its effects on the environment

There is concern about risks of underground leaks, aquifer contamination, damage to reservoirs and the possibility of seismic tremors

The Environment and Sustainable Development Committee is holding a public hearing today, at 10 a.m., to discuss shale exploitation in national territory and its effects on the environment. The event was requested by Deputies Sarney Filho (PV-MA), Penna (SP) and Pedro Uczai (PT-SC).

The parliamentarians are concerned about the shale gas auction proposed by the Agência Nacional de Petróleo, Gás Natural e Biocombustíveis (ANP): “This gas is already exploited in Brazil, in Paraná, but on a small scale. It is no benchmark for the large-scale projects being announced by the ANP.”

Also called unconventional gas, shale gas is stored in rocks underground, generally at depths of more than a thousand metres. To extract it, the rocks are blasted, or fractured, by injecting large quantities of water, sand and chemicals. The method is called hydraulic fracturing.

Rising consumption in the US
In the United States, shale gas today accounts for 16% of national natural gas demand; in 2000, it was only 1% of that total. Industry estimates suggest that by 2035 this source could meet 46% of US gas consumption.

The deputies argue that the environmental problems linked to shale gas exploitation are immense: “According to researchers, there are risks of underground leaks; aquifer contamination; damage to water-producing reservoirs; and the possibility of seismic tremors.”

They stress that the usual technology uses a great deal of water and, consequently, also generates a large volume of polluted liquid waste: “The industrial process is extremely dangerous. There is a high possibility of explosions, fires, fluid leaks contaminating the soil, and damage to the drilled wells.”

Invited speakers
Invited to discuss the subject with the deputies were:
– the representative of the Operational Safety and Environment area of the Agência Nacional do Petróleo (ANP), Luciano Silva Pinto Teixeira;
– the Groundwater manager of the Implementation and Projects Superintendence of the Agência Nacional de Águas, Fernando Roberto de Oliveira;
– the general manager of Onshore Basin Interpretation and Assessment in Petrobras's Exploration and Production area, Otaviano da Cruz Pessoa;
– Universidade Federal de Santa Catarina professor Luiz Fernando Scheibe;
– the specialist in the environmental effects of shale gas prospecting, Jailson de Andrade;
– the coordinator of the Renewable Energy Campaign of Greenpeace Brasil, Ricardo Baitelo; and
– the president of the Instituto Brasileiro de Proteção Ambiental, Carlos Alberto Hailer Bocuhy.
The hearing will take place in Plenary Room 8.

(Agência Câmara)

* * *

JC e-mail 4865, November 29, 2013

Indigenous peoples' concerns about oil and gas exploitation will be taken to the authorities

Funai charged that the ANP did not take into account the Foundation's report on the auction of 240 oil and gas blocks being held today in Rio

The suggestions and concerns presented by participants in the hearing that discussed the auction of oil and gas blocks overlapping indigenous lands and conservation units will be compiled into a document and taken to various authorities, including the Ministry of Mines and Energy and the Presidency of the Republic. The initiative will be supported by the chairman of the Participatory Legislation Committee, Deputy Lincoln Portela (PR-MG).

In Thursday's debate, which ended a short while ago, Funai charged that the Agência Nacional de Petróleo (ANP) did not take into account the Foundation's report on the auction of 240 oil and gas blocks being held today in Rio de Janeiro.

Representatives of indigenous communities, for their part, said they are worried about the environmental preservation of exploitation areas where indigenous communities live, and said they are prepared to go to war over the cause.

(Silvia Mugnatto/Agência Câmara)

Anthropology and the Anthropocene (Anthropology News)

By Anthropology News on December 17, 2013 at 2:44 pm

By Amelia Moore

“The Anthropocene” is a label that is gaining popularity in the natural sciences.  It refers to the pervasive influence of human activities on planetary systems and biogeochemical processes.   Devised by Earth scientists, the term is poised to formally end the Holocene Epoch as the geological categorization for Earth’s recent past, present, and indefinite future.  The term is also poised to become the informal slogan of a revitalized environmental movement that has been plagued by popular indifference in recent years.

Climate change is the most well known manifestation of anthropogenic global change, but it is only one example of an Anthropocene event.  Other examples listed by the Earth sciences include biodiversity loss, changes in planetary nutrient cycling, deforestation, the hole in the ozone layer, fisheries decline, and the spread of invasive species.  This change is said to stem from the growth of the human population and the spread of resource intensive economies since the Industrial Revolution (though the initial boundary marker is in dispute with some scientists arguing for the Post-WWII era and others for the advent of agriculture as the critical tipping point).  Whatever the boundary, the Anthropocene signifies multiple anthropological opportunities.

What stance should we, as anthropologists, take towards the Anthropocene? I argue that there are two (and likely more) equally valid approaches to the Anthropocene: anthropology in the Anthropocene and anthropology of the Anthropocene.  Anthropology in the Anthropocene already exists in the form of climate ethnography and work that documents the lived experience of global environmental change.  Arguably, ethnographies of protected areas and transnational conservation strategies exemplify this field as well.  Anthropology in the Anthropocene is characterized by an active concern for the detrimental effects of anthropogenesis on populations and communities that have been marginalized and made to bear the brunt of global change impacts, or that have been haphazardly caught up in global change solution strategies.  This work is engaged with environmental justice and oriented towards political action.

Anthropology of the Anthropocene is much smaller and less well known than anthropology in the Anthropocene, but it will be no less crucial.  Existing work in this vein includes those who take a critical stance towards climate science and politics as social processes with social consequences.  Beyond deconstruction, these critical scholars investigate what forms scientific and political assemblages create and how they participate in remaking the world anew.  Other existing research in this mode interrogates the idea of biodiversity and the historical and cultural context for the notion of anthropogenesis itself.  In the near future, we will see more work that can enquire into both the sociocultural and socioecological implications and manifestations of Anthropocene discourse, practice and logic.

I have only created cursory sketches of anthropology in the Anthropocene and anthropology of the Anthropocene here.  However, these modes are not at all mutually exclusive, and they should inspire many possibilities for future work.  The centrality of anthropos, the idea of the human, within the logics of the Anthropocene is an invitation for anthropology to renew its engagements with the natural sciences in research collaborations and as the object of research, especially the ecological and Earth sciences.

For starters, we should consider the implications of the Anthropocene idea for our understandings of history and collectivity.  If the natural world is finally gaining recognition within the authoritative sciences as intimately interconnected with human life such that these two worlds cease to be separate arenas of thought and action or take on different salience, then both the Humanities and the natural sciences need to devise more appropriate modes of analysis that can speak to emergent socioecologies.  This has begun in anthropology with some recent works of environmental health studies, political ecology, and multispecies ethnography, but is still in its infancy.

In terms of opportunities for legal and political engagement, the Anthropocene signifies possibilities for reconceptualizing environmentalism, conservation and development.  Anthropologists should be cognizant of new design paradigms and models for organizing socioecological collectives from the urban to the small island to the riparian.  We should also be on the lookout for new political collaborations and publics creating conversations utilizing multiple avenues for communication in the academic realm and beyond.  Emergent asymmetries in local and transnational markets and the formation of new multi-sited assemblages of governance should be of special importance.

In terms of science, the Anthropocene signals new horizons for studying and participating in global change science.  The rise of interdisciplinary socioecology, the biosciences of coupled natural and human complexity, geoengineering and the biotech interest in de-extinction are just a sampling of important transformations in research practices, research objects, and the shifting boundaries between the lab and the field.  Ongoing scientific reorientation will continue to yield new arguments about emergent forms of life that will participate in the creation of future assemblages, publics, and movements.

I would also like to caution against potentially unhelpful uses of the Anthropocene idea.  The term should not become a brand signifying a specific style of anthropological research.  It should not gloss over rigid solidifications of time, space, the human, or life.  We should not celebrate creativity in the Anthropocene while ignoring instances of stark social differentiation and capital accumulation, just as we should not focus on Anthropocene assemblages as only hegemonic in the oppressive sense.   Further, we should be cautious with our utilization of the crisis rhetoric surrounding events in the Anthropocene, recognizing that crisis for some can be turned into multiple forms of opportunity for others.  Finally, we must admit the possibility that the Anthropocene may not succeed in gaining lasting traction through formal designation or popularization, and we should not overstate its significance by assuming its universal acceptance.

In the next year, the Section News Column of the Anthropology and Environment Society will explore news, events, projects, and arguments from colleagues and students experimenting with various framings of the Anthropocene in addition to its regular content.  If you would like to contribute to this column, please contact Amelia Moore at a.moore4@miami.edu.

*   *   *

ANTHROPOLOGY AND ENVIRONMENT SOCIETY

Big Data and the Science of the Anthropocene

By Anthropology News on December 17, 2013 at 2:44 pm

By Lizzy Hare

In her September Section News Column, “Anthropology and the Anthropocene,” Amelia Moore made a distinction between anthropology in the Anthropocene and anthropology of the Anthropocene. The distinction is made between those who research the effects of global change and those who investigate the concept of the Anthropocene as a social process. My own research related to the Anthropocene is not on the effects of climate change. Rather, it focuses on the process of establishing credibility, authority, and trust through scientific knowledge.  I am following the process of developing an ecosystem forecast model. This model will provide land managers and policy makers with predictions about landscape and vegetation responses to climate change. Following the model’s development serves as an entry point for exploring what counts as credible scientific knowledge about climate change, who gets to decide what counts, and how credibility is determined.  It is fair to describe my research as “anthropology of the Anthropocene.” However, framing it in this way makes it too easy to neglect the generative nature of the Anthropocene as a concept.

As it is used colloquially, the Anthropocene carries heavy connotations of destruction and degradation, and I do not want to discount the serious environmental consequences of global change, or the inequitable distribution of their effects. But the Anthropocene as a concept also has political and technological consequences. Scientists and policymakers who wish to understand, predict, and manage the consequences of this new anthropogenic geological epoch have pushed forward tremendous innovations in science and technology. The Anthropocene is thus not only about unprecedented human impact on the planet, but also about unprecedented changes in technology, such as the rise of global connectivity and computing power that made “Big Data” possible.

“Big Data” typically refers to massive data sets of quantitative data, often originally collected automatically and for non-specific purposes. Big Data’s optimistic supporters claim that they will be able to revolutionize science by using statistics to mine large sets of data rather than tackling each research question with a different set of methods and tools. While Big Data techniques have led to the success of companies like Google, it remains unclear how or even whether automated data collection and statistical analysis can produce more than large-scale correlations. Recently, however, scientists have been working to develop tools for incorporating Big Data with more traditional empirical data by using simulation models. Scientists are developing this technique for use in climate, weather, and ecological forecast models, as a way to reduce uncertainty in forecasts by constraining them with observed data.
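The essence of data assimilation can be shown in a few lines. The sketch below is a generic scalar Kalman-style update, not the ecosystem modelers' actual code, and every number in it is invented: a model forecast and an observation, each with an uncertainty, are merged into an estimate that is more certain than either.

```python
# A toy illustration of the data assimilation idea: an observation is
# used to pull a model forecast back toward reality, weighted by the
# two uncertainties (variances).

def assimilate(forecast: float, forecast_var: float,
               observation: float, observation_var: float) -> tuple[float, float]:
    """One scalar Kalman-style update: returns the combined estimate
    and its (reduced) variance."""
    gain = forecast_var / (forecast_var + observation_var)
    analysis = forecast + gain * (observation - forecast)
    analysis_var = (1 - gain) * forecast_var
    return analysis, analysis_var

# Hypothetical example: the model says a vegetation quantity is 210
# units (with large uncertainty); a sensor network reports 190 (with
# smaller uncertainty).
state, var = assimilate(forecast=210.0, forecast_var=400.0,
                        observation=190.0, observation_var=100.0)
print(f"analysis = {state:.1f}, variance = {var:.1f}")
# The analysis (194.0) sits nearer the more precise observation, and
# its variance (80.0) is smaller than either input alone.
```

Real forecast systems apply this logic to enormous state vectors using ensemble methods, but the principle of weighting forecast against observation by their respective uncertainties is the same.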

Data assimilation is not the only way that modelers have tried to control uncertainties within climate models. Some political leaders have misconstrued climate science, and it has come under intense scrutiny by multiple government committees following the 2009 “Climategate” scandal. The critics of climate science cite the uncertainties inherent in forecasting as well as concerns that scientists with political agendas manipulate data. This specter hangs over US climate science, and one response has been to develop a quantitative scale for uncertainty in forecasts. This move is grounded in the assumption that quantification is an effective technique for neutralizing information, and it displaces concern and politics onto users of the quantitative information. This is especially attractive when trying to convey information as (potentially) dire as the consequences of climate change.

The Anthropocene as a concept asks us to pay attention to changes in the world around us. These changes have environmental, social, and political impacts. In efforts to understand the environmental changes of the Anthropocene, and to respond to changes in political and social order, both anticipated and actualized, scientists have developed new tools and techniques. Many claim that Big Data techniques are revolutionizing science, but it is probably too early to assess that claim. Techniques for assimilating Big Data into climate models are just one example of technological and scientific developments of the Anthropocene. There are certainly many more. The generative potential of this epoch should be a site for ongoing anthropological inquiry because it has the ability to drastically change the world we live in.

The effects of global change—and thus the scope of anthropology in the Anthropocene—will be vast, even more so if we take seriously the impacts that this epoch has had and will have on science and technology. The lived experience of global environmental change is not limited to encounters with environmental catastrophe. New technologies will have consequences for everyone, perhaps especially for those who cannot access them. As anthropologists, we ought to be attentive to what the Anthropocene is capable of producing, not only what it is capable of destroying.

Lizzy Hare is a doctoral student in the department of anthropology at the University of California, Santa Cruz.

Scientists Anticipated Size and Location of 2012 Costa Rica Earthquake (Science Daily)

Dec. 22, 2013 — Scientists using GPS to study changes in Earth’s shape accurately forecasted the size and location of the magnitude 7.6 Nicoya earthquake that occurred in 2012 in Costa Rica.

Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, performs a GPS survey in Costa Rica’s Nicoya Peninsula in 2010. (Credit: Lujia Feng)

The Nicoya Peninsula in Costa Rica is one of the few places where land sits atop the portion of a subduction zone where Earth’s greatest earthquakes take place. Costa Rica’s location therefore makes it the perfect spot for learning how large earthquakes rupture. Because earthquakes greater than about magnitude 7.5 have occurred in this region roughly every 50 years, with the previous event striking in 1950, scientists have been preparing for this earthquake through a number of geophysical studies. The most recent study used GPS to map out the area along the fault storing energy for release in a large earthquake.

“This is the first place where we’ve been able to map out the likely extent of an earthquake rupture along the subduction megathrust beforehand,” said Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology.

The study was published online Dec. 22, 2013, in the journal Nature Geoscience. The research was supported by the National Science Foundation and was a collaboration of researchers from Georgia Tech, the Costa Rica Volcanological and Seismological Observatory (OVSICORI) at Universidad Nacional, the University of California, Santa Cruz, and the University of South Florida.

Subduction zones are locations where one tectonic plate is forced under another. The collision of tectonic plates during this process can unleash devastating earthquakes, and sometimes devastating tsunamis. The magnitude 9.0 earthquake off the coast of Japan in 2011 was due to just such a subduction zone earthquake. The Cascadia subduction zone in the Pacific Northwest is capable of unleashing a similarly sized quake. Damage from the Nicoya earthquake, however, was not as bad as might be expected from a magnitude 7.6 quake.

“Fortunately there was very little damage considering the earthquake’s size,” said Marino Protti of OVSICORI and the study’s lead author. “The historical pattern of earthquakes not only allowed us to get our instruments ready, it also allowed Costa Ricans to upgrade their buildings to be earthquake safe.”

Plate tectonics are the driving force for subduction zones. As tectonic plates converge, strain temporarily accumulates across the plate boundary when portions of the interface between these tectonic plates, called a megathrust, become locked together. The strain can accumulate to dangerous levels before eventually being released as a massive earthquake.

“The Nicoya Peninsula is an ideal natural lab for studying these events, because the coastline geometry uniquely allows us to get our equipment close to the zone of active strain accumulation,” said Susan Schwartz, professor of earth sciences at the University of California, Santa Cruz, and a co-author of the study.

Through a series of studies starting in the early 1990s using land-based tools, the researchers mapped regions where tectonic plates were completely locked along the subduction interface. Detailed geophysical observations of the region allowed the researchers to create an image of where the faults had locked.

A few months before the earthquake, the researchers published a study describing the particular locked patch with the clearest potential for the next large earthquake in the region. The team projected the total amount of energy that could have accumulated across that patch and forecast that, if the fault had remained locked since the last major earthquake in 1950, enough energy was already stored for an earthquake on the order of magnitude 7.8.
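
To make the forecasting arithmetic concrete, here is a minimal back-of-the-envelope sketch of how a locked patch's accumulated slip deficit converts into a moment magnitude. The patch dimensions, convergence rate, and rigidity below are plausible round numbers assumed for illustration, not values taken from the study.

```python
# Back-of-the-envelope conversion of a locked-patch slip deficit into a
# moment magnitude. All input values are illustrative assumptions, not
# figures from the Nicoya study itself.
import math

rigidity = 3.0e10       # shear modulus of crustal rock, Pa (typical value)
length_m = 110e3        # assumed along-strike length of the locked patch, m
width_m = 40e3          # assumed down-dip width of the locked patch, m
rate_m_per_yr = 0.085   # assumed plate convergence rate, m/yr
years_locked = 62       # 1950 (last major rupture) to 2012

# If the patch is fully locked, the unreleased slip grows linearly in time.
slip_deficit_m = rate_m_per_yr * years_locked

# Seismic moment M0 = rigidity * fault area * slip, in N*m.
moment = rigidity * (length_m * width_m) * slip_deficit_m

# Standard moment-magnitude relation (Hanks-Kanamori), M0 in N*m.
mw = (2.0 / 3.0) * (math.log10(moment) - 9.1)
print(f"Slip deficit: {slip_deficit_m:.1f} m -> Mw ~ {mw:.1f}")
```

With these assumed numbers the script prints a magnitude near 7.8; a partially locked or smaller patch would lower the estimate, consistent with the actual event coming in slightly smaller, at magnitude 7.6.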

Because of limits in technology and scientific understanding about processes controlling fault locking and release, scientists cannot say much about precisely where or when earthquakes will occur. However, earthquakes in Nicoya have occurred about every 50 years, so seismologists had been anticipating another one around 2000, give or take 20 years, Newman said. The earthquake occurred in September of 2012 as a magnitude 7.6 quake.

“It occurred right in the area we determined to be locked and it had almost the size we expected,” Newman said.

The researchers hope to apply what they’ve learned in Costa Rica to other environments. Virtually every damaging subduction zone earthquake occurs far offshore.

“Nicoya is the only place on Earth where we’ve actually been able to get a very accurate image of the locked patch because it occurs directly under land,” Newman said. “If we really want to understand the seismic potential for most of the world, we have to go offshore.”

Scientists have been able to reasonably map portions of these locked areas offshore using data on land, but the resolution is poor, particularly in the regions that are most responsible for generating tsunamis, Newman said. He hopes that his group’s work in Nicoya will be a driver for geodetic studies on the seafloor to observe such Earth deformation. These seafloor geodetic studies are rare and expensive today.

“If we want to understand the potential for large earthquakes, then we really need to start doing more seafloor observations,” Newman said. “It’s a growing push in our community and this study highlights the type of results that one might be able to obtain for most other dangerous environments, including offshore the Pacific Northwest.”

Journal Reference:

  1. Marino Protti, Victor González, Andrew V. Newman, Timothy H. Dixon, Susan Y. Schwartz, Jeffrey S. Marshall, Lujia Feng, Jacob I. Walter, Rocco Malservisi, Susan E. Owen. Nicoya earthquake rupture anticipated by geodetic measurement of the locked plate interface. Nature Geoscience, 2013; DOI: 10.1038/ngeo2038

Poor Countries Are 100 Years Behind Rich Ones in Climate Preparedness (CarbonoBrasil)

Dec. 16, 2013 – 11:52 a.m.

by Jéssica Lipinski, CarbonoBrasil


New data from the University of Notre Dame's Global Adaptation Index highlight the disparities facing poor countries and the risks to their climate resilience; Brazil ranks 68th, with a rating considered upper-middle

A new report published by researchers at the University of Notre Dame states that it will take more than a century for developing countries to reach the level of climate preparedness that developed nations already enjoy.

The University of Notre Dame Global Adaptation Index (ND-GAIN), released on Thursday (Dec. 12), assessed 175 countries, focusing on issues such as nations' vulnerability to climate change, global warming, and extreme weather events like severe droughts, devastating storms, and natural disasters.

Some examples of countries on this 100-year trajectory include Cambodia, Kenya, and Haiti. "Given the recent typhoon in the Philippines, some people may be wondering where that island nation falls short in terms of readiness," commented Nitesh Chawla, director of the Interdisciplinary Center for Network Science and Applications.

"According to the data, the Philippines is more than 40 years behind the most developed countries in climate preparedness. Although that gap is smaller than for the poorest countries, it shows that the Philippines still has a long way to go," Chawla continued.

Some of the most industrialized emerging countries, such as Brazil, received a rating considered upper-middle, indicating a relatively satisfactory level of resilience. Brazil placed 68th overall, ranking 56th in vulnerability and 79th in preparedness.

"We knew there were disparities between the richest and poorest countries when it came to adaptation and preparedness for climate change," said Jessica Hellmann, a biologist at the University of Notre Dame.

"But we did not know it would take more than 100 years for the poorest countries to reach the levels of preparedness that the richest countries have already achieved," she added.

But the experts who worked on the report stated that, according to the research, not even developed countries are exactly proof against climate change and global warming.

On the contrary, the document suggests that, although they are making efforts to increase their resilience to the natural phenomena and extreme weather events occurring in their territories, there is still room for improvement.

"These data are worrying because they show just how unprepared some of the most vulnerable nations really are. But they also show that the most developed countries are not doing enough, which raises serious public policy questions, no matter how well developed a national economy may be," Hellmann observed.

The researchers hope the findings will help world leaders set global, regional, and national priorities, as well as spur preparation for climate change.

* Originally published on the CarbonoBrasil website.

Plano Clima: Final Version to Be Presented in the First Quarter of 2014 (Ministério do Meio Ambiente)

Dec. 13, 2013 – 12:16 p.m.

by Tinna Oliveira, MMA


Klink: society's proposals were incorporated. Photo: Martim Garcia/MMA

In-person meeting marks the end of the public consultation on the Plano Clima

Civil society has contributed, through a public consultation, to the update of the National Plan on Climate Change (Plano Clima), the main instrument for implementing the National Policy on Climate Change. The online public consultation was open from September 25 to November 8, and the final in-person meeting took place on Thursday (Dec. 12). During that period, any Brazilian citizen could submit contributions through a form available online. From a total of 27 forms submitted, the online consultation yielded 111 contributions. The final version of the revised plan is expected to be presented in the first quarter of 2014.

The Ministry of the Environment (MMA) coordinates the Executive Group (GEx) of the Interministerial Committee on Climate Change (CIM). Introduced by the federal government in 2008, the Plano Clima aims to encourage the development and improvement of mitigation actions in Brazil, contributing to the global effort to reduce greenhouse gas emissions, and to create the domestic conditions for dealing with the impacts of global climate change (adaptation).

Assessment

The MMA's Secretary of Climate Change and Environmental Quality, Carlos Klink, stressed that the public consultation made it possible to incorporate the advances Brazil has made on climate change and their connections to the international negotiations. "This shows the scale of ambition the climate agenda has within the country: it is not just an international issue; Brazilian society is also deeply engaged," he emphasized.

Klink notes that there are already nine mitigation plans and that the National Adaptation Plan, scheduled for completion by 2015, is now being drawn up. Climate change is a prominent topic in the country. "We are becoming an example internationally and, here in Brazil, the issue is putting down very strong roots in every sector of society," he explained. For the secretary, this governance structure enables a dialogue for building and drafting all these plans, involving every sector inside and outside government. "The document reflects this progress and shows, in condensed form, this tremendous coordination effort," he noted.

Stages

The update of the Plano Clima went through several stages. Since January, there have been 17 meetings of the Executive Group and seven meetings of the Brazilian Forum on Climate Change (FBMC). "The forum is the channel between society and the government on the climate issue, which is why we have always encouraged society to use the Forum in these discussions," explained the MMA's director of climate change, Adriano Santhiago.

According to him, several sectors brought contributions that were incorporated into the text presented during the online consultation. Public input closed with this in-person meeting, which brought together representatives of government, academia, the productive sector, and civil society. The next step is an internal government discussion to finalize the document.

In 2009, the National Congress approved the National Policy on Climate Change, breaking new ground by adopting several voluntary national emission-reduction commitments. In addition, the National Climate Change Fund was created and several sectoral plans were launched. Other notable achievements include the substantial reduction in deforestation, the changing profile of national greenhouse gas emissions, and the far-reaching transformation in how various sectors, governmental and otherwise, have engaged in the effort to confront climate change.

* Originally published on the Ministério do Meio Ambiente website.

Frontiers of Biotechnology (O Estado de S.Paulo)

JC e-mail 4872, December 10, 2013

Article by Xico Graziano published in Estadão

Transgenic plants are here to stay. And to prevail. Their varieties have come to dominate Brazil's grain harvest. In the technological race, nothing holds back genetic engineering. Science is overcoming obscurantist fear.

Genetically modified soybean, corn, and cotton crops, in that order, lead the way, accounting for two-thirds of the country's planted area. Productivity, ease of management, and savings on pesticides: these are the main reasons for their remarkable performance. Agronomic problems, such as herbicide-resistant weeds or pest resurgence, do exist, but they resemble those of conventional crops. No environmental tragedy, nor any harm to human health, has been shown to result specifically from the use of transgenics.

For centuries, traditional breeding has modified organisms. The varieties planted or raised today bear little resemblance to their ancestors: chickens are no longer free-range, corn stands upright, fruits are losing their seeds. No food remains "natural." The game changed, however, when scientists discovered how to modify species' DNA artificially. Without sexual crossing.

It all began in 1972. Researchers noticed that parasites of the genus Agrobacterium transferred parts of their germplasm to host plants, stimulating them to produce the sugars on which the parasites fed; in other words, a mechanism of transgenesis already existed in nature. Ten years later, in Ghent, Belgium, scientists performed transgenesis in the laboratory for the first time. Soon afterward, certain bacteria were genetically modified to produce human insulin. Diabetics celebrated. Science had taken a tremendous leap.

Since then, leading teams in public and private laboratories have invested in genetic engineering, supercharging biotechnology worldwide. It first made its mark in the manipulation of microorganisms. Then, in 1996, it reached the field with the launch of a soybean variety resistant to herbicide application. The great controversy began. Environmental activists denounced "Frankenstein food." Religious groups condemned scientists for manipulating life. Public opinion grew confused.

That understandable fear led to the proposal of a five-year "moratorium," a precaution adopted by the European Union in 1999. The period was considered long enough to resolve doubts about the new technology. Time passed, genetic engineering evolved, and religious and ideological preconceptions gave way to scientific evidence. New transgenic traits emerged and barriers kept falling. Today, modern genetically altered crop varieties are grown in 50 countries by 17.3 million farmers, occupying 10% of the world's arable land. This is no longer an experiment.

Biotechnological novelties keep coming. Among animals, transgenic goats are being developed whose milk contains a protein typical of spider silk, capable of yielding highly resistant polymers. Among plants, there is excitement about the prospect of varieties that withstand water stress. At Embrapa, a gene from drought-resistant coffee plants was introduced into tobacco plants, enabling them to tolerate a lack of water in the soil. In Israel, scientists at the Institute of Technology altered lettuce genes to keep the leaves from wilting after harvest. Sensational.

So-called recombinant DNA techniques are sweeping through medicine. Using them, the Instituto Butantã (São Paulo) recently developed a vaccine against hepatitis B; intervention in viral genomes is also yielding vaccines against influenza, dengue, whooping cough, and tuberculosis. At the USP Medical School in Ribeirão Preto, a transgenic vaccine against cancer is under study. Pigs genetically modified in Munich, Germany, provoked only a weak reaction from the human immune system, opening the way for xenotransplants.

Genetically modified bacteria, yeasts, and fungi have long been used in food manufacturing. These microorganisms act directly in fermentation processes, producing cheeses, doughs, and beer; they even help define the aroma of foods and drinks. Cellulosic ethanol, made from sugarcane bagasse or grasses, will come from genetically modified yeasts. In industry, laundry detergent contains enzymes, derived from transgenic bacteria, that help break down grease in fabrics.

At the frontier of biotechnology, an incredible technique is being developed here at Embrapa, involving constitutive promoters, that can restrict the expression of certain transgenic proteins in the leaves and fruits of modified plants. In other words, the plant is transgenic, but its fruits or grains escape the altered DNA. The progress of genetic engineering, the foundation of biotechnology, is extraordinary in every field, giving the impression that the best is yet to come.

Why, then, in the face of so much success, are there still objections branding transgenics as products of evil? Good question. The answer lies in the prejudice created early on. Strictly speaking, today's transgenic products, subject to ultra-rigorous legislation, are quite safe for consumption. Other foods, by contrast, though "conventional," look more like chemical bombs: salty snacks, canned goods, mayonnaise, insipid sweets; those concoctions do, with impunity, wreck our health.

Conclusion: transgenic or conventional matters little. What matters is that food be healthy.

Xico Graziano is an agronomist and former Secretary of Agriculture and former Secretary of the Environment of the State of São Paulo.

http://www.estadao.com.br/noticias/impresso,fronteiras–da-biotecnologia-,1106577,0.htm

Geoengineering Approaches to Reduce Climate Change Unlikely to Succeed (Science Daily)

Dec. 5, 2013 — Reducing the amount of sunlight reaching the planet’s surface by geoengineering may not undo climate change after all. Two German researchers used a simple energy balance analysis to explain how Earth’s water cycle responds differently to heating by sunlight than it does to warming due to a stronger atmospheric greenhouse effect. Further, they show that this difference implies that reflecting sunlight to reduce temperatures may have unwanted effects on Earth’s rainfall patterns.

Heavy rainfall events may become more common in a warmer world. (Credit: Annett Junginger, distributed via imaggeo.egu.eu)

The results are now published in Earth System Dynamics, an open access journal of the European Geosciences Union (EGU).

Global warming alters Earth’s water cycle, since more water evaporates into the air as temperatures increase. Increased evaporation can dry out some regions while, at the same time, causing more rain to fall in other areas because of the excess moisture in the atmosphere. The more water that evaporates per degree of warming, the stronger the influence of rising temperature on the water cycle. But the new study shows the water cycle does not react the same way to different types of warming.

Axel Kleidon and Maik Renner of the Max Planck Institute for Biogeochemistry in Jena, Germany, used a simple energy balance model to determine how sensitive the water cycle is to an increase in surface temperature due to a stronger greenhouse effect and to an increase in solar radiation. They predicted the response of the water cycle for the two cases and found that, in the former, evaporation increases by 2% per degree of warming while in the latter this number reaches 3%. This prediction confirmed results of much more complex climate models.
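
As a rough back-of-the-envelope illustration of what those two sensitivities imply for solar geoengineering, the sketch below applies them to an assumed global-mean evaporation rate; the 2%-per-degree and 3%-per-degree figures come from the study, while the baseline of 1,000 mm per year is just an assumed round number of the right order of magnitude.

```python
# Toy illustration of the asymmetry found by Kleidon and Renner:
# evaporation responds ~2%/K to greenhouse warming but ~3%/K to changes
# in sunlight. The baseline evaporation value is an assumed round number.
baseline_evap_mm_yr = 1000.0  # assumed global-mean evaporation, mm/yr
greenhouse_sens = 0.02        # fractional evaporation change per K (greenhouse)
solar_sens = 0.03             # fractional evaporation change per K (sunlight)

warming_K = 2.0               # greenhouse warming to be offset by dimming

# The greenhouse effect added evaporation at the weaker sensitivity...
added = baseline_evap_mm_yr * greenhouse_sens * warming_K    # +40 mm/yr
# ...but cancelling the same warming by reducing sunlight removes
# evaporation at the stronger solar sensitivity.
removed = baseline_evap_mm_yr * solar_sens * warming_K       # -60 mm/yr

net = added - removed
print(f"Evaporation change at restored temperature: {net:+.0f} mm/yr")
```

Even though the global-mean temperature is restored, evaporation ends up below its original value (here by about 20 mm per year), which is precisely the slowdown of the water cycle the authors warn about.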

“These different responses to surface heating are easy to explain,” says Kleidon, who uses a pot on the kitchen stove as an analogy. “The temperature in the pot is increased by putting on a lid or by turning up the heat — but these two cases differ by how much energy flows through the pot,” he says. A stronger greenhouse effect puts a thicker ‘lid’ over Earth’s surface but, if there is no additional sunlight (if we don’t turn up the heat on the stove), extra evaporation takes place solely due to the increase in temperature. Turning up the heat by increasing solar radiation, on the other hand, enhances the energy flow through Earth’s surface because of the need to balance the greater energy input with stronger cooling fluxes from the surface. As a result, there is more evaporation and a stronger effect on the water cycle.

In the new Earth System Dynamics study, the authors also show how these findings have profound consequences for geoengineering. Many geoengineering approaches aim to reduce global warming by reducing the amount of sunlight reaching Earth’s surface (or, in the pot analogy, by turning down the heat on the stove). But when Kleidon and Renner applied their results to such a geoengineering scenario, they found that the changes in temperature and in the water cycle cannot both be compensated for at the same time. Reflecting sunlight by geoengineering is therefore unlikely to restore the planet’s original climate.

“It’s like putting a lid on the pot and turning down the heat at the same time,” explains Kleidon. “While in the kitchen you can reduce your energy bill by doing so, in the Earth system this slows down the water cycle with wide-ranging potential consequences,” he says.

Kleidon and Renner’s insight comes from looking at the processes that heat and cool Earth’s surface and how they change when the surface warms. Evaporation from the surface plays a key role, but the researchers also took into account how the evaporated water is transported into the atmosphere. They combined simple energy balance considerations with a physical assumption for the way water vapour is transported, and separated the contributions of surface heating from solar radiation and from increased greenhouse gases in the atmosphere to obtain the two sensitivities. One of the referees for the paper commented: “it is a stunning result that such a simple analysis yields the same results as the climate models.”

Journal Reference:

  1. A. Kleidon, M. Renner. A simple explanation for the sensitivity of the hydrologic cycle to global climate change. Earth System Dynamics Discussions, 2013; 4 (2): 853. DOI: 10.5194/esdd-4-853-2013

The India Problem (Slate)

Why is it thwarting every international climate agreement?

NOV. 27 2013 12:44 PM


Haze in Mumbai, 2009

India has stalled international greenhouse gas accords because climate change isn’t a winning election issue in the developing country. 

Photo by Arko Datta/Reuters

A powerful but unpredictable force is rising in the battle over the future of the climate. It’s the type of powerful force that’s felt when 1.2 billion people clamor for more electricity—many of them trying to light, heat, and refrigerate their ways out of poverty; others throwing rupees at excessive air conditioning and other newfound luxuries. And it’s the type of unpredictable force that’s felt when the government of those 1.2 billion is in election mode, clamoring for votes by brazenly blocking progress at international climate talks.

Hundreds of millions of Indians live in poverty, wielding a tiny per-person carbon footprint when compared with residents of the West and coming out on top of environmental sustainability surveys. But the country is home to so many people that steady economic growth is turning it into a climate-changing powerhouse. It has developed a gluttonous appetite for coal, one of the most climate-changing fuels and the source of nearly two-thirds of the country’s power. India recently overtook Russia to become the world’s third-biggest greenhouse gas polluter, behind China and the United States. (If you count the European Union as a single carbon-belching bloc, then India comes in fourth).

India has been obstructing progress on international climate talks, culminating during the two weeks of U.N. Framework Convention on Climate Change negotiations that ended Saturday in Warsaw. The Warsaw talks were the latest annual get-together for nearly 200 countries trying to thrash out a new climate treaty to replace the Kyoto Protocol.

India’s erraticism at international climate talks is frustrating the West. But it is also starting to anger some developing nations struggling to cope with violent weather, droughts, and floods blamed on climate change.

India’s stance during climate talks is that developed countries should be legally committed to addressing global warming by reducing their greenhouse gas emissions, and that developing countries should do what they say they can do to help out.

But once-clear distinctions between developed and developing countries are blurring. A growing number of developing countries—including low-lying island states in the Pacific and some countries in Africa and Latin America with which India has long been allied—are eyeing the vast, growing, climate-changing pollution being pumped out by China and India. They are wondering why those two countries, and others in the “developing” camp, shouldn’t also be committed to reducing their emissions.

The Warsaw meetings ended with India and China thwarting efforts by the United States, Europe, and others to commit all countries to measures to address greenhouse gas pollution. Instead, countries agreed in Warsaw to announce their “intended contributions” to slow down global warming in 2015, in advance of final meetings planned in Paris to agree on the new climate treaty.

“Developing countries are a varied group at this stage, and there is a growing frustration about the inability to move forward from some of these countries,” said Jake Schmidt, international climate policy director for the Natural Resources Defense Council, who attended the Warsaw meetings. “Some of their anger is directed at the U.S. and Europe, but more and more of their anger is quietly being directed at friends in the developing world that they see as stalling progress.”

And no country has done more than India to stall progress on international climate negotiations during the past two months.

It began last month in Bangkok, when negotiators met to update the Montreal Protocol. Signed in the late 1980s, the protocol saved the ozone layer by ending the use of chlorofluorocarbons in refrigerants, household goods, and industrial products. The problem was, manufacturers often swapped out CFCs for a closely related group of chemicals called hydrofluorocarbons. HFCs don’t hurt the ozone layer, but it turns out that they are potent greenhouse gases. With climate change now the most important global environmental challenge, the United States and a long list of other countries have proposed amending the Montreal Protocol to phase out the use of HFCs.

All seemed to be going well with the plans for those amendments. India and the other members of the Group of 20 endorsed the proposal during September meetings in Russia. A couple of weeks later, Indian Prime Minister Manmohan Singh reiterated the country’s support for the amendments during meetings with President Obama.

But when international representatives gathered for meetings in Bangkok to actually make the amendments, they were surprised and angered to find the negotiations blocked by India. The country’s environment officials told Indian media that they were worried about the costs associated with switching over to new coolants. What may have worried them even more was the fear of being accused of opening the door for foreign air conditioning and fridge companies to take over domestic markets.

If there’s one thing that no Indian government up for re-election in the current political climate would want, it’s to be seen giving an inch to America on trade.

Then came Warsaw. Extensive negotiations around agriculture had been scheduled for the first of the two weeks of meetings. Farming causes about a fifth of greenhouse gas emissions, due in part to land clearing, energy use, and the methane that bubbles up from rice paddies and is belched out by cattle.

But that’s not what drew farming representatives to Warsaw. Farmers are the hardest hit by changes in the weather—which should help them secure a chunk of the hundreds of billions of dollars in climate aid that a new climate treaty is expected to deliver for poor countries. But India, which is home to farms that are struggling to cope with changing rainfall patterns, spearheaded a maneuver that blocked agricultural negotiations from moving forward. Its negotiators feared that negotiations over farmer adaptation efforts would lead to requests that those farmers also reduce their carbon footprints.

“India has been very clear that agriculture is the mainstay of our population, and we don’t want any mitigation targets there,” said Indrajit Bose, a climate change program manager at the influential Delhi-based Centre for Science and Environment, who attended the Warsaw meetings. “It’s a red line for India, and I think we agree with that.”

During the second week of Warsaw talks, India again blocked progress on HFC reductions, and it worked with China to water down the meeting’s most important agreement on the final day of talks.

Despite instances of Chinese obstructionism at Warsaw, China and the United States have been making headlines during the past week for their blossoming mutual commitment to tackling climate change. Now India appears to be supplanting China as the developing world’s chief climate agitator, even as it takes real steps to boost renewable energy production at home and meet voluntary goals to reduce the “emission intensity” of its economy. (Meanwhile, Japan, Australia, and Canada are taking America’s mantle as the developed world’s chief climate antagonists.)

The India problem isn’t limited to climate talks. Early this year India helped dilute an international agreement that had been crafted to reduce mercury pollution—a major problem with coal-fired power plants.

Before the country’s environment minister was replaced during a mid-2011 Cabinet reshuffle, India had been hailed as a constructive leader during international climate talks. Now it’s being accused of foot-dragging, obstructionism, and flip-flopping.

Recent Indian shenanigans on the global climate stage are partly a reflection of the fact that a federal election will be held in the spring. Such elections are held every five years, and frantic campaigning by long lists of parties occupies many of the months that precede them. In India, despite the country’s acute vulnerability to climate change, the climate is simply not an election issue. BBC polling suggests that 39 percent of Indians have never heard about “climate change.” Indian voters are calling for more affordable energy—not for a reduction in greenhouse gas emissions.

And India, like other developing countries, has been angered by what appears to be reluctance by developed countries to lend a meaningful financial hand as the climate goes awry. A cruel irony of climate change is that the poor countries that did the least to warm the planet are often the hardest hit, vulnerable to rising tides, crop-wilting droughts, and powerful storms. During the talks in Warsaw, Western countries were suddenly balking at previously promised climate aid that would have been worth $100 billion a year by 2020. And developed countries have fobbed off developing countries’ appeals for additional compensation, so-called loss-and-damage payments, when climate change has harmed their people and economies.

It’s not just the electioneering in India that’s causing problems for global climate talks. Another problem seems to be how little press attention the country receives on foreign shores. “There’s not a lot of focus on India anywhere,” said Manish Ram, a renewable-energy analyst for Greenpeace India who attended the Warsaw meetings. “That’s one of the reasons India gets away with doing what it’s been doing.”