Unease among Brazil’s farmers as Congress votes on GM terminator seeds (The Guardian)

Environmentalists warn approval could shatter global agreement not to use technology, with devastating repercussions

theguardian.com, Thursday 12 December 2013 16.34 GMT


Brazil’s national Congress is under pressure from landowning groups to green light GM ‘terminator’ seeds. Photograph: Ruy Barbosa Pinto/Getty Images/Flickr RF

Brazil is set to break a global moratorium on genetically modified “terminator” seeds, which are said to threaten the livelihoods of millions of small farmers around the world.

The sterile or “suicide” seeds are produced by means of genetic use restriction technology, which makes crops die off after one harvest without producing offspring. As a result, farmers have to buy new seeds for each planting, which reduces their self-sufficiency and makes them dependent on major seed and chemical companies.

Environmentalists fear that any such move by Brazil – one of the biggest agricultural producers on the planet – could produce a domino effect that would result in the worldwide adoption of the controversial technology.

Major seed and chemical companies, which together own more than 60% of the global seed market, all have patents on terminator seed technologies. However, in the 1990s they agreed not to employ the technique after a global outcry by small farmers, indigenous groups and civil society groups.

In 2000, 193 countries signed up to the UN Convention on Biological Diversity, which recommended a de facto moratorium on this technology.

The moratorium is under growing pressure in Brazil, where powerful landowning groups have been pushing Congress to allow the technology to be used for the controlled propagation of certain plants used for medicines and eucalyptus trees, which provide pulp for paper mills.

The landowning groups want to plant large areas with fast-growing GM trees and other non-food GM crops that could theoretically spread seeds over wide areas. The technology, they argue, would be a safeguard, ensuring that no second-generation pollution of GM traits takes place. They insist that terminator seeds would only be used for non-food crops.

Their efforts to force a bill to this effect through Congress, ongoing since 2007, have been slowed due to resistance from environmentalists.

The proposed measure has been approved by the legislature’s agricultural commission, rejected by the environmental commission, and now sits in the justice and citizenship commission. It is likely to go to a full Congressional vote, where it could be passed as early as next Tuesday, or soon after the Christmas recess.

Environment groups say there would be global consequences. “Brazil is the frontline. If the agro-industry breaks the moratorium here, they’ll break it everywhere,” said Maria José Guazzelli, of Centro Ecológico, which represents a coalition of Brazilian NGOs.

This week they presented a protest letter signed by 34,000 people to thwart the latest effort to move the proposed legislation forward. “If this bill goes through, it would be a disaster. Farmers would no longer be able to produce their own seeds. That’s the ultimate aim of the agro-industry,” she said.

The international technology watchdog ETC, which was among the earliest proponents of a ban on terminator technology in the 1990s, fears this is part of a strategy to crack the international consensus.

“If the bill is passed, [we expect] the Brazilian government to take a series of steps that will orchestrate the collapse of the 193-country consensus moratorium when the UN Convention on Biological Diversity meets for its biennial conference in Korea in October 2014,” said executive director Pat Mooney.

But Eduardo Sciarra, Social Democratic party leader in the Brazilian Congress, said the proposed measure did not threaten farmers because it was intended only to set controlled guidelines for the research and development of “bioreactor” plants for medicine.

“Gene use restriction technology has its benefits. This bill allows the use of this technology only where it is good for humanity,” he said.

The technology was developed by the US Department of Agriculture and the world’s largest seed and agrochemical firms. Syngenta, Bayer, BASF, Dow, Monsanto and DuPont together control more than 60% of the global commercial seed market and 76% of the agrochemical market. All are believed to hold patents on the technology, but none are thought to have developed the seeds for commercial use.

Massive protests in the 1990s by Indian, Latin American and south-east Asian peasant farmers, indigenous groups and their supporters put the companies on the back foot, and they were reluctantly forced to shelve the technology after the UN called for a de-facto moratorium in 2000.

Now, while denying that they intend to use terminator seeds, the companies argue that the urgent need to combat climate change makes it imperative to use the technology. In addition, they say that the technology could protect conventional and organic farmers by stopping GM plants spreading their genes to wild relatives – an increasing problem in the US, Argentina and other countries where GM crops are grown on a large scale.

A Monsanto spokesman in Brazil said the company was unaware of the developments and stood by a commitment made in 1999 not to pursue terminator technology. “I’m not aware of so-called terminator seeds having been developed by any organisation, and Monsanto stands firmly by our commitment and has no plans or research relating to this,” said Tom Helscher.

On its website, however, the company’s commitment only appears to relate to “food crops”, which does not encompass the tree and medicinal products under consideration in Brazil.

• Additional research by Anna Kaiser

Background to a controversy

Ever since GM companies were found to be patenting “gene-use restriction” or “terminator” technologies in the 1990s, they have been accused of threatening biodiversity and seeking to make farmers dependent on big industry for their livelihoods.

In many developing countries, where up to 80% of farmers each year choose their best plants and save their own seed, terminator technology is a byword for all genetic modification, raising fears that sterile GM strains could contaminate wild plants and regular crops – with devastating consequences.

The GM companies, which claimed in the 1990s that they wanted to introduce the seeds only to stop farmers stealing their products, were forced to shelve the technology in the face of massive protests in India, Latin America and south-east Asia.

In the face of growing international alarm, the 193 countries signed up to the UN Convention on Biological Diversity unanimously agreed in 2000 that there should be a de facto international moratorium. This was strengthened at the Conference of the Parties in 2006, under the presidency of Brazil.

Since then, the moratorium has held firm. But the GM companies have shifted their arguments, saying that gene-use restriction technologies now allow seeds to reproduce, but could “switch off” the GM traits. This, they argue, would reduce the possibility of the seeds spreading sterility. In addition, they say the technology could protect organic and conventional farmers from the spread of transgenes to wild relatives and weeds, which plagues GM farmers in the US and elsewhere.

The fear now is that the global moratorium could quickly unravel if Brazil, one of the most important agricultural countries in the world, overturns its national ban on terminator technology. Other countries, pressed strongly by the powerful GM lobby, would probably follow, leading inevitably to more protests.

Geoengineering Approaches to Reduce Climate Change Unlikely to Succeed (Science Daily)

Dec. 5, 2013 — Reducing the amount of sunlight reaching the planet’s surface by geoengineering may not undo climate change after all. Two German researchers used a simple energy balance analysis to explain how Earth’s water cycle responds differently to heating by sunlight than it does to warming due to a stronger atmospheric greenhouse effect. Further, they show that this difference implies that reflecting sunlight to reduce temperatures may have unwanted effects on Earth’s rainfall patterns.

Heavy rainfall events can be more common in a warmer world. (Credit: Annett Junginger, distributed via imaggeo.egu.eu)

The results are now published in Earth System Dynamics, an open access journal of the European Geosciences Union (EGU).

Global warming alters Earth’s water cycle since more water evaporates to the air as temperatures increase. Increased evaporation can dry out some regions while, at the same time, resulting in more rain falling in other areas due to the excess moisture in the atmosphere. The more water evaporates per degree of warming, the stronger the influence of increasing temperature on the water cycle. But the new study shows the water cycle does not react the same way to different types of warming.

Axel Kleidon and Maik Renner of the Max Planck Institute for Biogeochemistry in Jena, Germany, used a simple energy balance model to determine how sensitive the water cycle is to an increase in surface temperature due to a stronger greenhouse effect and to an increase in solar radiation. They predicted the response of the water cycle for the two cases and found that, in the former, evaporation increases by 2% per degree of warming while in the latter this number reaches 3%. This prediction confirmed results of much more complex climate models.

“These different responses to surface heating are easy to explain,” says Kleidon, who uses a pot on the kitchen stove as an analogy. “The temperature in the pot is increased by putting on a lid or by turning up the heat — but these two cases differ by how much energy flows through the pot,” he says. A stronger greenhouse effect puts a thicker ‘lid’ over Earth’s surface but, if there is no additional sunlight (if we don’t turn up the heat on the stove), extra evaporation takes place solely due to the increase in temperature. Turning up the heat by increasing solar radiation, on the other hand, enhances the energy flow through Earth’s surface because of the need to balance the greater energy input with stronger cooling fluxes from the surface. As a result, there is more evaporation and a stronger effect on the water cycle.

In the new Earth System Dynamics study the authors also show how these findings can have profound consequences for geoengineering. Many geoengineering approaches aim to reduce global warming by reducing the amount of sunlight reaching Earth’s surface (or, in the pot analogy, reduce the heat from the stove). But when Kleidon and Renner applied their results to such a geoengineering scenario, they found out that simultaneous changes in the water cycle and the atmosphere cannot be compensated for at the same time. Therefore, reflecting sunlight by geoengineering is unlikely to restore the planet’s original climate.
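To see why the two changes cannot be offset simultaneously, the sensitivities quoted above can be combined in a back-of-the-envelope calculation. The sketch below is only an illustration built on the article’s 2%-per-degree and 3%-per-degree figures, not the authors’ energy-balance model; the 1 °C of greenhouse warming is an arbitrary example value.

```python
# Back-of-the-envelope illustration (not the authors' model) of why cancelling
# greenhouse warming with reduced sunlight still changes the water cycle.
# The 2%/K and 3%/K evaporation sensitivities are the figures quoted above;
# the 1 K of greenhouse warming is an arbitrary example value.

GREENHOUSE_SENS = 0.02  # fractional change in evaporation per K of greenhouse warming
SOLAR_SENS = 0.03       # fractional change in evaporation per K of solar-driven warming

greenhouse_warming = 1.0             # K of warming from a stronger greenhouse effect
solar_cooling = -greenhouse_warming  # dimming chosen to cancel that warming exactly

evap_change = GREENHOUSE_SENS * greenhouse_warming + SOLAR_SENS * solar_cooling

print(f"Net temperature change: {greenhouse_warming + solar_cooling:.1f} K")
print(f"Net evaporation change: {evap_change * 100:+.1f}%")
# Output: temperature is restored (0.0 K), but evaporation falls by about 1%
# per degree of offset warming, i.e. the water cycle slows down.
```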

“It’s like putting a lid on the pot and turning down the heat at the same time,” explains Kleidon. “While in the kitchen you can reduce your energy bill by doing so, in the Earth system this slows down the water cycle with wide-ranging potential consequences,” he says.

Kleidon and Renner’s insight comes from looking at the processes that heat and cool Earth’s surface and how they change when the surface warms. Evaporation from the surface plays a key role, but the researchers also took into account how the evaporated water is transported into the atmosphere. They combined simple energy balance considerations with a physical assumption for the way water vapour is transported, and separated the contributions of surface heating from solar radiation and from increased greenhouse gases in the atmosphere to obtain the two sensitivities. One of the referees for the paper commented: “it is a stunning result that such a simple analysis yields the same results as the climate models.”

Journal Reference:

  1. A. Kleidon, M. Renner. A simple explanation for the sensitivity of the hydrologic cycle to global climate change. Earth System Dynamics Discussions, 2013; 4 (2): 853. DOI: 10.5194/esdd-4-853-2013

The Oracle of the T Cell (Science Daily)

Dec. 5, 2013 — A platform that simulates how the body defends itself: The T cells of the immune system decide whether to trigger an immune response against foreign substances.

The virtual T cell allows an online simulation of the response of this immune cell to external signals. (Credit: University of Freiburg)

Since December 2013, scientists from around the world have been able to use the “virtual T cell” to test for themselves what happens in the blood cell when receptor proteins are activated on the surface. Prof. Dr. Wolfgang Schamel from the Institute of Biology III, Faculty of Biology, the Cluster of Excellence BIOSS Centre for Biological Signalling Studies and the Center of Chronic Immunodeficiency of the University of Freiburg is coordinating the European Union-funded project SYBILLA, “Systems Biology of T-Cell Activation in Health and Disease.” This consortium of 17 partners from science and industry has been working since 2008 to understand the T cell as a system. Now the findings of the project are available to the public on an interactive platform. Simulating the signaling pathways in the cell enables researchers to develop new therapeutic approaches for cancer, autoimmune diseases, and infectious diseases.

The T cell is activated by vaccines, allergens, bacteria, or viruses. The T cell receptor identifies these foreign substances and sets off intracellular signaling cascades. This response is then modified by many further receptors. In the end, the network of signaling proteins results in cell division, growth, or the release of messengers that guide other cells of the immune system. The network initiates the attack on the foreign substances. Sometimes, however, the process of activation goes awry: The T cells mistakenly attack the body’s own cells, as in autoimmune diseases, or they ignore harmful cells like cancer cells.

The online platform developed by Dr. Utz-Uwe Haus and Prof. Dr. Robert Weismantel from the Department of Mathematics of ETH Zurich in collaboration with Dr. Jonathan Lindquist and Prof. Dr. Burkhart Schraven from the Institute of Molecular and Clinical Immunology of the University of Magdeburg and the Helmholtz Center for Infection Research in Braunschweig allows researchers to click through the signaling network of the T cells: Users can switch on twelve receptors, including the T cell receptor, identify the signals on the surface of other cells, or bind messengers.

The mathematical model then calculates the behavior of the network from the 403 elements in the system. The result is a combination of the activity of 52 proteins that predicts what will happen to the cell: they change the way the DNA is read and thus also what the cell produces. Researchers can now find weak points for active substances that could be used to treat immune diseases or cancer by switching particular signals in the model on and off. Every protein and every interaction between proteins is described in detail in the network, backed up with references to publications. In addition, users can even extend the model themselves to include further signaling proteins.
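The platform’s logic-based approach can be pictured with a toy Boolean network. The sketch below is a minimal stand-in, not the SYBILLA model itself: the node names (TCR, CD28, LAT, ERK, NFAT, IL2) and rules are illustrative assumptions, whereas the real network wires together 403 elements and reports the activity of 52 proteins.

```python
# Toy Boolean signalling network in the spirit of the virtual T cell:
# receptors are switched on or off and the logical rules are iterated
# until the network settles. Node names and rules are illustrative only;
# they are not taken from the SYBILLA model.

RULES = {
    "LAT":  lambda s: s["TCR"],                # LAT activates downstream of the TCR
    "ERK":  lambda s: s["LAT"] and s["CD28"],  # ERK needs LAT plus costimulation
    "NFAT": lambda s: s["LAT"],                # NFAT follows LAT activity
    "IL2":  lambda s: s["ERK"] and s["NFAT"],  # IL-2 output requires ERK and NFAT
}

def simulate(inputs, steps=10):
    """Iterate the Boolean rules synchronously from a given receptor state."""
    state = {node: False for node in RULES}
    state.update(inputs)  # receptor nodes act as fixed inputs
    for _ in range(steps):
        new = dict(state)
        for node, rule in RULES.items():
            new[node] = rule(state)
        if new == state:  # reached a fixed point
            break
        state = new
    return state

# Switch the T cell receptor and the CD28 costimulatory receptor on:
print(simulate({"TCR": True, "CD28": True}))   # IL2 ends up True
# Without costimulation the output stays off:
print(simulate({"TCR": True, "CD28": False}))  # IL2 stays False
```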

The India Problem (Slate)

Why is it thwarting every international climate agreement?

NOV. 27 2013 12:44 PM


Haze in Mumbai, 2009

India has stalled international greenhouse gas accords because climate change isn’t a winning election issue in the developing country. 

Photo by Arko Datta/Reuters

A powerful but unpredictable force is rising in the battle over the future of the climate. It’s the type of powerful force that’s felt when 1.2 billion people clamor for more electricity—many of them trying to light, heat, and refrigerate their ways out of poverty; others throwing rupees at excessive air conditioning and other newfound luxuries. And it’s the type of unpredictable force that’s felt when the government of those 1.2 billion is in election mode, clamoring for votes by brazenly blocking progress at international climate talks.

Hundreds of millions of Indians live in poverty, wielding a tiny per-person carbon footprint when compared with residents of the West and coming out on top of environmental sustainability surveys. But the country is home to so many people that steady economic growth is turning it into a climate-changing powerhouse. It has developed a gluttonous appetite for coal, one of the most climate-changing fuels and the source of nearly two-thirds of the country’s power. India recently overtook Russia to become the world’s third-biggest greenhouse gas polluter, behind China and the United States. (If you count the European Union as a single carbon-belching bloc, then India comes in fourth).

India has been obstructing progress on international climate talks, culminating during the two weeks of U.N. Framework Convention on Climate Change negotiations that ended Saturday in Warsaw. The Warsaw talks were the latest annual get-together for nearly 200 countries trying to thrash out a new climate treaty to replace the Kyoto Protocol.

India’s erraticism at international climate talks is frustrating the West. But it is also starting to anger some developing nations struggling to cope with violent weather, droughts, and floods blamed on climate change.

India’s stance during climate talks is that developed countries should be legally committed to addressing global warming by reducing their greenhouse gas emissions, and that developing countries should do what they say they can do to help out.

But once-clear distinctions between developed and developing countries are blurring. A growing number of developing countries—including low-lying island states in the Pacific and some countries in Africa and Latin America with which India has long been allied—are eyeing the vast, growing, climate-changing pollution being pumped out by China and India. They are wondering why those two countries, and others in the “developing” camp, shouldn’t also be committed to reducing their emissions.

The Warsaw meetings ended with India and China thwarting efforts by the United States, Europe, and others to commit all countries to measures to address greenhouse gas pollution. Instead, countries agreed in Warsaw to announce their “intended contributions” to slow down global warming in 2015, in advance of final meetings planned in Paris to agree on the new climate treaty.

“Developing countries are a varied group at this stage, and there is a growing frustration about the inability to move forward from some of these countries,” said Jake Schmidt, international climate policy director for the Natural Resources Defense Council, who attended the Warsaw meetings. “Some of their anger is directed at the U.S. and Europe, but more and more of their anger is quietly being directed at friends in the developing world that they see as stalling progress.”

And no country has done more than India to stall progress on international climate negotiations during the past two months.

It began last month in Bangkok, when negotiators met to update the Montreal Protocol. Signed in the late 1980s, the protocol saved the ozone layer by ending the use of chlorofluorocarbons in refrigerants, household goods, and industrial products. The problem was, manufacturers often swapped out CFCs for a closely related group of chemicals called hydrofluorocarbons. HFCs don’t hurt the ozone layer, but it turns out that they are potent greenhouse gases. With climate change now the most important global environmental challenge, the United States and a long list of other countries have proposed amending the Montreal Protocol to phase out the use of HFCs.

All seemed to be going well with the plans for those amendments. India and the other members of the Group of 20 endorsed the proposal during September meetings in Russia. A couple of weeks later, Indian Prime Minister Manmohan Singh reiterated the country’s support for the amendments during meetings with President Obama.

But when international representatives gathered for meetings in Bangkok to actually make the amendments, they were surprised and angered to find the negotiations blocked by India. The country’s environment officials told Indian media that they were worried about the costs associated with switching over to new coolants. What may have worried them even more was the fear of being accused of opening the door for foreign air conditioning and fridge companies to take over domestic markets.

If there’s one thing that no Indian government up for re-election in the current political climate would want, it’s to be seen giving an inch to America on trade.

Then came Warsaw. Extensive negotiations around agriculture had been scheduled for the first of the two weeks of meetings. Farming causes about a fifth of greenhouse gas emissions, due in part to land clearing, energy use, and the methane that bubbles up from rice paddies and is belched out by cattle.

But that’s not what drew farming representatives to Warsaw. Farmers are the hardest hit by changes in the weather—which should help them secure a chunk of the hundreds of billions of dollars in climate aid that a new climate treaty is expected to deliver for poor countries. But India, which is home to farms that are struggling to cope with changing rainfall patterns, spearheaded a maneuver that blocked agricultural negotiations from moving forward. Its negotiators feared that negotiations over farmer adaptation efforts would lead to requests that those farmers also reduce their carbon footprints.

“India has been very clear that agriculture is the mainstay of our population, and we don’t want any mitigation targets there,” said Indrajit Bose, a climate change program manager at the influential Delhi-based Centre for Science and Environment, who attended the Warsaw meetings. “It’s a red line for India, and I think we agree with that.”

During the second week of Warsaw talks, India again blocked progress on HFC reductions, and it worked with China to water down the meeting’s most important agreement on the final day of talks.

Despite instances of Chinese obstructionism at Warsaw, China and the United States have been making headlines during the past week for their blossoming mutual commitment to tackling climate change. Now India appears to be supplanting China as the developing world’s chief climate agitator, even as it takes real steps to boost renewable energy production at home and meet voluntary goals to reduce the “emission intensity” of its economy. (Meanwhile, Japan, Australia, and Canada are taking America’s mantle as the developed world’s chief climate antagonists.)

The India problem isn’t limited to climate talks. Early this year India helped dilute an international agreement that had been crafted to reduce mercury pollution—a major problem with coal-fired power plants.

Before the country’s environment minister was replaced during a mid-2011 Cabinet reshuffle, India had been hailed as a constructive leader during international climate talks. Now it’s being accused of foot-dragging, obstructionism, and flip-flopping.

Recent Indian shenanigans on the global climate stage are partly a reflection of the fact that a federal election will be held in the spring. Such elections are held every five years, and frantic campaigning by long lists of parties occupies many of the months that precede them. In India, despite the country’s acute vulnerability to climate change, the climate is simply not an election issue. BBC polling suggests that 39 percent of Indians have never heard about “climate change.” Indian voters are calling for more affordable energy—not for a reduction in greenhouse gas emissions.

And India, like other developing countries, has been angered by what appears to be reluctance by developed countries to lend a meaningful financial hand as the climate goes awry. A cruel irony of climate change is that the poor countries that did the least to warm the planet are often the hardest hit, vulnerable to rising tides, crop-wilting droughts, and powerful storms. During the talks in Warsaw, Western countries were suddenly balking at previously promised climate aid that would have been worth $100 billion a year by 2020. And developed countries have fobbed off developing countries’ appeals for additional compensation, so-called loss-and-damage payments, when climate change has harmed their people and economies.

It’s not just the electioneering in India that’s causing problems for global climate talks. Another problem seems to be how little press attention the country receives on foreign shores. “There’s not a lot of focus on India anywhere,” said Manish Ram, a renewable-energy analyst for Greenpeace India who attended the Warsaw meetings. “That’s one of the reasons India gets away with doing what it’s been doing.”

No Qualms About Quantum Theory (Science Daily)

Nov. 26, 2013 — A colloquium paper published in The European Physical Journal D looks into the alleged issues associated with quantum theory. Berthold-Georg Englert from the National University of Singapore reviews a selection of the potential problems of the theory. In particular, he discusses cases when mathematical tools are confused with the actual observed sub-atomic scale phenomena they are describing. Such tools are essential for providing an interpretation of the observations, but cannot be confused with the actual object of study.

The author sets out to demystify a selected set of objections targeted against quantum theory in the literature. He takes the example of Schrödinger’s infamous cat, whose vital state serves as the indicator of the occurrence of radioactive decay, whereby the decay triggers a hammer mechanism designed to release a lethal substance. The term ‘Schrödinger’s cat state’ is routinely applied to a superposition of so-called quantum states of a particle. However, this imagined superposition of a dead and a live cat has no reality. Indeed, it confuses a physical object with its description. Something as abstract as the wave function − which is a mathematical tool describing the quantum state − cannot be considered a material entity embodied by a cat, regardless of whether it is dead or alive.

The paper also debunks other myths, providing arguments that quantum theory is well defined, has a clear interpretation, is a local theory, is not reversible, and does not feature any instantaneous action at a distance. It also demonstrates that there is no measurement problem, despite the fact that measurement is commonly known to disturb the system being measured. Hence, since the establishment of quantum theory in the 1920s, its concepts have become clearer, but its foundations remain unchanged.

Journal Reference:

  1. Berthold-Georg Englert. On quantum theory. The European Physical Journal D, 2013; 67 (11). DOI: 10.1140/epjd/e2013-40486-5

Engineering Education May Diminish Concern for Public Welfare Issues (Science Daily)

Nov. 20, 2013 — Collegiate engineering education may foster a “culture of disengagement” regarding issues of public welfare, according to new research by a sociologist at Rice University.

For the first-of-its-kind study, the researcher used survey data from four U.S. colleges to examine how students’ public-welfare beliefs change during their college engineering education and whether the curricular emphases of their engineering programs are related to students’ beliefs about public welfare. The study found that engineering students leave college less concerned about public welfare than when they entered.

Study author Erin Cech, an assistant professor of sociology who has B.S. degrees in both electrical engineering and sociology, said that many people inside and outside engineering have emphasized the importance of training ethical, socially conscious engineers, but she wonders if engineering education in the U.S. actually encourages young engineers to take seriously their professional responsibility to public welfare.

“There’s an overarching assumption that professional engineering education results in individuals who have a deeper understanding of the public welfare concerns of their profession,” Cech said. “My study found that this is not necessarily the case for the engineering students in my sample.”

Cech said that as part of their education, engineering students learn the profession’s code of ethics, which includes taking seriously the safety, health and welfare of the public. However, she said, it appears that there is something about engineering education that results in students becoming more cynical and less concerned with public policy and social engagement issues.

“The way many people think about the engineering profession as separate from social, political and emotional realms is not an accurate assessment,” Cech said. “People have emotional and social reactions to engineered products all the time, and those products shape people’s lives in deep ways; so it stands to reason that it is important for engineers to be conscious of broader ethical and social issues related to technology.”

Cech said that this “culture of disengagement” is rooted in how engineering education frames engineering problem-solving.

“Issues that are nontechnical in nature are often perceived as irrelevant to the problem-solving process,” Cech said. “There seems to be very little time or space in engineering curricula for nontechnical conversations about how particular designs may reproduce inequality — for example, debating whether to make a computer faster, more technologically savvy and expensive versus making it less sophisticated and more accessible for customers.”

Cech said ignoring these issues does a disservice to students because practicing engineers are required to address social welfare concerns on a regular basis, even if it involves a conflict of interest or whistleblowing.

“If students are not prepared to think through these issues of public welfare, then we might say they are not fully prepared to enter the engineering practice,” Cech said.

Cech became interested in this research topic as an undergraduate electrical engineering student.

“Because I went through engineering education myself, I care deeply about this topic,” she said. “I want to advance the conversation about how engineering education can be the best it can possibly be.”

The study included more than 300 students who entered engineering programs as freshmen in 2003 at four U.S. universities in the Northeast. Rice students were not included in the study. Participants were surveyed in the spring of each year and at 18 months after graduation. In the surveys, students were asked to rate the importance of professional and ethical responsibilities and their individual views on the importance of improving society, being active in their community, promoting racial understanding and helping others in need. In addition, the students were asked how important the following factors are to their engineering programs: ethical and/or social issues, policy implications of engineering, and broad education in humanities and social sciences.

“Culture of Disengagement in Engineering Education?” will appear in an upcoming issue of the journal Science, Technology and Human Values. The research was funded by the National Science Foundation.

Climate change pledges: rich nations face fury over moves to renege (The Guardian)

Typhoon Haiyan raises fear over global warming threat as Philippines leads attack on eve of key talks


The Observer, Sunday 17 November 2013


Survivors of Typhoon Haiyan form a queue to receive relief goods at a devastated coastal area in Leyte. Photograph: Dondi Tawatao/Getty Images

Developing nations have launched an impassioned attack on the failure of the world’s richest countries to live up to their climate change pledges in the wake of the disaster in the Philippines.

With more than 3,600 people now believed to have been killed by Typhoon Haiyan, moves by several major economies to backtrack on commitments over carbon emissions have put the world’s poorest and wealthiest states on a collision course, on the eve of crucial high-level talks at a summit of world powers.

Yeb Sano, the Philippines’ lead negotiator at the UN climate change summit being held this weekend in Warsaw, spoke of a major breakdown in relations overshadowing the crucial talks, which are due to pave the way for a 2015 deal to bring down global emissions.

The diplomat, on the sixth day of a hunger strike in solidarity with those affected by Haiyan, including his own family, told the Observer: “We are very concerned. Public announcements from some countries about lowering targets are not conducive to building trust. We must acknowledge the new climate reality and put forward a new system to help us manage the risks and deal with the losses to which we cannot adjust.”

Munjurul Hannan Khan, representing the world’s 47 least affluent countries, said: “They are behaving irrationally and unacceptably. The way they are talking to the most vulnerable countries is not acceptable. Today the poor are suffering from climate change. But tomorrow the rich countries will be. It starts with us but it goes to them.”

Recent decisions by the governments of Australia, Japan and Canada to downgrade their efforts over climate change have caused panic among those states most affected by global warming, who fear others will follow as they rearrange their priorities during the downturn.

In the last few days, Japan has announced it will backtrack on its pledge, cutting its 2020 emissions-reduction target from 25% to 3.8%, on the basis that it had to close its nuclear reactors after the 2011 earthquake and tsunami.

Australia, which is not sending a minister to this weekend’s talks, signalled it may weaken its targets and is repealing domestic carbon laws following the election of a conservative government.

Canada has pulled out of the Kyoto accord, which committed major industrial economies to reducing their annual CO2 emissions to below 1990 levels.

China’s lead negotiator at the Warsaw talks, Su Wei, said: “I do not have any words to describe my dismay at Japan’s decision.” He criticised Europe for showing a lack of ambition to cut emissions further, adding: “They talk about ratcheting up ambition, but rather they would have to ratchet up to ambition from zero ambition.”

When the highest-level talks start at the summit on Monday, due to be attended by representatives from 195 countries, including energy secretary Ed Davey, the developing world will seek confirmation from states such as Britain that they will not follow the path of Japan and others. David Cameron’s comments this weekend in which he backed carbon emission cuts and suggested that there was growing evidence of a link between manmade climate change and disasters such as Typhoon Haiyan, will inevitably be used to pressure others to offer similar assurances.

The developing world also wants the rich western nations to commit to establishing a compensation scheme for future extreme weather events, as the impact of global warming is increasingly felt. And they want firm signals that rich countries intend to find at least $100bn a year by 2020 to help them to adapt their countries to severe climate extremes.

China and 132 nations that are part of the G77 bloc of developing countries have expressed dismay that rich countries had refused to discuss a proposal for scientists to calculate emissions since the start of the Industrial Revolution.

Ambassador Jose Antonio Marcondes de Carvalho of Brazil, who initially proposed the talks, said: “We were shocked, very much surprised by their rejection and dismissal. It is puzzling. We need to understand why they have rejected it.

“Developing countries are doing vastly more to reduce their emissions than Annexe 1 [rich] countries.”

Members of the Disaster Emergencies Committee, which co-ordinates British aid efforts, also warned leaders that the disaster offers a glimpse of the future if urgent action is not taken.

Aid agencies including Christian Aid, Cafod, Care International, Oxfam and Tearfund said ministers meeting in the Polish capital must act urgently because climate change is likely to make such extreme weather events more common in the future, putting millions more lives at risk.

A Climate-Change Victory (Slate)

If global warming is slowing, thank the Montreal Protocol.



No CFCs, please. (Photo by iStock)

Climate deniers like to point to the so-called global warming “hiatus” as evidence that humans aren’t changing the climate. But according to a new study, exactly the opposite is true: The recent slowdown in global temperature increases is partially the result of one of the few successful international crackdowns on greenhouse gases.

Back in 1988, more than 40 countries, including the United States, signed the Montreal Protocol, an agreement to phase out the use of ozone-depleting gases like chlorofluorocarbons. (Today the protocol has nearly 200 signatories.) According to the Environmental Protection Agency, CFC emissions are down 90 percent since the protocol, a drop that the agency calls “one of the largest reductions to date in global greenhouse gas emissions.” That’s a blessing for the ozone layer, but also for the climate. CFCs are a potent heat-trapping gas, and a new analysis published in Nature Geoscience finds that slashing them has been a major driver of the much-discussed slowdown in global warming.

Without the protocol, environmental economist Francisco Estrada of the Universidad Nacional Autónoma de México reports, global temperatures today would be about a tenth of a degree Celsius higher than they are. That’s roughly an eighth of the total warming documented since 1880.

Estrada and his co-authors compared global temperature and greenhouse gas emissions records over the last century and found that breaks in the steady upward march of both coincided closely. At times when emissions leveled off or dropped, such as during the Great Depression, the trend was mirrored in temperatures; likewise for when emissions climbed.

“With these breaks, what’s interesting is that when they’re common that’s pretty indicative of causation,” said Pierre Perron, a Boston University economist who developed the custom-built statistical tests used in the study.
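The kind of break analysis described here can be pictured with a generic single-break trend fit, sketched below on synthetic data. This is not Perron’s custom test, only an illustration of the underlying idea: for each candidate break year, fit separate linear trends before and after it and keep the year that minimises the residual error.

```python
# Generic illustration of locating a single break in a trending series:
# fit separate linear trends before and after each candidate break year
# and keep the year with the smallest total squared residual. This is a
# simplified stand-in for the structural-break tests used in the study.
import numpy as np

def best_break(years, values, min_segment=10):
    """Return (break_year, sse) for the best single-break piecewise-linear fit."""
    best = (None, np.inf)
    for i in range(min_segment, len(years) - min_segment):
        sse = 0.0
        for seg_y, seg_v in ((years[:i], values[:i]), (years[i:], values[i:])):
            coeffs = np.polyfit(seg_y, seg_v, 1)           # linear trend for the segment
            resid = seg_v - np.polyval(coeffs, seg_y)      # deviations from that trend
            sse += float(np.sum(resid ** 2))
        if sse < best[1]:
            best = (years[i], sse)
    return best

# Synthetic example: a series whose upward trend steepens after 1960.
years = np.arange(1900, 2011)
trend = np.where(years < 1960, 0.003 * (years - 1900), 0.18 + 0.012 * (years - 1960))
values = trend + np.random.default_rng(0).normal(0, 0.02, len(years))

print(best_break(years, values))  # the detected break year should come out near 1960
```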

The findings put a new spin on investigation into the cause of the recent “hiatus.” Scientists have suggested that several temporary natural phenomena, including the deep ocean sucking up more heat, are responsible for this slowdown. Estrada says his findings show that a recent reduction in heat-trapping CFCs as a result of the Montreal Protocol has also played an important role.

“Paradoxically, the recent decrease in warming, presented by global warming skeptics as proof that humankind cannot affect the climate system, is shown to have a direct human origin,” Estrada writes in the study.

The chart below, from a column accompanying the study, illustrates that impact. The solid blue line shows the amount of warming relative to pre-industrial levels attributed to CFCs and other gases regulated by the Montreal Protocol; the dashed blue line is an extrapolation of what the level would be without the agreement. Green represents warming from methane; Estrada suggests its leveling off may be the result of improved farming practices in Asia. The diamonds are annual global temperature averages, with the red line fitted to them. The dashed red line represents Estrada’s projection of where global temperature would be without these recent mitigation efforts.


Courtesy of Francisco Estrada via Mother Jones

Estrada said his study doesn’t undermine the commonly accepted view among climate scientists that the global warming effect of greenhouse gases can take years or decades to fully manifest. Even if we cut off all emissions today, we’d still very likely see warming into the future, thanks to the long shelf life of carbon dioxide, the principal climate-change culprit. The study doesn’t let CO2 off the hook: The reduction in warming would likely have been even greater if CO2 had leveled off as much as CFCs and methane. Instead, Estrada said, it has increased 20 percent since the protocol was signed.

Still, the study makes clear that efforts to reduce greenhouse gas emissions—like a recent international plan to phase out hydrofluorocarbons, a group of cousin chemicals to CFCs that are used in air conditioners and refrigerators, and the Obama administration’s move this year to impose strict new limits on emissions from power plants—can have a big payoff.

“The Montreal Protocol was really successful,” Estrada said. And as policymakers and climate scientists gather in Warsaw, Poland, for the latest U.N. climate summit next week, “this shows that international agreements can really work.”

The internet and the “democratic orgasm” (Outras Palavras)

6 November 2013 – 10:05

by Marcos Nunes Carreiro, for Outras Palavras

Emerging networked participation will not produce new unitary ideologies or revolutions, but it may destroy the old game of representative governance.

Much is said about how social networks have been changing social thinking and expanding the capacity for reflection, especially among young people, given the internet’s central role in the demonstrations and protests that have swept Brazil in recent months. The demonstrations have already become a topic in schools and will certainly be known to future generations. But what, after all, is the political and social role of social networks and the internet?

Some say that Brazil’s current moment is one of democratic orgasm, with thousands of people taking to the streets over the country’s political and economic situation. And it is genuinely stirring to follow society’s effervescence, even for those with no inclination to take part. There is disagreement, however, over the term “democratic orgasm”. Magno Medeiros, a professor at the Faculty of Communication of the Federal University of Goiás (UFG), says, for example, that an orgasm is a fleeting phenomenon of immediate satisfaction, unlike what Brazil is living through today.

In his view, what is actually happening is the eruption of a chronic pain, built up over several decades around dissatisfaction with citizenship rights. “Basic rights, such as decent urban transport, the right to be treated well in the public health system, a quality education democratically accessible to all. In recent years Brazil has made considerable progress in reducing social inequality and shrinking pockets of poverty, but the poor and destitute social sectors that have risen into class C want more than just to consume basic goods such as refrigerators, stoves, computers and mobile phones. They want to be treated with dignity,” he says.

Social ideology

The author of the expression in this article’s headline is the Italian Massimo Di Felice, who holds a doctorate in Communication Sciences from the University of São Paulo (USP) and a PhD in sociology from Université Paris Descartes V, Sorbonne. Di Felice is a professor at USP’s School of Communications and Arts, where he founded the Atopos Research Centre and coordinates the research projects “Digital networks and sustainability” and “Net-activism: collaborative actions in digital networks”.

The term “democratic orgasm” came up when the professor was asked how, in the past, it was political ideologies that brought thousands of people together, whereas today this is no longer the case. Could we then say that we are living through a process of democratic creation of social ideology? According to Di Felice, modern European Western political reason, positivist and carrying a unitary conception of history, created representative national democracies, which were articulated through the management of conflict by political parties and trade unions. And the communicative structure of those institutions, corresponding to the communication flows of analogue media – press, TV and newspapers – is centralised and vertical, as well as Manichaean, that is, it divides and organises the world into good guys and villains, right and left, revolutionaries and reactionaries, and so on.

Digital networks, however, have created other, decentralised types of communication flow, which allow everyone access to information and participation in the construction of meaning. “Modern political reason is phallic and Christian; it seeks to dominate the world, labels thoughts while simplifying them, needs enemies and promises salvation. The logic of the virtual, by contrast, is plural, feeds on the present and has no ideology, living in the impulsive act of the present,” he observes.

He says it is natural for society to want to identify and judge the movements, labelling them, for example, “fascist”, because, in his view, ordering reason hates the new and whatever it does not understand. “But judging the diverse non-movements born on the networks (spontaneous and non-unitary) is like judging emotion and orgiastic connectivity (‘orghia’ in Greek means ‘to feel with’). Brazil’s democracy is moving from its televised, electoral and representative public dimension to a digital-connective dimension. The country is experiencing a democratic orgasm. The logic is, as Michel Maffesoli would say, Dionysian rather than ideological.”

According to Di Felice, from a socio-political point of view, digital information architectures and social networks are bringing about, throughout the world, qualitative changes that can be grouped into ten points: 1. The technical possibility of everyone having access to all information; 2. Collective networked debate on questions of public interest; 3. The end of the monopoly over the control and management of information held by the economic and political monopolies of media companies; 4. The end of central points of view and of modern political ideologies (whether of the left or the right) that claimed to control and manage social conflict; 5. The end of political parties and of the mass representative culture that ordered and controlled citizens’ participation, limiting it to a vote every four years.

From the sixth point onward, the professor describes what concerns systemic evolution: 6. The advent of a connective social logic expressed in the capacity of digital social networks to bring together, in real time, a large number of diverse and heterogeneous sectors of the population around themes of common interest; 7. The shift from a kind of political imaginary based on identity-based, dialectical representation (left-right, progressive-reactionary, etc.) to an experiential, connective and techno-collaborative logic, articulated no longer through ideologies but through the experience linking individuals, information and territories; 8. The advent of a new kind of public administration and democracy; 9. The transformation of the relationship between politician and citizen and of the role of elected officials, who come to be seen no longer as representatives of absolute power but as spokespeople and mere executors of a popular will that watches over their every decision; 10. The shift from a political imaginary based on a public sphere in which citizens’ participation was merely a matter of opinion to forms of collective deliberation and collaborative decision-making practices that are articulated autonomously on the networks. Read the interview:


Massimo Di Felice

The protests are organised on the networks, but we can see leaders emerging on the streets. How do you view that?

The movements are born on the networks and act in the streets, but not in ordinary streets. They act in “connected streets”, reproducing the events of the demonstrations on the networks in real time. Through mobile computing they debate and seek solutions continuously, expressing an original form of techno-human relationship and inaugurating the advent of a meta-geographical and atopic dimension (from the Greek a-topos: an indescribable place, a strange place, out of the ordinary). Although the Spanish sociologist Manuel Castells argues that contemporary social movements are born on the networks and only later, in the streets, gain greater visibility, that does not seem to me to be the most fitting description. On the contrary: what is happening in the streets, in various countries around the world, is the advent of an immersive, informational dimension of conflict, expressed in a plural, connective and informational spatiality. Demonstrators inhabit extended spaces; they decide their strategies and their movements in the streets through continuous interaction on social networks and the instantaneous exchange of information. Not only do they move around connected, but the demonstration only really exists, only really happens, if it is posted on the network, becoming digital again, that is, information. It is no longer possible to think of physical spaces versus informational spaces. Conflicts are informational: games of exchange between bodies and information circuits, experiments in the emergence of an informatised flesh that experiences its multiple dimensions, the digital-informational one and the bleeding material one, beaten and bruised. Both are real and neither is separate from the other; each acquires its “veracity” in its interplay with the other.

Throughout those days in June, in São Paulo and in many other capitals, we played collective games – we were all connected to circuits of information, spaces and short circuits that altered our movements according to the images and interactions of the other players. We all experienced our plural, interactive condition of inhabiting. The blood of the demonstrators beaten by the police did not fall only on the streets; it spilled into informational spatialities. The police, through mobile computing and instantaneous connections, became media, accomplices in an informational act, and the demonstrators experienced the pleasure of turning their bodies into information. Turning the police into media was one of the great contributions of these movements, which have no leaders and no single direction. Every opportunistic attempt to steer and organise these sets of movements will be unmasked. We are talking about connected civil society, not about this or that social movement. The actors in these movements, therefore, are not only humans, much less a handful of leaders. We are not talking about traditional movements that took place in urban and industrial spaces. We are, in fact, already in another world.

Outside the networks, many people still do not understand what the demonstrations mean or how they came about. Is there a better understanding of the subject in the virtual environment?

The demonstrations in Brazil are expressions of a qualitative transformation that, since the advent of the internet, has been changing the form of participation and the meaning of social action. The Atopos Research Centre at the University of São Paulo is finishing an international study on the subject, with the support of Fapesp (the São Paulo Research Foundation).

The study analysed the main forms of net-activism in four countries (Brazil, France, Italy and Portugal). The results are interesting and clearly show common elements which, even in different contexts, recur and appear as similar characteristics. This underlines, once again, the importance of connectivity networks and the techno-informational character of these expressions of conflict, present in the origin, organisation and modes of action of these movements. In short, the main characteristics common to all of them are the following: 1. Net-activism stands outside the modern political tradition, because it expresses a new kind of conflict whose aim is not the contest for power. None of the movements marking the various forms of contemporary conflict (the Zapatistas, the Indignados, Occupy Wall Street, Anonymous, M15, etc.) aim to become political parties and run in elections. They are all explicitly non-partisan and opposed to the political class. They all come together against the corruption, abuses and incompetence of those same political classes and their representatives; 2. They are movements and actions that are not organised in the traditional way, that is, they are not homogeneous groups of people who identify with the same ideology or the same political project. On the contrary, they are forms of protest made up of diverse actors, in which, as in a reticular architecture, oppositions are not dialectical and do not make action unviable; 3. They have an informal organisational form and, above all, no leaders and no hierarchies; 4. Anonymity is a value, not only because it offers protection against repression, but because it is the means by which the non-identity, collective or individual, of their members and actions is defended. In the tradition of net-activist actions, the absence of identity and of visibility is the means by which conflict avoids becoming institutionalised, thereby remaining unrecognisable, unidentifiable and able to preserve its own conflictual effectiveness; 5. They are temporary movements or actions, and therefore not lasting ones, whose ultimate purpose and ambition is their own disappearance.

These and other elements found in all net-activist actions are already part of a tradition with texts and reflections ranging from cyberpunk to the contributions of Hakim Bey, the media guerrilla tactics of Luther Blissett and Zapatista informational conflict. Anonymous, the Indignados and the various forms of contemporary digital conflict are, in their own specific ways, the continuation of this. There is no uniformity, no belonging of any kind, but there is inspiration.

Is the informational question technology’s great feat?

In public opinion theory we are witnessing a major shift from the opinion leader to the cognitive entrepreneur. The opinion leader gained persuasive power through media power, which gave him the privileged ability, via TV or the pages of a newspaper, to reach a large part of a country’s population. That figure, usually a commentator, a political scientist, a communications professional, a politician or a public personality, is now being replaced, within the new dynamics of information flows, by another kind of informant and mediator: someone who, having witnessed an event or been its very protagonist, distributes it directly through digital media, without intermediaries.

This is the case of the demonstrators who posted everything that happened in the streets during the protests. No commentator or opinion leader could compete with them or dispute another version of events. The demonstrators themselves covered the event with their mobile phones and cheap cameras, live, from the very place where things were happening. Most of the information in circulation was produced by them. That was possible because a technology exists that makes it possible. It is also a political fact that shatters decades of sociological studies on the relationship between media and politics, between media and power. The great transformation produced by digital networks is interactivity. Connected people seek out their own information, organise it, and gather more sources and elements with which to evaluate it. We can say that, as a tendency, the population is more aware, since it has direct access to an infinite quantity of information on any subject, with people themselves becoming editors and creators of content. In the same way, through the same informational dynamics, they become politicians, administrators and transformers of their cities and localities.

O senhor é europeu, mas vive há muitos anos na América Latina. Como difere o processo de expressão massiva entre os dois continentes?

Absolutamente não se distingue. Os movimentos possuem todos eles as mesmas características. Em cada país temos situações específicas e atores diferentes, mas que atuam de maneira análoga: através das redes digitais. Possuem a mesma específica forma de organização coletiva: não institucionalizada e sem hierarquia. Expressam as mesmas reivindicações: contra a corrupção dos partidos políticos, por maior transparência e eficiência, melhor qualidade dos serviços públicos. Desconfiam todos de seus representantes e querem decidir diretamente sobre os assuntos que lhes interessam.

What are the consequences of this position taken by the demonstrations?

The network is the "Overman" of the German philosopher Friedrich Nietzsche. Within it, it is not easy to build collective ethics, or even majority ones, because its dynamism is emergent and its form temporary. Networked participation will not produce new unitary ideologies, much less revolutions, because its reason is not abstract and universal but particular and connective, mutating and incoherent. What it can do is destroy the old vampiric game of representative, party-based governance, which is no longer representative and which generates a system based on corruption, in which corruption is not the exception but the rule and norm of the game.

The political ideologies that promised equality and the salvation of the world failed, not only in their egalitarian socio-economic aims but in the more important one: producing a new social and cultural imaginary that would make us part of a more just society, in which we could become better than we are. The collective non-ethics of the networks will not be a decalogue of norms or a worldview organized and pronounced by the mouths of vanguards or enlightened leaders, always ready to surf a new wave; it will be much more humbly particular. It will not change the world, but through connectivity it will solve concrete, specific problems, problems that have to do with the quality of the air, the right to information, the price of public transport, the quality of care in hospitals, the quality of education. That is, everything that no party has ever managed to do.

For a certain left, a rising fascism is at work in the demonstrations, whose symptom is the rejection of parties in the marches. A wing of the right, with the support of the press, also dismisses the demonstrations as a "set-up" by the left.

The opportunism and despair of a modern political culture that suddenly found itself obsolete and outside of history is visible to everyone. No left-wing party today manages to represent the hopes and utopias of even a significant part of the population. They find themselves in the singular and comic situation of the boy scout who, to perform his good deed, tries to convince the old lady to cross the street so that he can help her; only the old lady does not want to cross the street, she wants to go in another direction. The dialectical logic, Eurocentric and Christian, based on the opposition between good and evil, marks the entire political culture of the left, which today takes the form of a secular religion, no longer rational or propositional, but hysterical.

The advent of the movements and demonstrations clearly expressed the disappearance of the vanguard role, and the parties' historical incapacity for analysis and for openness to diversity and free debate. As in the logic of religious salvation, the good and the just exist and justify their function only as long as evil exists. The witch hunt is a requirement, the last attempt to justify their function and a way of still making their presence necessary in defence of the "weak" and the "needy". I do not rule out that, in unrepresentative cases, a few small and isolated groups of right-wing individuals were present. But the reaction and the witch hunt this generated is hysterical and a-rational in nature, a last attempt to go back in time and in history, to a threatening past in which there was a need for an order, an ideology and a vanguard to play the comforting role of the father figure.

* Massimo Di Felice will take part this week in the I Congresso Internacional de Net-Ativismo at USP, alongside other renowned researchers: Pierre Lévy, Michel Maffesoli, José Bragança de Miranda and Alberto Abruzzese.

** Originally published on the Outras Palavras website.

Geoengineering the Climate Could Reduce Vital Rains (Science Daily)

Oct. 31, 2013 — Although a significant build-up in greenhouse gases in the atmosphere would alter worldwide precipitation patterns, a widely discussed technological approach to reduce future global warming would also interfere with rainfall and snowfall, new research shows.

Rice field in Bali. (Credit: © pcruciatti / Fotolia)

The international study, led by scientists at the National Center for Atmospheric Research (NCAR), finds that global warming caused by a massive increase in greenhouse gases would spur a nearly 7 percent average increase in precipitation compared to preindustrial conditions.

But trying to resolve the problem through “geoengineering” could result in monsoonal rains in North America, East Asia, and other regions dropping by 5-7 percent compared to preindustrial conditions. Globally, average precipitation could decrease by about 4.5 percent.

“Geoengineering the planet doesn’t cure the problem,” says NCAR scientist Simone Tilmes, lead author of the new study. “Even if one of these techniques could keep global temperatures approximately balanced, precipitation would not return to preindustrial conditions.”

As concerns have mounted about climate change, scientists have studied geoengineering approaches to reduce future warming. Some of these would capture carbon dioxide before it enters the atmosphere. Others would attempt to essentially shade the atmosphere by injecting sulfate particles into the stratosphere or launching mirrors into orbit with the goal of reducing global surface temperatures.

The new study focuses on the second set of approaches, those that would shade the planet. The authors warn, however, that Earth’s climate would not return to its preindustrial state even if the warming itself were successfully mitigated.

“It’s very much a pick-your-poison type of problem,” says NCAR scientist John Fasullo, a co-author. “If you don’t like warming, you can reduce the amount of sunlight reaching the surface and cool the climate. But if you do that, large reductions in rainfall are unavoidable. There’s no win-win option here.”

The study appears in an online issue of the Journal of Geophysical Research: Atmospheres, published this week by the American Geophysical Union. An international team of scientists from NCAR and 14 other organizations wrote the study, which was funded in part by the National Science Foundation (NSF), NCAR’s sponsor. The team used, among other tools, the NCAR-based Community Earth System Model, which is funded by NSF and the Department of Energy.

Future carbon dioxide, with or without geoengineering

The research team turned to 12 of the world’s leading climate models to simulate global precipitation patterns if the atmospheric level of carbon dioxide, a leading greenhouse gas, reached four times the level of the preindustrial era. They then simulated the effect of reduced incoming solar radiation on the global precipitation patterns.

The scientists chose the artificial scenario of a quadrupling of carbon dioxide levels, which is on the high side of projections for the end of this century, in order to clearly draw out the potential impacts of geoengineering.
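The percentages reported above come down to comparing each model's precipitation against its own preindustrial baseline and then averaging across the ensemble. Below is a minimal sketch of that bookkeeping, assuming hypothetical NumPy arrays of global-mean precipitation for each model and scenario; these are made-up numbers, not the GeoMIP output or code used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical global-mean precipitation (mm/day) for 12 models and three scenarios.
preindustrial = np.array([2.9, 3.0, 3.1, 2.8, 3.0, 2.95, 3.05, 2.9, 3.0, 3.1, 2.85, 3.0])
quadrupled_co2 = preindustrial * (1 + rng.normal(0.07, 0.01, 12))    # roughly 7% wetter
geoengineered = preindustrial * (1 - rng.normal(0.045, 0.01, 12))    # roughly 4.5% drier

def ensemble_change(scenario, baseline):
    """Percent change of each model relative to its own preindustrial run,
    averaged across the ensemble."""
    return 100 * np.mean((scenario - baseline) / baseline)

print(f"4xCO2 vs preindustrial:           {ensemble_change(quadrupled_co2, preindustrial):+.1f}%")
print(f"4xCO2 + solar shading vs preind.: {ensemble_change(geoengineered, preindustrial):+.1f}%")
```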

In line with other research, they found that an increase in carbon dioxide levels would significantly increase global average precipitation, although there would likely be significant regional variations and even prolonged droughts in some areas.

Much of the reason for the increased rainfall and snowfall has to do with greater evaporation, which would pump more moisture into the atmosphere as a result of more heat being trapped near the surface.

The team then took the research one step further, examining what would happen if a geoengineering approach partially reflected incoming solar radiation high in the atmosphere.

The researchers found that precipitation amounts and frequency, especially for heavy rain events, would decrease significantly. The effects were greater over land than over the ocean, and particularly pronounced during months of heavy, monsoonal rains. Monsoonal rains in the model simulations dropped by an average of 7 percent in North America, 6 percent in East Asia and South America, and 5 percent in South Africa. In India, however, the decrease was just 2 percent. Heavy precipitation further dropped in Western Europe and North America in summer.

A drier atmosphere

The researchers found two primary reasons for the reduced precipitation.

One reason has to do with evaporation. As Earth is shaded and less solar heat reaches the surface, less water vapor is pumped into the atmosphere through evaporation.

The other reason has to do with plants. With more carbon dioxide in the atmosphere, plants partially close their stomata, the openings that allow them to take in carbon dioxide while releasing oxygen and water into the atmosphere. Partially shut stomata release less water, so the cooled atmosphere would also become even drier over land.

Tilmes stresses that the authors did not address such questions as how certain crops would respond to a combination of higher carbon dioxide and reduced rainfall.

“More research could show both the positive and negative consequences for society of such changes in the environment,” she says. “What we do know is that our climate system is very complex, that human activity is making Earth warmer, and that any technological fix we might try to shade the planet could have unforeseen consequences.”

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Journal Reference:

  1. Simone Tilmes, John Fasullo, Jean-Francois Lamarque, Daniel R. Marsh, Michael Mills, Kari Alterskjaer, Helene Muri, Jón E. Kristjánsson, Olivier Boucher, Michael Schulz, Jason N. S. Cole, Charles L. Curry, Andy Jones, Jim Haywood, Peter J. Irvine, Duoying Ji, John C. Moore, Diana B. Karam, Ben Kravitz, Philip J. Rasch, Balwinder Singh, Jin-Ho Yoon, Ulrike Niemeier, Hauke Schmidt, Alan Robock, Shuting Yang, Shingo Watanabe. The hydrological impact of geoengineering in the Geoengineering Model Intercomparison Project (GeoMIP). Journal of Geophysical Research: Atmospheres, 2013; 118 (19): 11,036. DOI: 10.1002/jgrd.50868

Patient in ‘Vegetative State’ Not Just Aware, but Paying Attention, Study Suggests (Science Daily)

Oct. 31, 2013 — A patient in a seemingly vegetative state, unable to move or speak, showed signs of attentive awareness that had not been detected before, a new study reveals. This patient was able to focus on words signalled by the experimenters as auditory targets as successfully as healthy individuals. If this ability can be developed consistently in certain patients who are vegetative, it could open the door to specialised devices in the future and enable them to interact with the outside world.

This scan depicts patterns of the vegetative patient’s electrical activity over the head when they attended to the designated words, and when they were distracted by novel but irrelevant words. (Credit: Clinical Neurosciences)

The research, by scientists at the Medical Research Council Cognition and Brain Sciences Unit (MRC CBSU) and the University of Cambridge, is published today, 31 October, in the journal NeuroImage: Clinical.

For the study, the researchers used electroencephalography (EEG), which non-invasively measures the electrical activity over the scalp, to test 21 patients diagnosed as vegetative or minimally conscious, and eight healthy volunteers. Participants heard a series of different words — one word a second over 90 seconds at a time — while asked to alternatingly attend to either the word ‘yes’ or the word ‘no’, each of which appeared 15% of the time. (Some examples of the words used include moss, moth, worm and toad.) This was repeated several times over a period of 30 minutes to detect whether the patients were able to attend to the correct target word.
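To make the stimulus design concrete, here is a minimal sketch of how such a word stream might be generated, with 'yes' and 'no' each filling about 15% of the 90 one-second slots and distractor words filling the rest. The word list and proportions follow the description above; the function name and randomization details are illustrative assumptions, not the study's actual code.

```python
import random

def make_word_stream(n_words=90, target_fraction=0.15, seed=0):
    """Build a 90-word, one-word-per-second stream in which 'yes' and 'no'
    each appear roughly 15% of the time; the rest are distractor words."""
    rng = random.Random(seed)
    n_targets = round(n_words * target_fraction)           # about 13-14 of each target
    distractors = ["moss", "moth", "worm", "toad"]          # examples given in the article
    stream = (["yes"] * n_targets
              + ["no"] * n_targets
              + [rng.choice(distractors) for _ in range(n_words - 2 * n_targets)])
    rng.shuffle(stream)
    return stream

stream = make_word_stream()
print(stream[:10], "...",
      f"{stream.count('yes')} 'yes', {stream.count('no')} 'no' in {len(stream)} words")
```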

They found that one of the vegetative patients was able to filter out unimportant information and home in on relevant words they were being asked to pay attention to. Using brain imaging (fMRI), the scientists also discovered that this patient could follow simple commands to imagine playing tennis. They also found that three other minimally conscious patients reacted to novel but irrelevant words, but were unable to selectively pay attention to the target word.

These findings suggest that some patients in a vegetative or minimally conscious state might in fact be able to direct attention to the sounds in the world around them.

Dr Srivas Chennu at the University of Cambridge, said: “Not only did we find the patient had the ability to pay attention, we also found independent evidence of their ability to follow commands — information which could enable the development of future technology to help patients in a vegetative state communicate with the outside world.

“In order to try and assess the true level of brain function and awareness that survives in the vegetative and minimally conscious states, we are progressively building up a fuller picture of the sensory, perceptual and cognitive abilities in patients. This study has added a key piece to that puzzle, and provided a tremendous amount of insight into the ability of these patients to pay attention.”

Dr Tristan Bekinschtein at the MRC Cognition and Brain Sciences Unit said: “Our attention can be drawn to something by its strangeness or novelty, or we can consciously decide to pay attention to it. A lot of cognitive neuroscience research tells us that we have distinct patterns in the brain for both forms of attention, which we can measure even when the individual is unable to speak. These findings mean that, in certain cases of individuals who are vegetative, we might be able to enhance this ability and improve their level of communication with the outside world.”

This study builds on a joint programme of research at the University of Cambridge and MRC CBSU where a team of researchers have been developing a series of diagnostic and prognostic tools based on brain imaging techniques since 1998. Famously, in 2006 the group was able to use fMRI imaging techniques to establish that a patient in a vegetative state could respond to yes or no questions by indicating different, distinct patterns of brain activity.

Journal Reference:

  1. Srivas Chennu, Paola Finoia, Evelyn Kamau, Martin M. Monti, Judith Allanson, John D. Pickard, Adrian M. Owen, Tristan A. Bekinschtein. Dissociable endogenous and exogenous attention in disorders of consciousness. NeuroImage: Clinical, 2013; DOI: 10.1016/j.nicl.2013.10.008

Scientists Eye Longer-Term Forecasts of U.S. Heat Waves (Science Daily)

Oct. 27, 2013 — Scientists have fingerprinted a distinctive atmospheric wave pattern high above the Northern Hemisphere that can foreshadow the emergence of summertime heat waves in the United States more than two weeks in advance.

This map of air flow a few miles above ground level in the Northern Hemisphere shows the type of wavenumber-5 pattern associated with US drought. This pattern includes alternating troughs (blue contours) and ridges (red contours), with an “H” symbol (for high pressure) shown at the center of each of the five ridges. High pressure tends to cause sinking air and suppress precipitation, which can allow a heat wave to develop and intensify over land areas. (Credit: Image courtesy Haiyan Teng.)

The new research, led by scientists at the National Center for Atmospheric Research (NCAR), could potentially enable forecasts of the likelihood of U.S. heat waves 15-20 days out, giving society more time to prepare for these often-deadly events.

The research team discerned the pattern by analyzing a 12,000-year simulation of the atmosphere over the Northern Hemisphere. During those times when a distinctive “wavenumber-5” pattern emerged, a major summertime heat wave became more likely to subsequently build over the United States.

“It may be useful to monitor the atmosphere, looking for this pattern, if we find that it precedes heat waves in a predictable way,” says NCAR scientist Haiyan Teng, the lead author. “This gives us a potential source to predict heat waves beyond the typical range of weather forecasts.”

The wavenumber-5 pattern refers to a sequence of alternating high- and low-pressure systems (five of each) that form a ring circling the northern midlatitudes, several miles above the surface. This pattern can lend itself to slow-moving weather features, raising the odds for stagnant conditions often associated with prolonged heat spells.
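Because a "wavenumber-5" pattern is simply the component of the midlatitude circulation that repeats five times around a latitude circle, its amplitude can be picked out with a Fourier transform of, for example, geopotential height along that circle. The sketch below does this for a synthetic height field rather than any real model or reanalysis data, so it only illustrates the idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 500-hPa geopotential height anomaly (m) sampled every degree of longitude
# along a midlatitude circle: a wavenumber-5 ridge/trough pattern plus noise.
lon = np.deg2rad(np.arange(360))
height = 80.0 * np.cos(5 * lon + 0.3) + 15.0 * rng.standard_normal(360)

# FFT along the circle; the amplitude of zonal wavenumber k is 2|c_k| / N.
coeffs = np.fft.rfft(height)
amplitude = 2.0 * np.abs(coeffs) / height.size

print(f"wavenumber-5 amplitude: {amplitude[5]:.1f} m")       # recovers roughly 80 m
print("dominant zonal wavenumber:", int(np.argmax(amplitude[1:]) + 1))
```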

The study is being published next week in Nature Geoscience. It was funded by the U.S. Department of Energy, NASA, and the National Science Foundation (NSF), which is NCAR’s sponsor. NASA scientists helped guide the project and are involved in broader research in this area.

Predicting a lethal event

Heat waves are among the most deadly weather phenomena on Earth. A 2006 heat wave across much of the United States and Canada was blamed for more than 600 deaths in California alone, and a prolonged heat wave in Europe in 2003 may have killed more than 50,000 people.

To see if heat waves can be triggered by certain large-scale atmospheric circulation patterns, the scientists looked at data from relatively modern records dating back to 1948. They focused on summertime events in the United States in which daily temperatures reached the top 2.5 percent of weather readings for that date across roughly 10 percent or more of the contiguous United States. However, since such extremes are rare by definition, the researchers could identify only 17 events that met such criteria — not enough to tease out a reliable signal amid the noise of other atmospheric behavior.
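Expressed as an algorithm, that selection criterion is straightforward: for each calendar date, compute the 97.5th-percentile temperature at each location from the historical record, then flag days on which roughly 10 percent or more of the contiguous-US area exceeds its local threshold. Here is a minimal sketch with hypothetical gridded data; the array shapes and the use of unweighted grid points as a stand-in for area are illustrative assumptions, not the study's code.

```python
import numpy as np

def heat_wave_days(temps, area_fraction=0.10, percentile=97.5):
    """temps: daily max temperature, shape (years, days_per_summer, n_gridpoints).
    Flags days on which at least `area_fraction` of grid points exceed the
    97.5th-percentile temperature for that calendar day at that location."""
    thresholds = np.percentile(temps, percentile, axis=0)     # (days, n_gridpoints)
    exceed = temps > thresholds[np.newaxis, :, :]             # (years, days, n_gridpoints)
    return exceed.mean(axis=2) >= area_fraction               # (years, days) boolean mask

# Hypothetical data: 65 summers x 92 days x 500 grid points. A shared daily anomaly
# gives the temperature field spatial coherence, so large-area extremes can occur.
rng = np.random.default_rng(0)
shared = 4.0 * rng.standard_normal((65, 92, 1))
local = 2.0 * rng.standard_normal((65, 92, 500))
temps = 28.0 + shared + local

mask = heat_wave_days(temps)
print(f"{mask.sum()} large-area hot days out of {mask.size} simulated summer days")
```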

The group then turned to an idealized simulation of the atmosphere spanning 12,000 years. The simulation had been created a couple of years before with a version of the NCAR-based Community Earth System Model, which is funded by NSF and the Department of Energy.

By analyzing more than 5,900 U.S. heat waves simulated in the computer model, they determined that the heat waves tended to be preceded by a wavenumber-5 pattern. This pattern is not caused by particular oceanic conditions or heating of Earth’s surface, but instead arises from naturally varying conditions of the atmosphere. It was associated with an atmospheric phenomenon known as a Rossby wave train that encircles the Northern Hemisphere along the jet stream.

During the 20 days leading up to a heat wave in the model results, the five ridges and five troughs that make up a wavenumber-5 pattern tended to propagate very slowly westward around the globe, moving against the flow of the jet stream itself. Eventually, a high-pressure ridge moved from the North Atlantic into the United States, shutting down rainfall and setting the stage for a heat wave to emerge.

When wavenumber-5 patterns in the model were more amplified, U.S. heat waves became more likely to form 15 days later. In some cases, the probability of a heat wave was more than quadruple what would be expected by chance.

In follow-up work, the research team turned again to actual U.S. heat waves since 1948. They found that some historical heat wave events were indeed characterized by a large-scale circulation pattern indicative of a wavenumber-5 event.

Extending forecasts beyond 10 days

The research finding suggests that scientists are making progress on a key meteorological goal: forecasting the likelihood of extreme events more than 10 days in advance. At present, there is very limited skill in such long-term forecasts.

Previous research on extending weather forecasts has focused on conditions in the tropics. For example, scientists have found that El Niño and La Niña, the periodic warming and cooling of surface waters in the central and eastern tropical Pacific Ocean, are correlated with a higher probability of wet or dry conditions in different regions around the globe. In contrast, the wavenumber-5 pattern does not rely on conditions in the tropics. However, the study does not exclude the possibility that tropical rainfall could act to stimulate or strengthen the pattern.

Now that the new study has connected a planetary wave pattern to a particular type of extreme weather event, Teng and her colleagues will continue searching for other circulation patterns that may presage extreme weather events.

“There may be sources of predictability that we are not yet aware of,” she says. “This brings us hope that the likelihood of extreme weather events that are damaging to society can be predicted further in advance.”

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this release are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Journal Reference:

  1. Haiyan Teng, Grant Branstator, Hailan Wang, Gerald A. Meehl, Warren M. Washington. Probability of US heat waves affected by a subseasonal planetary wave pattern. Nature Geoscience, 2013; DOI: 10.1038/ngeo1988

Will U.S. Hurricane Forecasting Models Catch Up to Europe’s? (National Geographic)


A satellite view of Hurricane Sandy on October 28, 2012.

Photograph by Robert Simmon, NASA Earth Observatory and NASA/NOAA GOES Project Science team

Willie Drye

for National Geographic

Published October 27, 2013

If there was a bright spot amid Hurricane Sandy’s massive devastation, including 148 deaths, at least $68 billion in damages, and the destruction of thousands of homes, it was the accuracy of the forecasts predicting where the storm would go.

Six days before Sandy came ashore one year ago this week—while the storm was still building in the Bahamas—forecasters predicted it would make landfall somewhere between New Jersey and New York City on October 29.

They were right.

Sandy, which had by then weakened from a Category 2 hurricane to an unusually potent Category 1, came ashore just south of Atlantic City, a few miles from where forecasters said it would, on the third to last day of October.

“They were really, really excellent forecasts,” said University of Miami meteorologist Brian McNoldy. “We knew a week ahead of time that something awful was going to happen around New York and New Jersey.”

That knowledge gave emergency management officials in the Northeast plenty of time to prepare, issuing evacuation orders for hundreds of thousands of residents in New Jersey and New York.

Even those who ignored the order used the forecasts to make preparations, boarding up buildings, stocking up on food and water, and buying gasoline-powered generators.

But there’s an important qualification about the excellent forecasts that anticipated Sandy’s course: The best came from a European hurricane prediction program.

The six-day-out landfall forecast arrived courtesy of a computer program known as the European Centre for Medium-range Weather Forecasting (ECMWF), which is based in England.

Most of the other models in use at the National Hurricane Center in Miami, including the U.S. Global Forecast System (GFS), didn’t start forecasting a U.S. landfall until four days before the storm came ashore. At the six-day-out mark, that model and others at the National Hurricane Center had Sandy veering away from the Atlantic Coast, staying far out at sea.

“The European model just outperformed the American model on Sandy,” says Kerry Emanuel, a meteorologist at Massachusetts Institute of Technology.

Now, U.S. weather forecasting programmers are working to close the gap between the U.S. Global Forecast System and the European model.

There’s more at stake than simple pride. “It’s to our advantage to have two excellent models instead of just one,” says McNoldy. “The more skilled models you have running, the more you know about the possibilities for a hurricane’s track.”

And, of course, the more lives you can save.

Data, Data, Data

The computer programs that meteorologists rely on to predict the courses of storms draw on lots of data.

U.S. forecasting computers and their European counterparts rely on radar that provides information on cloud formations and the rotation of a storm, on orbiting satellites that show precisely where a storm is, and on hurricane-hunter aircraft that fly into storms to collect wind speeds, barometric pressure readings, and water temperatures.

Hundreds of buoys deployed along the Atlantic and Gulf coasts, meanwhile, relay information about the heights of waves being produced by the storm.

All this data is fed into computers at the National Centers for Environmental Prediction at Camp Springs, Maryland, which use it to run the forecast models. Those computers, linked to others at the National Hurricane Center, translate the computer models into official forecasts.

The forecasters use data from all computer models—including the ECMWF—to make their forecasts four times daily.

Forecasts produced by various models often diverge, leaving plenty of room for interpretation by human forecasters.

“Usually, it’s kind of a subjective process as far as making a human forecast out of all the different computer runs,” says McNoldy. “The art is in the interpretation of all of the computer models’ outputs.”

There are two big reasons why the European model is usually more accurate than U.S. models. First, the European Centre for Medium-range Weather Forecasting model is a more sophisticated program that incorporates more data.

Second, the European computers that run the program are more powerful than their U.S. counterparts and are able to do more calculations more quickly.

“They don’t have any top-secret things,” McNoldy said. “Because of their (computer) hardware, they can implement more sophisticated code.”

A consortium of European nations began developing the ECMWF in 1976, and the model has been fueled by a series of progressively more powerful supercomputers in England. It got a boost when the European Union was formed in 1993 and member states started contributing taxes for more improvements.

The ECMWF and the GFS are the two primary models that most forecasters look at, said Michael Laca, producer of TropMet, a website that focuses on hurricanes and other severe weather events.

Laca said that forecasts and other data from the ECMWF are provided to forecasters in the U.S. and elsewhere who pay for the information.

“The GFS, on the other hand, is freely available to everyone, and is funded—or defunded—solely through (U.S.) government appropriations,” Laca said.

And since funding for U.S. research and development is subject to funding debates in Congress, U.S. forecasters are “in a hard position to keep pace with the ECMWF from a research and hardware perspective,” Laca said.

Hurricane Sandy wasn’t the first or last hurricane for which the ECMWF was the most accurate forecast model. It has consistently outperformed the GFS and four other U.S. and Canadian forecasting models.

Greg Nordstrom, who teaches meteorology at Mississippi State University in Starkville, said the European model provided much more accurate forecasts for Hurricane Isaac in August 2012 and for Tropical Storm Karen earlier this year.

“This doesn’t mean the GFS doesn’t beat the Euro from time to time,” he says.  “But, overall, the Euro is king of the global models.”

McNoldy says the European Union’s generous funding of research and development of their model has put it ahead of the American version. “Basically, it’s a matter of resources,” he says. “If we want to catch up, we will. It’s important that we have the best forecasting in the world.”

European developers who work on forecasting software have also benefited from better cooperation between government and academic researchers, says MIT’s Emanuel.

“If you talk to (the National Oceanic and Atmospheric Administration), they would deny that, but there’s no real spirit of cooperation (in the U.S.),” he says. “It’s a cultural problem that will not get fixed by throwing more money at the problem.”

Catching Up Amid Chaos

American computer models’ accuracy in forecasting hurricane tracks has improved dramatically since the 1970s. The average margin of error for a three-day forecast of a hurricane’s track has dropped from 500 miles in 1972 to 115 miles in 2012.

And NOAA is in the middle of a ten-year program intended to dramatically improve the forecasting of hurricanes’ tracks and their likelihood to intensify, or become stronger before landfall.

One of the project’s centerpieces is the Hurricane Weather Research and Forecasting model, or HWRF. In development since 2007, it’s similar to the ECMWF in that it will incorporate more data into its forecasting, including data from the GFS model.

Predicting the likelihood that a hurricane will intensify is difficult. For a hurricane to gain strength, it needs humid air, seawater heated to at least 80ºF, and no atmospheric winds to disrupt its circulation.

In 2005, Hurricane Wilma encountered those perfect conditions and in just 30 hours strengthened from a tropical storm with peak winds of about 70 miles per hour to the most powerful Atlantic hurricane on record, with winds exceeding 175 miles per hour.

But hurricanes are as delicate as they are powerful. Seemingly small environmental changes, like passing over water that’s slightly cooler than 80ºF or ingesting drier air, can rapidly weaken a storm. And the environment is constantly changing.

“Over the next five years, there may be some big breakthrough to help improve intensification forecasting,” McNoldy said. “But we’re still working against the basic chaos in the atmosphere.”

He thinks it will take at least five to ten years for the U.S. to catch up with the European model.

MIT’s Emanuel says three factors will determine whether more accurate intensification forecasting is in the offing: the development of more powerful computers that can accommodate more data, a better understanding of hurricane intensity, and whether researchers reach a point at which no further improvements to intensification forecasting are possible.

Emanuel calls that point the “prediction horizon” and says it may have already been reached: “Our level of ignorance is still too high to know.”

Predictions and Responses

Assuming we’ve not yet hit that point, better predictions could dramatically improve our ability to weather hurricanes.

The more advance warning, the more time there is for those who do choose to heed evacuation orders. Earlier forecasting would also allow emergency management officials more time to provide transportation for poor, elderly, and disabled people unable to flee on their own.

More accurate forecasts would also reduce evacuation expenses.

Estimates of the cost of evacuating coastal areas before a hurricane vary considerably, but it’s been calculated that it costs $1 million for every mile of coastline evacuated. That includes the cost of lost commerce, lost wages and salaries for those who leave, and the costs of the evacuation itself, such as travel and shelter.

Better forecasts could reduce the size of evacuation areas and save money.

They would also allow officials to get a jump on hurricane response. The Federal Emergency Management Agency (FEMA) tries to stockpile relief supplies far enough away from an expected hurricane landfall to avoid damage from the storm but near enough so that the supplies can quickly be moved to affected areas afterwards.

More reliable landfall forecasts would help FEMA position recovery supplies closer to where they’ll be.

Whatever improvements are made, McNoldy warns that forecasting will never be foolproof. However dependable, he said, “Models will always be imperfect.”

Predicting the Future Could Improve Remote-Control of Space Robots (Wired)

BY ADAM MANN

10.15.13

A new system could make space exploration robots faster and more efficient by predicting where they will be in the very near future.

The engineers behind the program hope to overcome a particular snarl affecting our probes out in the solar system: that pesky delay caused by the speed of light. Any commands sent to a robot on a distant body take a certain amount of time to travel and won’t be executed for a while. By building a model of the terrain surrounding a rover and providing an interface that lets operators forecast how the probe will move around within it, engineers can identify potential obstacles and make decisions nearer to real time.

“You’re reacting quickly, and the rover is staying active more of the time,” said computer scientist Jeff Norris, who leads mission operation innovations at the Jet Propulsion Laboratory’s Ops Lab.

As an example, the distance between Earth and Mars creates round-trip lags of up to 40 minutes. Nowadays, engineers send a long string of commands once a day to robots like NASA’s Curiosity rover. These get executed but then the rover has to stop and wait until the next instructions are beamed down.

Because space exploration robots are multi-million or even multi-billion-dollar machines, they have to work very carefully. One day’s commands might tell Curiosity to drive up to a rock. It will then check that it has gotten close enough. Then, the following day, it will be instructed to place its arm on that rock. Later on, it might be directed to drill into or probe this rock with its instruments. While safe, this method is very inefficient.

“When we only send commands once a day, we’re not dealing with 10- or 20-minute delays. We’re dealing with a 24-hour round trip,” said Norris.

Norris’ lab wants to make the speed and productivity of distant probes better. Their interface simulates more or less where a robot would be given a particular time delay. This is represented by a small ghostly machine — called the “committed state” — moving just ahead of a rover. The ghosted robot is the software’s best guess of where the probe would end up if operators hit the emergency stop button right then.

By looking slightly into the future, the interface allows a rover driver to update decisions and commands at a much faster rate than is currently possible. Say a robot on Mars is commanded to drive forward 100 meters. But halfway there, its sensors notice an interesting rock that scientists want to investigate. Rather than waiting for the rover to finish its drive and then commanding it to go back, this new interface would give operators the ability to write and rewrite their directions on the fly.

The simulation can’t know every detail around a probe and so provides a small predictive envelope as to where the robot might be. Different terrains have different uncertainties.
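The "committed state" is essentially dead reckoning over the communication delay: given the rover's current pose, its commanded speed, and the round-trip light time, project where it will be by the time a stop command could take effect, plus an uncertainty margin that grows on slippery terrain. Here is a minimal sketch of that idea; all names and numbers are illustrative assumptions, not anything from the JPL interface.

```python
from dataclasses import dataclass
import math

@dataclass
class RoverState:
    x: float          # metres east
    y: float          # metres north
    heading: float    # radians, 0 = north
    speed: float      # metres per second along the current drive command

def committed_state(state: RoverState, round_trip_delay_s: float, slip_factor: float):
    """Predict where the rover will be once an emergency-stop command could arrive,
    with a simple uncertainty radius that depends on terrain slip (0 = hard rock,
    larger values for loose sand)."""
    distance = state.speed * round_trip_delay_s
    x = state.x + distance * math.sin(state.heading)
    y = state.y + distance * math.cos(state.heading)
    uncertainty_radius = distance * slip_factor
    return (x, y), uncertainty_radius

# Example: rover driving at 4 cm/s with a 20-minute round-trip delay on loose sand.
pos, radius = committed_state(RoverState(0.0, 0.0, math.radians(45), 0.04), 20 * 60, 0.15)
print(f"committed position ~({pos[0]:.1f} m, {pos[1]:.1f} m), +/- {radius:.1f} m")
```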

“If you’re on loose sand, that might be different than hard rock,” said software engineer Alexander Menzies, who works on the interface.

Menzies added that when they tested the interface, users had an “almost game-like experience” trying to optimize commands for a robot. He designed an actual video game where participants were given points for commanding a time-delayed robot through a slalom-like terrain. (Norris lamented that he had the highest score on that game until the last day of testing, when Menzies beat him.)

The team thinks that aspects of this new interface could start to be used in the near future, perhaps even with the current Mars rovers Curiosity and Opportunity. At this point, though, Mars operations are limited by bandwidth. Because there are only a few communicating satellites in orbit around the Red Planet, commands can only be sent a few times a day, reducing a lot of the efficiency that would be gained from this new system. But operations on the moon or a potential asteroid capture and exploration mission – such as the one NASA is currently planning – would likely be in more constant communication with Earth, providing even faster and more efficient operations that could take advantage of this new time-delay-reducing system.

Video: OPSLabJPL/Youtube

Ice Cap Shows Ancient Mines Polluted the Globe (New York Times)

By MALCOLM W. BROWNE

Published: December 09, 1997

SAMPLES extracted from Greenland’s two-mile-deep ice cap have yielded evidence that ancient Carthaginian and Roman silver miners working in southern Spain fouled the global atmosphere with lead for some 900 years.

The Greenland ice cap accumulates snow year after year, and substances from the atmosphere are entrapped in the permanent ice. From 1990 to 1992, a drill operated by the European Greenland Ice-Core Project recovered a cylindrical ice sample 9,938 feet long, pieces of which were distributed to participating laboratories. The ages of successive layers of the ice cap have been accurately determined, so the chemical makeup of the atmosphere at any given time in the past 9,000 years can be estimated by analyzing the corresponding part of the core sample.

Using exquisitely sensitive techniques to measure four different isotopes of lead in the Greenland ice, scientists in Australia and France determined that most of the man-made lead pollution of the atmosphere in ancient times had come from the Spanish provinces of Huelva, Seville, Almeria and Murcia. Isotopic analysis clearly pointed to the rich silver-mining and smelting district of Rio Tinto near the modern city of Nerva as the main polluter.

The results of this study were reported in the current issue of Environmental Science & Technology by Dr. Kevin J. R. Rosman of Curtin University in Perth, Australia, and his colleagues there and at the Laboratory of Glaciology and Geophysics of the Environment in Grenoble, France.

One of the problems in their analyses, the authors wrote, was the very low concentrations of lead remaining in ice dating from ancient times — only about one-hundredth the lead level found in Greenland ice deposited in the last 30 years. But the investigators used mass-spectrometric techniques that permitted them to sort out isotopic lead composition at lead levels of only about one part per trillion.

Dr. Rosman focused on the ratio of two stable isotopes, or forms, of lead: lead-206 and lead-207. His group found that the ratio of lead-206 to lead-207 in 8,000-year-old ice was 1.201. That was taken as the natural ratio that existed before people began smelting ores. But between 600 B.C. and A.D. 300, the scientists found, the ratio of lead-206 to lead-207 fell to 1.183. They called that “unequivocal evidence of early large-scale atmospheric pollution by this toxic metal.”

All ore bodies containing lead have their own isotopic signatures, and the Rio Tinto lead ratio is 1.164. Calculations by the Australian-French collaboration based on their ice-core analysis showed that during the period 366 B.C. to at least A.D. 36, a period when the Roman Empire was at its peak, 70 percent of the global atmospheric lead pollution came from the Roman-operated Rio Tinto mines in what is now southwestern Spain.
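The logic of the attribution can be illustrated with a simple two-endmember mixing calculation between the natural background ratio and the Rio Tinto ore signature. Note that this toy version ignores lead concentrations and the other isotopes the study used, so it does not reproduce the 70 percent figure, which refers to Rio Tinto's share of the pollution lead rather than of all lead in the ice.

```python
# Two-endmember isotope mixing: what fraction f of the lead in the ice would have to
# carry the Rio Tinto signature to shift the 206Pb/207Pb ratio from the natural value
# to the value measured in Roman-era ice? (Toy illustration only.)
natural_ratio   = 1.201   # 8,000-year-old ice, before smelting began
measured_ratio  = 1.183   # ice dating from 600 B.C. to A.D. 300
rio_tinto_ratio = 1.164   # signature of Rio Tinto ores

f = (natural_ratio - measured_ratio) / (natural_ratio - rio_tinto_ratio)
print(f"Rio Tinto-like fraction under naive mixing: {f:.0%}")   # roughly half
```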

The Rio Tinto mining region is known to archeologists as one of the richest sources of silver in the ancient world. Some 6.6 million tons of slag were left by Roman smelting operations there.

The global demand for silver increased dramatically after coinage was introduced in Greece around 650 B.C. But silver was only one of the treasures extracted from its ore. The sulfide ore smelted by the Romans also yielded an enormous harvest of lead.

Because it is easily shaped, melted and molded, lead was widely used by the Romans for plumbing, stapling masonry together, casting statues and manufacturing many kinds of utensils. All these uses presumably contributed to the chronic poisoning of Rome’s peoples.

Adding to the toxic hazard, Romans used lead vessels to boil and concentrate fruit juices and preserves. Fruits contain acetic acid, which reacts with metallic lead to form lead acetate, a compound once known as “sugar of lead.” Lead acetate adds a pleasant sweet taste to food but causes lead poisoning — an ailment that is often fatal and, even in mild cases, causes debilitation and loss of cognitive ability.

Judging from the Greenland ice core, the smelting of lead-bearing ore declined sharply after the fall of the Roman Empire but gradually increased during the Renaissance. By 1523, the last year for which Dr. Rosman’s group conducted its Greenland ice analysis, atmospheric lead pollution had reached nearly the same level recorded for the year 79 B.C., at the peak of Roman mining pollution.

How Scott Collis Is Harnessing New Data To Improve Climate Models (Popular Science)

The former ski bum built open-access tools that convert raw data from radar databases into formats that climate modelers can use to better predict climate change.

By Veronique Greenwood and Valerie Ross

Posted 10.16.2013 at 3:00 pm

Scott Collis (by Joel Kimmel)

Each year, Popular Science seeks out the brightest young scientists and engineers and names them the Brilliant Ten. Like the 110 honorees before them, the members of this year’s class are dramatically reshaping their fields–and the future. Some are tackling pragmatic questions, like how to secure the Internet, while others are attacking more abstract ones, like determining the weather on distant exoplanets. The common thread between them is brilliance, of course, but also impact. If the Brilliant Ten are the faces of things to come, the world will be a safer, smarter, and brighter place.–The Editors

Scott Collis

Argonne National Laboratory

Achievement

Harnessing new data to improve climate models

Clouds are one of the great challenges for climate scientists. They play a complex role in the atmosphere and in any potential climate-change scenario. But rudimentary data has simplified their role in simulations, leading to variability among climate models. Scott Collis discovered a way to add accuracy to forecasts of future climate—by tapping new sources of cloud data.

Collis has extensive experience watching clouds, first as a ski bum during grad school in Australia and then as a professional meteorologist. But when he took a job at the Centre for Australian Weather and Climate Research, he realized there was an immense source of cloud data that climate modelers weren’t using: the information collected for weather forecasts. So Collis took on the gargantuan task of building open-access tools that convert the raw data from radar databases into formats that climate modelers can use. In one stroke, Collis unlocked years of weather data. “We were able to build such robust algorithms that they could work over thousands of radar volumes without human intervention,” says Collis.
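The conversion Collis describes boils down to mapping each radar gate, recorded in range, azimuth and elevation, onto the regular Cartesian grids that climate modelers can ingest. Below is a generic sketch of that geometry; it is not Collis's open-source code, just the standard coordinate transform with simplifying assumptions (flat Earth, crude nearest-box averaging instead of proper interpolation).

```python
import numpy as np

def gate_to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert radar gate coordinates (range, azimuth from north, elevation angle)
    to Cartesian x (east), y (north), z (up) in metres, ignoring Earth curvature."""
    az = np.deg2rad(azimuth_deg)
    el = np.deg2rad(elevation_deg)
    x = range_m * np.cos(el) * np.sin(az)
    y = range_m * np.cos(el) * np.cos(az)
    z = range_m * np.sin(el)
    return x, y, z

def grid_reflectivity(x, y, z, dbz, grid_edges):
    """Average reflectivity gates into regular Cartesian grid boxes (real toolkits
    use distance-weighted interpolation rather than this nearest-box average)."""
    sums, _ = np.histogramdd((z, y, x), bins=grid_edges, weights=dbz)
    counts, _ = np.histogramdd((z, y, x), bins=grid_edges)
    with np.errstate(invalid="ignore"):
        return sums / counts   # NaN where no gates fall inside a box

# Hypothetical single sweep: 360 rays x 100 gates at 2 degrees elevation.
rng = np.random.default_rng(2)
az, rng_m = np.meshgrid(np.arange(360.0), np.arange(100) * 250.0, indexing="ij")
x, y, z = gate_to_cartesian(rng_m.ravel(), az.ravel(), 2.0)
dbz = rng.uniform(0, 40, x.size)
edges = [np.linspace(0, 3000, 4), np.linspace(-25000, 25000, 51), np.linspace(-25000, 25000, 51)]
grid = grid_reflectivity(x, y, z, dbz, edges)
print("gridded volume shape (z, y, x):", grid.shape)
```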

When the U.S. Department of Energy caught wind of his project, it recruited him to work with a new radar network designed to collect high-quality cloud data from all over the globe. The network, the largest of its kind, isn’t complete yet, but already the data that Collis and his collaborators have collected is improving next-generation climate models.

This article originally appeared in the October 2013 issue of Popular Science.

Tool Accurately Predicts Whether A Kickstarter Project Will Bomb (Popular Science)

At about 76 percent accuracy, a new prediction model is the best yet. “Your chances of success are at 8 percent. Commence panic.”

By Colin Lecher

Posted 10.16.2013 at 2:00 pm

 

Ouya, A Popular Kickstarter Project 

Well, here’s something either very discouraging or very exciting for crowdfunding hopefuls: a Swiss team can predict, with about 76 percent accuracy and within only four hours of launch, whether a Kickstarter project will succeed.

The team, from the École Polytechnique Fédérale de Lausanne, laid out a system in a paper presented at the Conference on Online Social Networks. By mining data on more than 16,000 Kickstarter campaigns and more than 1.3 million users, they created a prediction model based on the project’s popularity on Twitter, the rate at which it is raising cash, how many first-time backers it has, and the previous projects its supporters have backed.
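As a sketch of the general approach (not the EPFL team's actual model or data), a classifier of this kind can be trained on early-campaign features like tweet counts, pledge rate, and the share of first-time backers. Here is a minimal scikit-learn version with synthetic data; the feature definitions and label-generating rule are made up so the example runs end to end.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for features measured a few hours after launch:
# [tweets_first_4h, pledge_rate_usd_per_h, first_time_backer_share, avg_backer_history]
rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.poisson(30, n),            # tweets in the first four hours
    rng.gamma(2.0, 25.0, n),       # pledge rate (USD/hour)
    rng.uniform(0, 1, n),          # share of first-time backers
    rng.poisson(5, n),             # mean number of projects backers previously supported
])
# Made-up rule for generating success labels, purely to have something to fit.
logit = 0.03 * X[:, 0] + 0.02 * X[:, 1] - 2.0 * X[:, 2] + 0.15 * X[:, 3] - 1.5
y = (1 / (1 + np.exp(-logit)) > rng.uniform(0, 1, n)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
print("predicted success probability for a new campaign:",
      model.predict_proba([[55, 120.0, 0.4, 8]])[0, 1].round(2))
```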

A previous, similar model built by Americans could predict a Kickstarter project’s success with 68 percent accuracy–impressive, but the Swiss project has another advantage: it’s dynamic. While the American model could only make a prediction before the project launched, the Swiss project monitors projects in real time. They’ve even built a tool, called Sidekick, that monitors projects and displays their chances of success.

Other sites, like Kicktraq, offer similar services, but the predictions aren’t as accurate as the Swiss team claims theirs are. If you peruse Sidekick, you can see how confident the algorithm is in its pass/fail predictions: almost all of the projects are either above 90 percent or below 10 percent. Sort of scary, probably, if you’re launching a project. Although there’s always a chance you could pull yourself out of the hole, it’s like a genie asking if you want to know how you die: Do you really want that information?

[Guardian]

The Reasons Behind Crime (Science Daily)

Oct. 10, 2013 — More punishment does not necessarily lead to less crime, say researchers at ETH Zurich who have been studying the origins of crime with a computer model. In order to fight crime, more attention should be paid to the social and economic backgrounds that encourage crime.

Whether a person turns criminal and commits a robbery depends greatly on the socio-economic circumstances in which he lives. (Credit: © koszivu / Fotolia)

People have been stealing, betraying others and committing murder for ages. In fact, humans have never succeeded in eradicating crime, although — according to the rational choice theory in economics — this should be possible in principle. The theory states that humans turn criminal if it is worthwhile. Stealing or evading taxes, for instance, pays off if the prospects of unlawful gains outweigh the expected punishment. Therefore, if a state sets the penalties high enough and ensures that lawbreakers are brought to justice, it should be possible to eliminate crime completely.

This theory is largely oversimplified, says Dirk Helbing, a professor of sociology. The USA, for example, often has far more drastic penalties than European countries. But despite the death penalty in some American states, the homicide rate in the USA is five times higher than in Western Europe. Furthermore, ten times more people sit in American prisons than in many European countries. More repression, however, can sometimes even lead to more crime, says Helbing. Ever since the USA declared the “war on terror” around the globe, the number of terrorist attacks worldwide has increased, not fallen. “The classic approach, where criminals merely need to be pursued and punished more strictly to curb crime, often does not work.” Nonetheless, this approach dominates the public discussion.

More realistic model

In order to better understand the origins of crime, Helbing and his colleagues have developed a new so-called agent-based model that takes the network of social interactions into account and is more realistic than previous models. Not only does it include criminals and law enforcers, like many previous models, but also honest citizens as a third group. Parameters such as the size of penalties and prosecution costs can be varied in the model. Moreover, it also considers spatial dependencies. The representatives of the three groups do not interact with one another randomly, but only if they encounter each other in space and time. In particular, individual agents imitate the behaviour of agents from other groups, if this is promising.
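The basic mechanics of such an agent-based model can be sketched in a few dozen lines: place criminals, law enforcers and honest citizens on a grid, let neighbours interact, apply punishment and prosecution costs, and let agents imitate whichever neighbouring strategy is currently paying off. The sketch below is a deliberately simplified toy built on those assumptions, not the ETH Zurich model itself.

```python
import random

CRIMINAL, ENFORCER, CITIZEN = "C", "E", "H"
PENALTY, PROSECUTION_COST, LOOT = 3.0, 1.0, 2.0

def step(grid, size, rng):
    """One round: compute payoffs from local interactions, then let each agent
    imitate a random neighbour whose payoff was higher (noisy imitation)."""
    payoff = {}
    for (i, j), role in grid.items():
        neighbours = [grid[((i + di) % size, (j + dj) % size)]
                      for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        p = 0.0
        if role == CRIMINAL:
            p += LOOT * neighbours.count(CITIZEN)           # gains from nearby victims
            p -= PENALTY * neighbours.count(ENFORCER)       # caught by nearby enforcers
        elif role == ENFORCER:
            p -= PROSECUTION_COST                           # policing always costs something
            p += PENALTY * 0.5 * neighbours.count(CRIMINAL)
        else:  # honest citizen
            p -= 1.0 * neighbours.count(CRIMINAL)           # losses to crime
        payoff[(i, j)] = p
    new_grid = dict(grid)
    for (i, j) in grid:
        if rng.random() < 0.5:
            other = ((i + rng.choice((1, -1))) % size, j)
        else:
            other = (i, (j + rng.choice((1, -1))) % size)
        if payoff[other] > payoff[(i, j)] and rng.random() < 0.9:
            new_grid[(i, j)] = grid[other]
    return new_grid

rng = random.Random(1)
size = 30
grid = {(i, j): rng.choice((CRIMINAL, ENFORCER, CITIZEN))
        for i in range(size) for j in range(size)}
for t in range(200):
    grid = step(grid, size, rng)
    if t % 50 == 0:
        counts = {r: sum(v == r for v in grid.values()) for r in (CRIMINAL, ENFORCER, CITIZEN)}
        print(t, counts)
```

Varying PENALTY and PROSECUTION_COST in such a toy already shows the qualitative point in the text: pushing punishment up does not drive the criminal share monotonically to zero, and the group proportions tend to oscillate rather than settle.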

Cycles of crime

Using the model, the scientists were able to demonstrate that tougher punishments do not necessarily lead to less crime and, if so, then at least not to the extent the punishment effort is increased. The researchers were also able to simulate how crime can suddenly break out and calm down again. Like the pig cycle we know from the economic sciences or the predator-prey cycles from ecology, crime is cyclical as well. This explains observations made, for instance, in the USA: according to the FBI’s Uniform Crime Reporting Program, cyclical changes in the frequency of criminal offences can be found in several American states. “If a state increases the investments in its punitive system to an extent that is no longer cost-effective, politicians will cut the law enforcement budget,” says Helbing. “As a result, there is more room for crime to spread again.”

“Many crimes have a socio-economic background”

But would there be a different way of combatting crime, if not with repression? The focus should be on the socio-economic context, says Helbing. As we know from the milieu theory in sociology, the environment plays a pivotal role in the behaviour of individuals. The majority of criminal acts have a social background, claims Helbing. For example, if individuals feel that all their friends and neighbours are cheating the state, they will inevitably wonder why they should be the last honest people to fill in their tax declarations correctly.

“If we want to reduce the crime rate, we have to keep an eye on the socio-economic circumstances under which people live,” says Helbing. We must not confuse this with soft justice. However, a state’s response to crime has to be differentiated: besides the police and court, economic and social institutions are relevant as well — and, in fact, every individual when it comes to the integration of others. “Improving social conditions and integrating people socially can probably combat crime much more effectively than building new prisons.”

Journal Reference:

  1. Matjaž Perc, Karsten Donnay, Dirk Helbing. Understanding Recurrent Crime as System-Immanent Collective Behavior. PLoS ONE, 2013; 8 (10): e76063. DOI: 10.1371/journal.pone.0076063

Building Cyberinfrastructure Capacity for the Social Sciences (American Anthropological Association)

Posted on October 9, 2013 by Joslyn O.

Today’s guest blog post is by Dr. Emilio Moran. Dr. Moran is Distinguished Professor Emeritus, Indiana University and Visiting Hannah Distinguished Professor, Michigan State University.

The United States and the world are changing rapidly. These new conditions challenge the ability of the social, behavioral and economic sciences to understand what is happening at a national scale and in people’s daily local lives. Forces such as globalization, the shifting composition of the economy, and the revolution in information brought about by the internet and social media are just a few of the forces that are changing Americans’ lives. Not only has the world changed since data collection methods currently used were developed, but the ways now available to link information and new data sources have radically changed. Expert panels have called for increasing the cyber-infrastructure capability of the social, behavioral, and economic (SBE) sciences so that our tools and research infrastructure keep pace with these changing social and informational landscapes. A series of workshops has met over the past three years to address these challenges. The organizers now invite you to provide feedback on the proposal below and to attend a Special Event at this year’s AAA meeting in Chicago on Saturday, November 23, 2013, from 12:15 to 1:30 pm in the Chicago Hilton Boulevard C room.

Needed is a new national framework, or platform, for social, behavioral and economic research that is both scalable and flexible; that permits new questions to be addressed; that allows for rapid response and adaptation to local shocks (such as extreme weather events or natural resource windfalls); and that facilitates understanding local manifestations of national phenomena such as economic downturns.  To advance a national data collection and analysis infrastructure, the approach we propose —  building a network of social observatories — is a way to have a sensitive instrument to measure how local communities respond to a range of natural and social conditions over time.  This new scientific infrastructure will enable the SBE sciences to contribute to societal needs at multiple levels and will facilitate collaboration with other sciences in addressing questions of critical importance.

Our vision is that of a network of observatories designed from the ground up, each observatory representing an area of the United States.  From a small number of pilot projects the network would develop (through a national sampling frame and protocol) into a representative sample of the places where people live and the people who live there. Each observatory would be an entity, whether physical or virtual, that is charged with collecting, curating, and disseminating data from people, places, and institutions in the United States.  These observatories must provide a basis for inference from what happens in local places to a national context and ensure a robust theoretical foundation for social analysis.  This is the rationale for recommending that this network of observatories be built on a population-based sample capable of addressing the needs of the nation’s diverse people but located in the specific places and communities where they live and work.  Unlike most other existing research platforms, this population and place-based capability will ensure that we understand not only the high-density urban and suburban places where the majority of the population lives, but also the medium- and low-density exurban and rural places that represent a vast majority of the land area in the nation.

To accomplish these objectives, we propose to embed in these regionally-based observatories a nationally representative population-based sample that would enable the observatory data to be aggregated in such a way as to produce a national picture of the United States on an ongoing basis.  The tentative plan would be to select approximately 400 census tracts to represent the U.S. population while also fully capturing the diversity that characterizes local places. The individuals, institutions and communities in which these census tracts are embedded will be systematically studied over time and space by observatories spread across the country. During the formative stages the number of census tracts and the number of observatories that might be needed, given the scope of the charge that is currently envisioned, will be determined.
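One standard way to draw such a nationally representative set of tracts is probability-proportional-to-size sampling, so that tracts with more residents are more likely to be selected while every person has roughly the same chance of being represented. The SOCN proposal does not specify this exact procedure; the sketch below, with synthetic tract populations, is only an illustration of the idea.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for the roughly 74,000 U.S. census tracts and their populations.
n_tracts = 74_000
populations = rng.lognormal(mean=8.3, sigma=0.4, size=n_tracts).round()

# Probability-proportional-to-size draw of 400 tracts without replacement.
probs = populations / populations.sum()
sample = rng.choice(n_tracts, size=400, replace=False, p=probs)

covered = populations[sample].sum()
print(f"400 sampled tracts cover ~{covered:,.0f} people "
      f"({100 * covered / populations.sum():.2f}% of the synthetic population)")
```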

These observatories will study the social, behavioral and economic experiences of the population in their physical and environmental context at fine detail. The observatories are intended to stimulate the development of new directions and modes of inquiry.  They will do so through the use of diverse complementary methods and data sources including ethnography, experiments, administrative data, social media, biomarkers, and financial and public health record. These observatories will work closely with local and state governments to gain access to administrative records that provide extensive data on the population in those tracts (i.e. 2 million people) thereby providing a depth of understanding and integration of knowledge that is less invasive and less subject to declining response rates than survey-derived data.

To attain the vision proposed here we need the commitment and enthusiasm of the community to meet these challenges and the resolve to make this proposed network of observatories useful to the social sciences and society. For more details on our objectives and reports from previous meetings, visit http://socialobservatories.org/. Please contribute your ideas at the site so that the proposal can benefit from your input and come to Chicago for the Special Event on Saturday, November 23, 2013. We are particularly interested in hearing how this platform could help you in your future research. This is an opportunity for anthropological strengths in ethnography and local research to contribute its insights in a way that will make a difference for local people and for the nation.

Emilio F. Moran, co-Chair of the SOCN
Distinguished Professor Emeritus, Indiana University and
Visiting Hannah Distinguished Professor, Michigan State University

Terrestrial Ecosystems at Risk of Major Shifts as Temperatures Increase (Science Daily)

Oct. 8, 2013 — Over 80% of the world’s ice-free land is at risk of profound ecosystem transformation by 2100, a new study reveals. “Essentially, we would be leaving the world as we know it,” says Sebastian Ostberg of the Potsdam Institute for Climate Impact Research, Germany. Ostberg and collaborators studied the critical impacts of climate change on landscapes and have now published their results inEarth System Dynamics, an open access journal of the European Geosciences Union (EGU).

This image shows simulated ecosystem change by 2100, depending on the degree of global temperature increase: 2 degrees Celsius (upper image) or 5 degrees Celsius (lower image) above preindustrial levels. The parameter Γ (Gamma) measures how far apart a future ecosystem under climate change would be from the present state. Blue colours (lower Γ) depict areas of moderate change; yellow to red areas (higher Γ) show major change. The maps show the median value of the Γ parameter across all climate models, meaning at least half of the models agree on major change in the yellow to red areas, and at least half of the models are below the threshold for major change in the blue areas. (Credit: Ostberg et al., 2013)

The researchers state in the article that “nearly no area of the world is free” from the risk of climate change transforming landscapes substantially, unless mitigation limits warming to around 2 degrees Celsius above preindustrial levels.

Ecosystem changes could include boreal forests being transformed into temperate savannas, trees growing in the freezing Arctic tundra or even a dieback of some of the world’s rainforests. Such profound transformations of land ecosystems have the potential to affect food and water security, and hence impact human well-being just like sea level rise and direct damage from extreme weather events.

The new Earth System Dynamics study indicates that up to 86% of the remaining natural land ecosystems worldwide could be at risk of major change in a business-as-usual scenario (see note). This assumes that the global mean temperature will be 4 to 5 degrees warmer at the end of this century than in pre-industrial times — given many countries’ reluctance to commit to binding emissions cuts, such warming is not out of the question by 2100.

“The research shows there is a large difference in the risk of major ecosystem change depending on whether humankind continues with business as usual or if we opt for effective climate change mitigation,” Ostberg points out.

But even if the warming is limited to 2 degrees, some 20% of land ecosystems — particularly those at high altitudes and high latitudes — are at risk of moderate or major transformation, the team reveals.

The researchers studied over 150 climate scenarios, looking at ecosystem changes in nearly 20 different climate models for various degrees of global warming. “Our study is the most comprehensive and internally consistent analysis of the risk of major ecosystem change from climate change at the global scale,” says Wolfgang Lucht, also an author of the study and co-chair of the research domain Earth System Analysis at the Potsdam Institute for Climate Impact Research.

Few previous studies have looked into the global impact of rising temperatures on ecosystems because of how complex and interlinked these systems are. “Comprehensive theories and computer models of such complex systems and their dynamics up to the global scale do not exist.”

To get around this problem, the team measured simultaneous changes in the biogeochemistry of terrestrial vegetation and the relative abundance of different vegetation species. “Any significant change in the underlying biogeochemistry presents an ecological adaptation challenge, fundamentally destabilising our natural systems,” explains Ostberg.

The researchers defined a parameter to measure how far apart a future ecosystem under climate change would be from the present state. The parameter encompasses changes in variables such as the vegetation structure (from trees to grass, for example), the carbon stored in the soils and vegetation, and freshwater availability. “Our indicator of ecosystem change is able to measure the combined effect of changes in many ecosystem processes, instead of looking only at a single process,” says Ostberg.
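
Purely as an illustration (a hypothetical sketch, not the study’s actual Gamma metric, which is defined in Ostberg et al., 2013), an indicator of this kind can be thought of as an aggregate of normalised changes across several ecosystem variables; the variable names, scales, and simple averaging rule below are assumptions made for the example.

    # Illustrative sketch only: combine normalised changes in several
    # ecosystem variables into one change score, in the spirit of the
    # indicator described above. Names, scales and the averaging rule
    # are hypothetical, not taken from the paper.
    def change_indicator(present, future, scales):
        """Return a 0-1 score of how far `future` is from `present`.

        `present` and `future` map variable names (e.g. tree cover, soil
        carbon, runoff) to values; `scales` gives a typical magnitude
        used to normalise each difference.
        """
        diffs = [abs(future[k] - present[k]) / scales[k] for k in present]
        return min(sum(diffs) / len(diffs), 1.0)  # mean normalised change, capped at 1

    # Example grid cell: tree cover halves, soil carbon falls, runoff rises.
    present = {"tree_cover": 0.6, "soil_carbon": 12.0, "runoff": 300.0}
    future  = {"tree_cover": 0.3, "soil_carbon": 9.0,  "runoff": 360.0}
    scales  = {"tree_cover": 1.0, "soil_carbon": 12.0, "runoff": 300.0}
    print(change_indicator(present, future, scales))  # 0.25, i.e. moderate change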

He hopes the new results can help inform the ongoing negotiations on climate mitigation targets, “as well as planning adaptation to unavoidable change.”

Note

Even though up to 86% of land ecosystems are at risk if the global temperature increases by 5 degrees Celsius by 2100, it is unlikely that all of these areas will be affected, since that would require the worst-case scenario from every climate model to come true.

Journal Reference:

  1. S. Ostberg, W. Lucht, S. Schaphoff, D. Gerten. Critical impacts of global warming on land ecosystems. Earth System Dynamics, 2013; 4 (2): 347. DOI: 10.5194/esd-4-347-2013

Climate sceptics claim warming pause backs their view (BBC)

26 September 2013 Last updated at 00:47 GMT

By Matt McGrath, Environment correspondent, BBC News, The Netherlands

Hurricane Sandy. Come hell or… An increased likelihood of extreme weather events is one predicted outcome of global warming, but some dispute the scale of expected effects

In the run up to a key global warming report, those sceptical of mainstream opinion on climate change claim they are “winning” the argument.

They say a slowing of temperature rises in the past 15 years means the threat from climate change is exaggerated.

But a leading member of the UN’s panel on climate change said the views of sceptics were “wishful thinking”.

The pause in warming was a distraction, he said, from the growing scientific certainty about long-term impacts.

Prof Jean-Pascal van Ypersele spoke to BBC News ahead of the release of a six-yearly status report into global warming by the UN panel known as the Intergovernmental Panel on Climate Change, or IPCC.

Scientists and government representatives are currently meeting in Stockholm, Sweden, going through the dense, 31-page summary of the state of the physical science behind climate change.

When it is released on Friday, the report is likely to state with even greater certainty than before that the present-day, rapid warming of the planet is man-made.

Netherlands flood rescue simulationClimate change could profoundly impact the Netherlands, but sceptics remain influential there

But climate sceptics have focused their attention on the references to a pause or hiatus in the increase in global temperatures since 1998.

The sceptics believe that this slowdown is the most solid evidence yet that mainstream computer simulations of the Earth’s climate – climate models – are wrong.

These computer simulations are used to study the dynamics of the Earth’s climate and make projections about future temperature change.

“The sceptics now have a feeling of being on the winning side of the debate thanks to the pause,” said Marcel Crok, a Dutch author who accepts the idea that human activities warm the planet, but is sceptical about the scale of the effect.

“You are now starting to see a normalisation of climate science. Suddenly mainstream researchers, who all agree that greenhouse gases play a huge role, start to disagree about the cause of the pause.

“For me this is a relief, it is finally opening up again and this is good.”

The view that the sceptics have positively impacted the IPCC is supported by Prof Arthur Petersen, who is a member of the Dutch government team currently examining the report.

“The sceptics are good for the IPCC and the whole process is really flourishing because of this interaction over the past decades,” he told BBC News.

“Our best climate researchers are typically very close to really solid, sceptical scientists. In this sense scepticism is not necessarily a negative term.”

Others disagree.

Bart Verheggen is an atmospheric scientist and blogger who supports the mainstream view of global warming. He said that sceptics have discouraged an open scientific debate.

Dutch writer Marcel Crok is sceptical about the sensitivity of the atmosphere to carbon emissions

“When scientists start to notice that their science is being distorted in public by these people who say they are the champions of the scientific method, that could make mainstream researchers more defensive.

“Scientists probably think twice now about writing things down. They probably think twice about how this could be twisted by contrarians.”

Sensitive debate

In 2007, the IPCC defined the range for what is termed “equilibrium climate sensitivity”. The term refers to the likely range of warming that would eventually follow a doubling of CO2 concentrations in the atmosphere.

The panel’s last report said temperatures were likely to rise within the range of 2C to 4.5C with a best estimate of 3C.

The new report is believed to indicate a range of 1.5C to 4.5C with no best estimate indicated.
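
As rough context (a standard textbook simplification, not a figure taken from the IPCC summary itself), equilibrium warming is often written as a climate-sensitivity parameter lambda multiplied by a radiative forcing that grows with the logarithm of the CO2 concentration:

    \Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}, \qquad \Delta T_{\mathrm{eq}} = \lambda\,\Delta F

A doubling of CO2 (C = 2C_0) then corresponds to a forcing of roughly 3.7 W m^-2, so a climate sensitivity of 3C implies lambda of about 0.8C per W m^-2, while the new lower bound of 1.5C implies about 0.4C per W m^-2; the dispute described below is essentially over the size of lambda.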

Although that might not appear like much of a change, many sceptics believe it exposes a critical flaw.

“In the last year, we have seen several studies showing that climate sensitivity is actually much less than we thought for the last 30 years,” said Marcel Crok.

“And these studies indicate that our real climate shows a sensitivity of between 1.5C and 2C, but the climate models on which these future doom scenarios are based warm up three degrees under a doubling of CO2.”

But other researchers who are familiar with the text believe that the sceptics are reading too much into a single figure.

“Some of what the sceptics are saying is either wishful thinking or totally dishonest,” Prof van Ypersele, who is vice-chair of the IPCC, told BBC News.

“It is just a change in a lower border [of the range of temperature rise]. Even if this turns out to be the real sensitivity, instead of making the challenge extremely, extremely, extremely difficult to meet, it is only making it extremely, extremely difficult.

“Is that such a big change?”

Prof van Ypersele points out that many other aspects of the forthcoming report are likely to give greater certainty to the scale of impacts of a warming world. The predictions for sea level rise are likely to be considerably strengthened from 2007. There is also likely to be a clearer understanding of the state of sea ice.

Who are the climate sceptics?

Although there are only a small number of mainstream scientists who reject the established view on global warming, they are supported by a larger group of well-resourced bloggers and citizen scientists who pore over climate literature and data looking for evidence of flaws in the hypothesis.

There are many different shades of opinion in the sceptical orbit. Some, such as the group Principia Scientific, reject the “myth” of greenhouse gas warming.

There are also political sceptics, such as some members of the Republican party in the US, who argue that climate science is a hoax or a conspiracy.

But there are also sceptical bloggers such as Anthony Watts and Andrew Montford who accept the basic science that adding carbon to the atmosphere can affect the temperature. They contest mainstream findings on the sensitivity of the climate to carbon and the future impacts on temperature.

The Dutch approach to sceptics

With around 20% of the country below sea level, the Dutch have a keen interest in anything that might affect their environment, such as climate change.

But scepticism about the human influence on global warming has been growing in the Netherlands, according to research from the OECD.

In a country where consensus is a key word, the government has taken a more inclusive approach to climate dissenters. To that end, it has funded Marcel Crok to carry out a sceptical analysis of the IPCC report.

In an effort to build bridges between sceptics and the mainstream, it is also funding an initiative called climatedialogue.org, which serves as a platform for debate on the science of global warming.

IPCC climate report: humans ‘dominant cause’ of warming – and other articles (BBC)

27 September 2013 Last updated at 09:12 GMT

By Matt McGrath, Environment correspondent, BBC News, Stockholm

Climate change “threatens our planet, our only home”, warns Thomas Stocker, IPCC co-chair

A landmark report says scientists are 95% certain that humans are the “dominant cause” of global warming since the 1950s.

The report by the UN’s climate panel details the physical evidence behind climate change.

On the ground, in the air, in the oceans, global warming is “unequivocal”, it explained.

It adds that a pause in warming over the past 15 years is too short to reflect long-term trends.

The panel warns that continued emissions of greenhouse gases will cause further warming and changes in all aspects of the climate system.

To contain these changes will require “substantial and sustained reductions of greenhouse gas emissions”.

Infographic: projections are based on assumptions about how much greenhouse gas might be released

After a week of intense negotiations in the Swedish capital, the summary for policymakers on the physical science of global warming has finally been released.

The first part of an IPCC trilogy, due over the next 12 months, this dense, 36-page document is considered the most comprehensive statement on our understanding of the mechanics of a warming planet.

It states baldly that, since the 1950s, many of the observed changes in the climate system are “unprecedented over decades to millennia”.

Each of the last three decades has been successively warmer at the Earth’s surface, and warmer than any period since 1850, and probably warmer than any time in the past 1,400 years.

“Our assessment of the science finds that the atmosphere and ocean have warmed, the amount of snow and ice has diminished, the global mean sea level has risen and that concentrations of greenhouse gases have increased,” said Qin Dahe, co-chair of IPCC working group one, who produced the report.

Speaking at a news conference in the Swedish capital, Prof Thomas Stocker, another co-chair, said that climate change “challenges the two primary resources of humans and ecosystems, land and water. In short, it threatens our planet, our only home”.

Since 1950, the report’s authors say, humanity is clearly responsible for more than half of the observed increase in temperatures.

Dr Rajendra Pachauri said he was confident the report would convince the public on global climate change

But a so-called pause in the increase in temperatures in the period since 1998 is downplayed in the report. The scientists point out that this period began with a very hot El Nino year.

“Trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends,” the report says.

Prof Stocker added: “I’m afraid there is not a lot of public literature that allows us to delve deeper at the required depth of this emerging scientific question.

“For example, there are not sufficient observations of the uptake of heat, particularly into the deep ocean, that would be one of the possible mechanisms to explain this warming hiatus.”

“Likewise we have insufficient data to adequately assess the forcing over the last 10-15 years to establish a relationship between the causes of the warming.”

However, the report does alter a key figure from the 2007 study. The temperature range given for a doubling of CO2 in the atmosphere, called equilibrium climate sensitivity, was 2.0C to 4.5C in that report.

In the latest document, the range has been changed to 1.5C to 4.5C. The scientists say this reflects improved understanding, better temperature records and new estimates for the factors driving up temperatures.

In the summary for policymakers, the scientists say that sea level rise will proceed at a faster rate than we have experienced over the past 40 years. Waters are expected to rise, the document says, by between 26cm (at the low end) and 82cm (at the high end), depending on the greenhouse emissions path this century.

The scientists say ocean warming dominates the increase in energy stored in the climate system, accounting for 90% of energy accumulated between 1971 and 2010.

For the future, the report states that warming is projected to continue under all scenarios. Model simulations indicate that global surface temperature change by the end of the 21st Century is likely to exceed 1.5 degrees Celsius, relative to 1850.

Prof Sir Brian Hoskins, from Imperial College London, told BBC News: “We are performing a very dangerous experiment with our planet, and I don’t want my grandchildren to suffer the consequences of that experiment.”

What is the IPCC?

In its own words, the IPCC is there “to provide the world with a clear scientific view on the current state of knowledge in climate change and its potential environmental and socio-economic impacts”.

The offspring of two UN bodies, the World Meteorological Organization and the United Nations Environment Programme, it has issued four heavyweight assessment reports to date on the state of the climate.

These are commissioned by the governments of 195 countries, essentially the entire world. These reports are critical in informing the climate policies adopted by these governments.

The IPCC itself is a small organisation, run from Geneva with a full-time staff of 12. All the scientists involved with it contribute on a voluntary basis.

Carbon dioxide is reshaping the planet (IPS)

Environment

30/9/2013 – 08:08

by Stephen Leahy, IPS

The melting of the Orizaba glacier in Mexico is another consequence of global warming. Photo: Maurício Ramos/IPS

Nantes, France, 30/9/2013 – Greenland may end up turning green, as most of its enormous ice sheet is doomed to melt, the Intergovernmental Panel on Climate Change (IPCC) reported on 27 September. The 36-page summary presented in Stockholm by the IPCC, which operates under the umbrella of the United Nations (UN), includes a warning that there is a 20% chance of Greenland’s ice beginning an irreversible melt with just 0.2 degrees of additional warming.

That much additional warming is now a certainty. Even so, the final melting of Greenland’s ice would take a thousand years. “The new report is another wake-up call, telling us that we are in serious trouble and heading toward dangerous degrees of climate change,” said David Cadman, president of ICLEI, the only sustainable-cities network operating worldwide, which involves 1,200 local governments.

“The IPCC will be attacked by vested interests from the fossil fuel industry and its defenders. They will try to scare the public, claiming that taking action puts jobs and the economy at risk. That is not true. Quite the opposite,” Cadman told IPS. The Summary for Policymakers of the Working Group I report – The Physical Science Basis – clearly establishes that human beings are warming the planet, and it confirms studies dating back to 1997. Since the 1950s, each decade has been warmer than the previous one, the document says.

The full text will be released today and is the first of the four volumes (those of the three working groups plus the synthesis) of the IPCC’s Fifth Assessment Report, known as AR5. In the Northern Hemisphere, “temperatures between 1983 and 2012 were the warmest of the past 1,400 years,” noted Thomas Stocker, co-chair of Working Group I.

Responding to recent news reports about a “pause in warming”, Stocker said the climate system is dynamic: in recent years more heat has probably gone into the oceans, and the pace of surface temperature increases has slowed slightly. More than a hundred years ago, researchers demonstrated that carbon dioxide traps the sun’s heat.

The burning of fossil fuels, deforestation, agriculture and other human activities emit this gas into the atmosphere, where it remains almost forever. Larger volumes of carbon dioxide, in turn, add more heat, acting like an extra layer of insulation. More than 90% of this additional heat energy is absorbed by the oceans, according to the Summary. That explains why surface temperatures have not yet exceeded the global average increase of 0.8 degrees.

The Summary highlights that the decline in Arctic ice cover over the past three decades was the greatest of the past 1,450 years. Although the melt last boreal summer was smaller than the year before, the Arctic remains on course to be ice-free in the summer season by 2050, well ahead of what previous reports projected.

The IPCC report, a review of the available scientific research, drew on the work of 259 authors from 39 countries and received 54,677 comments. The previous assessment report, AR4, dates from 2007. The IPCC conducts no research of its own and is directed by 110 governments, which spent the past four days approving the text of the summary. “Every word in the 36 pages was debated. Some paragraphs were discussed for more than an hour,” Stocker said at a press conference in Stockholm. “No other scientific report has ever undergone such critical scrutiny,” he stressed.

The cautiously worded Summary details and confirms the observed impacts: rising temperatures, shifting rainfall patterns and climate extremes. It also confirms that these and other impacts will worsen as carbon dioxide emissions increase. Current emissions of this gas are tracking the worst-case scenario.

The AR5 summary says Greenland’s ice sheet lost an average of 177 billion tonnes of ice per year between 1992 and 2001. More recent studies show that ice loss has increased substantially since then. According to the report, there is a one-in-five chance of the Greenland ice sheet melting entirely if global temperatures rise by between 0.8 degrees and a little over one degree, as now seems inevitable.

One reason is that temperatures in the Arctic are rising almost three times faster than the global average. If the global average temperature rises by about four degrees, a melt would be triggered in Greenland that would raise sea level by up to seven metres. Even so, AR5 says sea level is not expected to rise by more than one metre this century.

Other scientists, among them James Hansen, former director of NASA’s Goddard Institute for Space Studies, believe the accelerated melting observed in the Arctic, Greenland, Antarctica and the glaciers is a sign that the sea could rise by several metres this century unless emissions are cut.

Even before the new IPCC report was made public, “climate change deniers” attacked and misrepresented it, trying to portray its conclusions as radical or extremist, said Charles Greene, professor of earth and atmospheric sciences at Cornell University in New York. Greene was referring to an orchestrated propaganda effort by fossil fuel industry players and organisations seeking to convince the public that global warming is not a matter of extreme urgency. “In fact, the IPCC has a long history of underestimating the impacts” of climate change, Greene said.

Although global action against warming is at a standstill, some cities are already cutting their carbon emissions. ICLEI members have committed to a 20% reduction by 2020 and 80% by 2050. Most governments are not taking the initiative, which clearly reveals the power and influence of the fossil fuel sector, Cadman noted. “Cities could do up to ten times more, but they simply don’t have the money,” he stressed.

Envolverde/IPS

IPCC maintains that global warming is unequivocal (IPS)

Environment

30/9/2013 – 08:01

by Fabíola Ortiz, IPS

Rising sea levels could flood several areas of Recife, in northeastern Brazil. Photo: Alejandro Arigón/IPS

Rio de Janeiro, Brazil, 30/9/2013 – Amid rumours that global warming has stalled over the past 15 years, the new report from the Intergovernmental Panel on Climate Change (IPCC) indicates that each of the last three decades has been successively warmer than any other since 1850. The Summary for Policymakers of the Working Group I report – The Physical Science Basis – was released on 27 September in Stockholm, Sweden.

The full, unedited text will be made public today and constitutes the first of the four volumes of the IPCC’s Fifth Assessment Report. Warming is “unequivocal”, the IPCC states. “The atmosphere and ocean are warming, the amount of snow and ice is diminishing, sea level is rising and greenhouse gas concentrations are increasing,” the study highlights.

For the Brazilian climate expert Carlos Nobre, one of the lead authors of the Fourth Assessment Report, the new report gives “no reason for optimism”. “Each of the last three decades was successively warmer than any other since 1850. In the Northern Hemisphere, the period 1983-2012 probably represents the warmest 30 years of the past 1,400,” says the new summary. The data “on average land and ocean surface temperatures, calculated as a linear trend, show a warming of 0.85 degrees over the period 1880-2012,” Nobre added.

On the supposed pause in warming, the IPCC states that “the warming rate of the past 15 years (1998-2012) – which was 0.05 degrees per decade and which began with a strong El Niño (the warm phase of the Southern Oscillation) – is lower than that calculated for 1951-2012, which was 0.12 degrees per decade”.

However, it argues, “due to natural variability, trends based on short records are very sensitive to the start and end dates and do not in general reflect long-term climate trends”. In short, the document says, “it is virtually certain (99% to 100% certainty) that the troposphere has warmed since the mid-20th century”.

Nobre told IPS that “the summary looks in more detail at what is changing and reduces the uncertainties thanks to improved scientific knowledge”. It also confirms that climate change stems mainly from human actions, noted Nobre, who is secretary of research and development at the Ministry of Science and Technology.

Humanity will have to decide to leave behind a large share of fossil fuels – responsible for emitting the gases that warm the atmosphere – and switch to renewable forms of energy, Nobre warned. Technically it is possible; what is missing is a conscious choice by all countries, he stressed. “This transition has a cost, but one that is ever smaller than what was estimated 15 years ago. The problem is not the technology but the political decision,” he added.

For Carlos Rittl, coordinator of the climate change and energy programme at WWF-Brazil (World Wide Fund for Nature), “although warming has shown an apparent stabilisation in terms of average temperature, the hottest years on record came in the last decade. That does not leave us in a comfortable position.”

The IPCC report, a review of the available scientific research, drew on the work of 259 authors from 39 countries and received 54,677 comments. Its release came amid a renewed media wave of doubts and rumours about the existence of global warming.

Based on different greenhouse gas emission pathways, the summary projects a temperature rise of more than 1.5 degrees by the end of this century, relative to the 1850-1900 period, in all scenarios except the one with the lowest gas concentrations. It also states that the increase will probably exceed two degrees by 2100 in the two highest-emission scenarios. The previous report, from 2007, projected an increase of two degrees in the best case and of up to six in the worst.

After the failure of the intergovernmental conference in Copenhagen in December 2009, when countries were unable to reach an agreement to reduce climate pollution, questioning of the IPCC intensified, in particular over an erroneous estimate of the melting of the Himalayan glaciers. That information “was used irresponsibly by those who deny global warming,” Rittl warned.

Six years on, there is more and better scientific evidence to estimate, for example, how much melting ice contributes to sea level rise. By the end of 2100 the average rise in sea level will range between 24 centimetres and 63 centimetres, under the best- and worst-case scenarios of atmospheric pollution. Rainfall “will increase in the wettest regions and decrease in those where rainfall is scarce,” explained Rittl, who holds a doctorate in ecology.

In Brazil the examples are the arid Northeast region and the wetter South and Southeast. Rainfall will increase by between 1% and 3% in southern zones, depending on the pace of warming, while the arid areas will see more severe drought patterns. All the trends confirmed by the report are “alarming,” Rittl said. “We humans are responsible for these changes, which will worsen the current situation, in which hundreds of millions of people already suffer from a lack of water, food and adequate conditions for survival,” he stressed.

The first volume of the IPCC’s Fifth Assessment Report comes two months before the 19th Conference of the Parties to the United Nations Framework Convention on Climate Change takes place in Warsaw, Poland. At that meeting a global effort must be made to secure the transition to a low-carbon economy, said Nobre. “This report is a reality check,” he stressed.

In his view, Brazil is one of the “few good examples”, having managed to cut its greenhouse gas emissions by 38.4% between 2005 and 2010 thanks to the drop in Amazon deforestation. “Brazil adopted voluntary commitments, but globally there is no ambitious agreement,” Nobre explained. The longer it takes to “adopt concrete actions, the harder and less likely it will be to achieve a sustainable trajectory of accommodation and adaptation to climate change,” he emphasised.

Rittl believes governments should treat climate change as a national challenge for development, social inclusion and poverty reduction. “We need to handle the risks and opportunities very responsibly,” he concluded.

Envolverde/IPS