Tag archive: Geoengineering

‘Superglue’ for the atmosphere: How sulfuric acid increases cloud formation (Science Daily)

Date: October 8, 2014

Source: Goethe-Universität Frankfurt am Main

Summary: It has been known for several years that sulfuric acid contributes to the formation of tiny aerosol particles, which play an important role in the formation of clouds. A new study shows that dimethylamine can tremendously enhance new particle formation. The formation of neutral (i.e. uncharged) nucleating clusters of sulfuric acid and dimethylamine was observed for the first time.

Clouds. Credit: Copyright Michele Hogan

It has been known for several years that sulfuric acid contributes to the formation of tiny aerosol particles, which play an important role in the formation of clouds. The new study by Kürten et al. shows that dimethylamine can tremendously enhance new particle formation. The formation of neutral (i.e. uncharged) nucleating clusters of sulfuric acid and dimethylamine was observed for the first time.

Previously, it was only possible to detect neutral clusters containing up to two sulfuric acid molecules. However, in the present study molecular clusters containing up to 14 sulfuric acid and 16 dimethylamine molecules were detected and their growth by attachment of individual molecules was observed in real time, starting from just one molecule. Moreover, these measurements were made at concentrations of sulfuric acid and dimethylamine corresponding to atmospheric levels (less than 1 molecule of sulfuric acid per 1 × 10¹³ molecules of air).
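To put that mixing ratio in more familiar terms, here is a back-of-envelope conversion. The air number density used below is a standard sea-level value, not a figure from the study:

```python
# Convert the quoted mixing ratio into an absolute number density.
# Assumption (standard value, not from the study): air near sea level
# contains roughly 2.5e19 molecules per cubic centimetre.
AIR_NUMBER_DENSITY = 2.5e19  # molecules of air per cm^3
MIXING_RATIO = 1e-13         # < 1 H2SO4 molecule per 1e13 air molecules

h2so4_per_cm3 = AIR_NUMBER_DENSITY * MIXING_RATIO
print(f"~{h2so4_per_cm3:.1e} H2SO4 molecules per cm^3")  # ~2.5e+06
```

Even at only a few million molecules per cubic centimetre, the study found that these clusters form and grow efficiently.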

The capability of sulfuric acid molecules together with water and ammonia to form clusters and particles has been recognized for several years. However, clusters which form in this manner can vaporize under the conditions which exist in the atmosphere. In contrast, the system of sulfuric acid and dimethylamine forms particles much more efficiently, because even the smallest clusters are essentially stable against evaporation. In this respect dimethylamine can act as “superglue”: when interacting with sulfuric acid, every collision between a cluster and a sulfuric acid molecule bonds them together irreversibly. Both sulfuric acid and amines in the present-day atmosphere have mainly anthropogenic sources.

Sulfuric acid is derived mainly from the oxidation of sulfur dioxide, while amines stem, for example, from animal husbandry. The method used to measure the neutral clusters combines a mass spectrometer with a chemical ionization source and was developed by the University of Frankfurt and the University of Helsinki. The measurements were made by an international collaboration at the CLOUD (Cosmics Leaving OUtdoor Droplets) chamber at CERN (European Organization for Nuclear Research).

The results allow for very detailed insight into a chemical system which could be relevant for atmospheric particle formation. Aerosol particles influence Earth’s climate through cloud formation: Clouds can only form if so-called cloud condensation nuclei (CCN) are present, which act as seeds for condensing water molecules. Globally about half the CCN originate from a secondary process which involves the formation of small clusters and particles in the very first step followed by growth to sizes of at least 50 nanometers.

The observed process of particle formation from sulfuric acid and dimethylamine could also be relevant for the formation of CCN. A high concentration of CCN generally leads to the formation of clouds with a high concentration of small droplets; whereas fewer CCN lead to clouds with few large droplets. Earth’s radiation budget, climate as well as precipitation patterns can be influenced in this manner. The deployed method will also open a new window for future measurements of particle formation in other chemical systems.


Journal Reference:

  1. A. Kürten, T. Jokinen, M. Simon, M. Sipilä, N. Sarnela, H. Junninen, A. Adamov, J. Almeida, A. Amorim, F. Bianchi, M. Breitenlechner, J. Dommen, N. M. Donahue, J. Duplissy, S. Ehrhart, R. C. Flagan, A. Franchin, J. Hakala, A. Hansel, M. Heinritzi, M. Hutterli, J. Kangasluoma, J. Kirkby, A. Laaksonen, K. Lehtipalo, M. Leiminger, V. Makhmutov, S. Mathot, A. Onnela, T. Petäjä, A. P. Praplan, F. Riccobono, M. P. Rissanen, L. Rondo, S. Schobesberger, J. H. Seinfeld, G. Steiner, A. Tomé, J. Tröstl, P. M. Winkler, C. Williamson, D. Wimmer, P. Ye, U. Baltensperger, K. S. Carslaw, M. Kulmala, D. R. Worsnop, J. Curtius. Neutral molecular cluster formation of sulfuric acid-dimethylamine observed in real time under atmospheric conditions. Proceedings of the National Academy of Sciences, 2014; DOI: 10.1073/pnas.1404853111

Can Carbon Capture Technology Be Part of the Climate Solution? (Environment 360)

08 SEP 2014

Some scientists and analysts are touting carbon capture and storage as a necessary tool for avoiding catastrophic climate change. But critics of the technology regard it as simply another way of perpetuating a reliance on fossil fuels.

By David Biello

For more than 40 years, companies have been drilling for carbon dioxide in southwestern Colorado. Time and geology had conspired to trap an enormous bubble of CO2 that drillers tapped, and a pipeline was built to carry the greenhouse gas all the way to the oil fields of west Texas. When scoured with the CO2, these aged wells gush forth more oil, and much of the CO2 stays permanently trapped in its new home underneath Texas.

More recently, drillers have tapped the Jackson Dome, nearly three miles beneath Jackson, Mississippi, to get at a trapped pocket of CO2 for similar use. It’s called enhanced oil recovery. And now there’s a new source of CO2 coming online in Mississippi — a power plant that burns gasified coal in Kemper County, due to be churning out electricity and captured CO2 by 2015 and sending it via a 60-mile pipeline to oil fields in the southern part of the state.

Kemper County power plant near Meridian, Mississippi. Credit: Gary Tramontina/Bloomberg/Getty Images. This power plant being built in Kemper County, Mississippi, would be the first in the U.S. to capture its own carbon emissions.

The Mississippi project uses emissions from burning a fossil fuel to help bring more fossil fuels out of the ground — a less than ideal solution to the problem of climate change. But enhanced oil recovery may prove an important step in making more widely available a technology that could be critical for combating climate change — CO2 capture and storage, or CCS.

As the use of coal continues to grow globally — coal consumption is expected to double from 2000 to 2020 largely due to demand in China and India — some scientists believe the widespread adoption of CCS technology could be key to any hope of limiting global average temperature increase to 2 degrees Celsius, the threshold for avoiding major climate disruption. After all, coal is the dirtiest fossil fuel.
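As a quick check of the scale of that projection, the doubling over 20 years implies a steady compound growth rate; the one-line arithmetic below is a sketch (the 20-year window comes from the text, the compounding assumption is mine):

```python
# Average annual growth rate implied by a doubling of coal use over 2000-2020.
years = 20
annual_growth = 2 ** (1 / years) - 1
print(f"~{annual_growth:.1%} per year")  # ~3.5% per year
```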

“Fossil fuels aren’t disappearing anytime soon,” says John Thompson, director of the Fossil Fuel Transition Project for the non-profit Clean Air Task Force. “If we’re serious about preventing global warming, we’re going to have to find a way to use those fuels without the carbon going into the atmosphere. It seems inconceivable that we can do that without a significant amount of carbon capture and storage. The question is how do we deploy it in time and in a way that’s cost-effective across many nations?”

The biggest challenge is one of scale, as the potential demand from aging oil fields for CO2 produced from coal-fired power plants is enormous. Thompson estimates that enhanced oil recovery could ultimately consume 33 billion metric tons of CO2 in total, or the equivalent of all the CO2 pollution from all U.S. power plants for several decades. Thompson and other analysts view such large-scale enhanced oil recovery as an important phase in the deployment of CCS technology while replacements for fossil fuels are developed. 

“In the short term, in order to develop the technology, we probably will enable more use of hydrocarbons, which makes environmentally conscious people uncomfortable,” says Chris Jones, a chemical engineer working on CO2 capture at the Georgia Institute of Technology. “But it’s a necessary thing we have to do to get the technology out there and learn how to make it more efficient.”

At the same time, CO2 capture and storage is not as simple as locking away carbon deep underground. As Jones notes, the process will perpetuate fossil fuel use and may prove a wash as far as keeping global warming pollution out of the atmosphere. Then there are the risks of human-caused earthquakes as a result of pumping high-pressure liquids underground or accidental releases as all that CO2 finds its way back to the atmosphere.

“Any solution that doesn’t take carbon from the air is, in principle, not sustainable,” says physicist Peter Eisenberger of the Lamont-Doherty Earth Observatory at Columbia University, who is working on methods to pull CO2 out of the sky rather than smokestacks. He notes that merely avoiding CO2 pollution is not enough, and that doing so will create political powerhouses—heirs to the energy companies of today—that will entrench such unsustainable technologies. “Why spend so much time and energy and ingenuity coming up with solutions that are not really solutions?” he adds.

But the expansion of enhanced oil recovery remains the main front in an intensifying effort to more broadly adopt CCS technology and reduce its price, which is currently the major impediment to its deployment. The need for CO2 storage goes beyond China and the U.S., the world’s two largest polluters. Worldwide, more than 35 billion metric tons of CO2 are being dumped into the atmosphere annually, almost all from the burning of coal, oil, and natural gas. To restrain global warming to the 2-degree C target, more than 100 CCS projects eliminating 270 million metric tons of CO2 pollution annually would have to be built by 2020, according to the International Energy Agency. But only 60 are currently planned or proposed, and just 21 of those are actually built or in operation.
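The IEA figures quoted above can be combined to show the per-project scale involved. This is arithmetic on the article's own numbers; nothing new is assumed:

```python
# Scale of the IEA's 2020 CCS target, using the figures quoted in the text.
global_emissions_t = 35e9   # >35 billion metric tons of CO2 emitted per year
ccs_target_t = 270e6        # 270 million metric tons captured per year by 2020
projects_needed = 100       # "more than 100 CCS projects"

per_project_t = ccs_target_t / projects_needed
share = ccs_target_t / global_emissions_t
print(f"~{per_project_t / 1e6:.1f} Mt CO2 per project per year")  # ~2.7
print(f"~{share:.1%} of annual global emissions")                 # ~0.8%
```

Even the full 2020 target would address under one percent of annual global emissions, which underscores why analysts frame CCS as one tool among many.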

Those include the Kemper facility and other coal-fired power plants, but also a CCS project under construction at an ethanol refinery in Illinois. A group led by Royal Dutch Shell is building technology to capture the CO2 pollution from tar sands operations in Alberta, Canada, and in Saskatchewan, a $1.2 billion project to retrofit a large coal-fired power plant with CCS technology is expected to open later this year. And there are 34 proposed or operating CCS projects outside of North America, the majority in Asia and Australia. But European countries like Germany have rolled back plans to adopt CCS because of public opposition, dropping the number of European projects from 14 planned in 2011 to just five as of 2014, according to the Global CCS Institute. 

That might conflict with the European Union’s avowed intention to help combat climate change. The U.N. Intergovernmental Panel on Climate Change suggested earlier this year that carbon capture and storage at power plants could prove a critical part of any serious effort to restrain global warming. “We depend on removing large amounts of CO2 from the atmosphere in order to bring concentrations well below 450 [parts-per-million] in 2100,” said Ottmar Edenhofer, an economist at the Potsdam Institute for Climate Impact Research and co-chair of the IPCC’s third working group, which was tasked with figuring out ways to mitigate climate change. Ultimately, he said, keeping a global temperature rise to 2 degrees without any CCS would require phasing out fossil fuels entirely within “the next few decades.”

Yet, from 2007 to 2013, global coal consumption increased from 6.4 billion to 7.4 billion metric tons, and coal use continues to rise. Although renewable energy sources like solar and wind are growing rapidly, they are doing so from a very small base and many energy analysts argue that it will be decades before they can supplant fossil fuels. The time and expense of building nuclear power plants — and public opposition — has also hampered that low-carbon technology’s ability to replace coal burning. And biofuels or electric cars remain a long way from supplanting oil for transportation.

The Obama administration hopes to encourage the development of CO2 capture and use or storage. New rules from the Environmental Protection Agency requiring a 30 percent cut in power plant emissions by 2030 may spur development of CCS technologies. Already, NRG Energy has partnered with a Japanese firm to add CO2 capture to a coal-fired power plant near Houston and use a pipeline to send the captured pollution to nearby oilfields. Dubbed Petra Nova, the $1 billion CCS project is the latest in a series of 19 CO2 capture projects underway or proposed in the U.S. 

The bulk of such CO2 capture and storage experiments may soon shift to China, the world’s largest emitter of CO2. The Chinese and U.S. governments have a cooperative agreement to develop the technology, including partnerships between Chinese power companies like Huaneng and American corporations such as Summit Power, which is developing a CCS power plant in west Texas. In China, the long-awaited GreenGen power plant in Tianjin is still under construction and will capture CO2 for China’s own efforts at enhanced oil recovery. But going forward, the expense of CCS may make the technology even more unpalatable in a developing country like China, which also has plans to turn coal into liquid fuels — a process that, from a climate perspective, is even worse than burning the dirty rock directly.

The technology to capture CO2 is relatively simple and has been in use since the 1930s. For example, CO2 can be captured from the smokestacks of coal plants, natural gas plants, and even factories by routing the flue gases through an amine chemical bath, which binds the CO2. The chemical is then heated to release the CO2. The CO2 is pressurized to convert it to a liquid, and the liquid is then pumped via pipeline to an appropriate storage site. Those include underground geological formations, such as sandstones or saline aquifers, but also old oil fields, where the CO2 replaces the oil in small pores in the rock left behind by conventional methods and forces it up to the surface. Six percent of U.S. oil already comes from using enhanced oil recovery, a number that will increase, according to the U.S. Energy Information Administration.
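The capture chain described above (amine scrubbing, heat-driven release, compression, pipeline transport) can be sketched as a simple mass balance. The 90 percent capture fraction below is an illustrative placeholder, not a figure from the article:

```python
# Minimal sketch of the amine-scrubbing chain described above.
# The capture fraction is an illustrative assumption, not article data.
def amine_capture(flue_gas_co2_t, capture_fraction=0.9):
    """Flue gas passes through an amine bath that binds CO2;
    heating the amine then releases a concentrated CO2 stream."""
    return flue_gas_co2_t * capture_fraction

def compress_for_pipeline(co2_t):
    """Pressurisation converts the released CO2 to a liquid for transport."""
    return {"phase": "liquid", "tonnes": co2_t}

captured_t = amine_capture(1_000_000)        # 1 Mt of flue-gas CO2 per year
shipment = compress_for_pipeline(captured_t)
print(shipment)  # {'phase': 'liquid', 'tonnes': 900000.0}
```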

Still, the economic and technological challenges facing CCS are daunting. Much-heralded projects like the CO2 capture and storage demonstration at the Mountaineer Power Plant in West Virginia were abandoned because no one wanted to pay for it. The hardware sits unused next to the hulking power plant’s smokestacks and cooling towers. 

The ultimate challenge is that capturing CO2 from a smokestack costs more than simply dumping it into the atmosphere. Analysts say the simplest way to encourage less pollution and more CO2 capture would be to charge for the privilege of emitting CO2 by imposing a tax on carbon emissions. A price on CO2, if high enough, might make capturing the greenhouse gas look cheap.

Even if that policy change happens, the problem of storing all that CO2 remains, including concerns that the CO2 could escape back into the atmosphere or cause earthquakes. In Algeria, a test to store nearly 4 million metric tons of injected CO2 underground was halted after the gas raised the overlying rock and fractured it. Concerns over such induced seismicity or accidental releases of CO2 have blocked CCS plans in Europe, as have concerns over how to ensure the stored CO2 stays put for millennia.

But storing CO2 underground can work, as Norway’s Sleipner project in the North Sea has demonstrated. At Sleipner, which started capturing and storing CO2 in 1996, more than 16 million metric tons of CO2 have been put in an undersea sandstone formation; the project is funded by Norway’s carbon tax. And around the world, the potential storage resource is gargantuan. The U.S. alone has an estimated 4 trillion metric tons of CO2 storage capacity in the form of porous sandstones or saltwater aquifers, according to the U.S. Department of Energy.
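To see how gargantuan that storage estimate is, divide it by the global emissions figure given earlier in the article; both numbers come from the text:

```python
# How long U.S. storage capacity alone could absorb the world's CO2 output.
us_storage_capacity_t = 4e12  # 4 trillion metric tons (U.S. DOE estimate)
global_emissions_t = 35e9     # annual worldwide CO2 emissions

years_of_capacity = us_storage_capacity_t / global_emissions_t
print(f"~{years_of_capacity:.0f} years of total global emissions")  # ~114
```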

Scientists at Columbia’s Lamont-Doherty Earth Observatory and elsewhere are investigating just how vast the storage potential under the ocean could be. David Goldberg, a marine geophysicist at Lamont, proposes that liquid CO2 could be pumped offshore and injected into the ubiquitous basalt formations found off many of the world’s coastlines. When mixed with water, the CO2 leaches metals out of the basalt and forms a carbonate chalk, Goldberg explains. 

“The goal of the whole CCS exercise is to take CO2, which is volatile, and put it in solid form where it will stay locked away forever,” he adds. Goldberg has calculated that just one such ridge site that runs the north-south length of the Atlantic Ocean could theoretically store all of humanity’s excess CO2 emissions to date. “The magic of being offshore is that you are away from people and away from property.”

There is also basalt on land. In an experiment in Iceland, more than 80 percent of the injected CO2 interacted with the surrounding basalt and converted to rock in less than a year. A similar experiment in Washington State achieved similar results.

In the end, getting off fossil fuels entirely is the only way to control CO2 pollution. But until that happens, CCS could be vital to stave off catastrophic climate change. “Ultimately, we need a thermostat on this planet,” says Klaus Lackner, a Columbia University physicist who is working on pulling the greenhouse gas directly out of the air rather than capturing it from smokestacks. “And we need to control the CO2.”

Correction, September 9, 2014: Previous versions of this article misstated the amount of CO2 storage capacity in porous sandstones or saltwater aquifers in the U.S.; it is 4 trillion metric tons.

Clues to trapping carbon dioxide in rock: Calcium carbonate takes multiple, simultaneous roads to different minerals (Science Daily)

Date: September 4, 2014

Source: Pacific Northwest National Laboratory

Summary: Researchers used a powerful microscope that allows them to see the birth of calcium carbonate crystals in real time, giving them a peek at how different calcium carbonate crystals form.


An aragonite crystal — with its characteristic “sheaf of wheat” look — consumed a particle of amorphous calcium carbonate as it formed. Credit: Nielsen et al. 2014/Science

One of the most important molecules on Earth, calcium carbonate crystallizes into chalk, shells and minerals the world over. In a study led by the Department of Energy’s Pacific Northwest National Laboratory, researchers used a powerful microscope that allows them to see the birth of crystals in real time, giving them a peek at how different calcium carbonate crystals form, the team reports in the September 5 issue of Science.

The results might help scientists understand how to lock carbon dioxide out of the atmosphere as well as how to better reconstruct ancient climates.

“Carbonates are most important for what they represent, interactions between biology and Earth,” said lead researcher James De Yoreo, a materials scientist at PNNL. “For a decade, we’ve been studying the formation pathways of carbonates using high-powered microscopes, but we hadn’t had the tools to watch the crystals form in real time. Now we know the pathways are far more complicated than envisioned in the models established in the twentieth century.”

Earth’s Reserve

Calcium carbonate is the largest reservoir of carbon on the planet. It is found in rocks the world over, shells of both land- and water-dwelling creatures, and pearls, coral, marble and limestone. When carbon resides within calcium carbonate, it is not hanging out in the atmosphere as carbon dioxide, warming the world. Understanding how calcium carbonate turns into various minerals could help scientists control its formation to keep carbon dioxide from getting into the atmosphere.

Calcium carbonate deposits also contain a record of Earth’s history. Researchers reconstructing ancient climates delve into the mineral for a record of temperature and atmospheric composition, environmental conditions and the state of the ocean at the time those minerals formed. A better understanding of its formation pathways will likely provide insights into those events.

To get a handle on mineral formation, researchers at PNNL, the University of California, Berkeley, and Lawrence Berkeley National Laboratory examined the earliest step to becoming a mineral, called nucleation. In nucleation, molecules assemble into a tiny crystal that then grows with great speed. Nucleation has been difficult to study because it happens suddenly and unpredictably, so the scientists needed a microscope that could watch the process in real time.

Come to Order

In the 20th century, researchers established a theory that crystals formed in an orderly fashion. Once the ordered nucleus formed, more molecules added to the crystal, growing the mineral but not changing its structure. Recently, however, scientists have wondered if the process might be more complicated, with other things contributing to mineral formation. For example, in previous experiments they’ve seen forms of calcium carbonate that appear to be dense liquids that could be sources for minerals.

Researchers have also wondered whether calcite forms from less stable varieties or directly from calcium and carbonate dissolved in the liquid. Aragonite and vaterite are calcium carbonate minerals with slightly different crystal architectures than calcite and could represent a step in calcite’s formation. A fourth form, amorphous calcium carbonate (ACC), which could be liquid or solid, might also be a reservoir for sprouting minerals.

To find out, the team created a miniature lab under a transmission electron microscope at the Molecular Foundry, a DOE Office of Science User Facility at LBNL. In this miniature lab, they mixed sodium bicarbonate (used to make club soda) and calcium chloride (similar to table salt) in water. At high enough concentrations, crystals grew. Videos of nucleating and growing crystals recorded what happened [URLs to come].

Morphing Minerals

The videos revealed that mineral growth took many pathways. Some crystals formed through a two-step process. For example, droplet-like particles of ACC formed, then crystals of aragonite or vaterite appeared on the surface of the droplets. As the new crystals formed, they consumed the calcium carbonate within the drop on which they nucleated.

Other crystals formed directly from the solution, appearing by themselves far away from any ACC particles. Multiple forms often nucleated in a single experiment — at least one calcite crystal formed on top of an aragonite crystal while vaterite crystals grew nearby.

What the team didn’t see in and among the many options, however, was calcite forming from ACC even though researchers widely expect it to happen. Whether that means it never does, De Yoreo can’t say for certain. But after looking at hundreds of nucleation events, he said it is a very unlikely event.

“This is the first time we have directly visualized the formation process,” said De Yoreo. “We observed many pathways happening simultaneously. And they happened randomly. We were never able to predict what was going to come up next. In order to control the process, we’d need to introduce some kind of template that can direct which crystal forms and where.”

In future work, De Yoreo and colleagues plan to investigate how living organisms control the nucleation process to build their shells and pearls. Biological organisms keep a store of mineral components in their cells and have evolved ways to make nucleation happen when and where needed. The team is curious to know how they use cellular molecules to achieve this control.

This work was supported by the Department of Energy Office of Science.

 

Journal Reference:

  1. Michael H. Nielsen, Shaul Aloni, and James J. De Yoreo. In Situ TEM Imaging of CaCO3 Nucleation Reveals Coexistence of Direct and Indirect Pathways. Science, September 5, 2014. DOI: 10.1126/science.1254051

Global warming pioneer calls for carbon dioxide to be taken from atmosphere and stored underground (Science Daily)

Date: August 28, 2014

Source: European Association of Geochemistry

Summary: Wally Broecker, the first person to alert the world to global warming, has called for atmospheric carbon dioxide to be captured and stored underground.


Wally Broecker, the first person to alert the world to global warming, has called for atmospheric CO2 to be captured and stored underground. He says that carbon capture, combined with limits on fossil fuel emissions, is the best way to avoid global warming getting out of control over the next fifty years. Professor Broecker (Columbia University, New York) made the call during his presentation to the International Carbon Conference in Reykjavik, Iceland, where 150 scientists are meeting to discuss carbon capture and storage.

He was presenting an analysis which showed that the world has been cooling very slowly, over the last 51 million years, but that human activity is causing a rise in temperature which will lead to problems over the next 100,000 years.

“We have painted ourselves into a tight corner. We can’t reduce our reliance on fossil fuels quickly enough, so we need to look at alternatives.

“One of the best ways to deal with this is likely to be carbon capture — in other words, putting the carbon back where it came from, underground. There has been great progress in capturing carbon from industrial processes, but to really make a difference we need to begin to capture atmospheric CO2. Ideally, we could reach a stage where we could control the levels of CO2 in the atmosphere, like you control your central heating. Continually increasing CO2 levels means that we will need to actively manage CO2 levels in the environment, not just stop more being produced. The technology is proven, it just needs to be brought to a stage where it can be implemented.”

Wally Broecker was speaking at the International Carbon Conference in Reykjavik, where 150 scientists are meeting to discuss how best CO2 can be removed from the atmosphere as part of a programme to reduce global warming.

Meeting co-convener Professor Eric Oelkers (University College London and University of Toulouse) commented: “Capture is now at a crossroads; we have proven methods to store carbon in the Earth but are limited in our ability to capture this carbon directly from the atmosphere. We are very good at capturing carbon from factories and power stations, but because roughly two-thirds of our carbon originates from disperse sources, implementing direct air capture is key to solving this global challenge.”

European Association of Geochemistry. “Global warming pioneer calls for carbon dioxide to be taken from atmosphere and stored underground.” ScienceDaily. ScienceDaily, 28 August 2014. <www.sciencedaily.com/releases/2014/08/140828110915.htm>.

Carbon dioxide ‘sponge’ could ease transition to cleaner energy (Science Daily)

Date: August 10, 2014

Source: American Chemical Society (ACS)

Summary: A plastic sponge that sops up the greenhouse gas carbon dioxide might ease our transition away from polluting fossil fuels to new energy sources like hydrogen. A relative of food container plastics could play a role in President Obama’s plan to cut carbon dioxide emissions. The material might also someday be integrated into power plant smokestacks.


Plastic that soaks up carbon dioxide could someday be used in plant smokestacks.
Credit: American Chemical Society

A sponge-like plastic that sops up the greenhouse gas carbon dioxide (CO2) might ease our transition away from polluting fossil fuels and toward new energy sources, such as hydrogen. The material — a relative of the plastics used in food containers — could play a role in President Obama’s plan to cut CO2 emissions 30 percent by 2030, and could also be integrated into power plant smokestacks in the future.

The report on the material is one of nearly 12,000 presentations at the 248th National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest scientific society, taking place here through Thursday.

“The key point is that this polymer is stable, it’s cheap, and it adsorbs CO2 extremely well. It’s geared toward function in a real-world environment,” says Andrew Cooper, Ph.D. “In a future landscape where fuel-cell technology is used, this adsorbent could work toward zero-emission technology.”

CO2 adsorbents are most commonly used to remove the greenhouse gas pollutant from smokestacks at power plants where fossil fuels like coal or gas are burned. However, Cooper and his team intend the adsorbent, a microporous organic polymer, for a different application — one that could lead to reduced pollution.

The new material would be a part of an emerging technology called an integrated gasification combined cycle (IGCC), which can convert fossil fuels into hydrogen gas. Hydrogen holds great promise for use in fuel-cell cars and electricity generation because it produces almost no pollution. IGCC is a bridging technology that is intended to jump-start the hydrogen economy, or the transition to hydrogen fuel, while still using the existing fossil-fuel infrastructure. But the IGCC process yields a mixture of hydrogen and CO2 gas, which must be separated.

Cooper, who is at the University of Liverpool, says that the sponge works best under the high pressures intrinsic to the IGCC process. Just like a kitchen sponge swells when it takes on water, the adsorbent swells slightly when it soaks up CO2 in the tiny spaces between its molecules. When the pressure drops, he explains, the adsorbent deflates and releases the CO2, which they can then collect for storage or convert into useful carbon compounds.
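This pressure-swing behaviour can be illustrated with a Langmuir adsorption isotherm, a generic textbook model; the parameters below are invented for illustration and are not measurements of Cooper's polymer:

```python
# Illustrative pressure-swing adsorption using a Langmuir isotherm.
# q_max and k are invented placeholders, not data for Cooper's polymer.
def langmuir_uptake(p_bar, q_max=10.0, k=0.2):
    """CO2 uptake (mmol per gram) at CO2 partial pressure p_bar (bar)."""
    return q_max * k * p_bar / (1 + k * p_bar)

uptake_high = langmuir_uptake(30.0)  # high pressure, as inside an IGCC stream
uptake_low = langmuir_uptake(1.0)    # after the pressure drops
released = uptake_high - uptake_low  # CO2 recovered per swing cycle
print(f"{uptake_high:.1f} mmol/g adsorbed, {released:.1f} mmol/g released per swing")
```

The point the model makes is the one Cooper describes: loading is high at IGCC pressures, and most of the gas comes back out simply by letting the pressure fall.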

The material, which is a brown, sand-like powder, is made by linking together many small carbon-based molecules into a network. Cooper explains that the idea to use this structure was inspired by polystyrene, a plastic used in styrofoam and other packaging material. Polystyrene can adsorb small amounts of CO2 by the same swelling action.

One advantage of using polymers is that they tend to be very stable. The material can even withstand being boiled in acid, proving it should tolerate the harsh conditions in power plants where CO2 adsorbents are needed. Other CO2 scrubbers — whether made from plastics or metals or in liquid form — do not always hold up so well, he says. Another advantage of the new adsorbent is its ability to adsorb CO2 without also taking on water vapor, which can clog up other materials and make them less effective. Its low cost also makes the sponge polymer attractive. “Compared to many other adsorbents, they’re cheap,” Cooper says, mostly because the carbon molecules used to make them are inexpensive. “And in principle, they’re highly reusable and have long lifetimes because they’re very robust.”

Cooper also will describe ways to adapt his microporous polymer for use in smokestacks and other exhaust streams. He explains that it is relatively simple to embed the spongy polymers in the kinds of membranes already being evaluated to remove CO2 from power plant exhaust, for instance. Combining two types of scrubbers could make much better adsorbents by harnessing the strengths of each, he explains.

The research was funded by the Engineering and Physical Sciences Research Council and E.ON Energy.

Geoengineering the Earth’s climate sends policy debate down a curious rabbit hole (The Guardian)

Many of the world’s major scientific establishments are discussing the concept of modifying the Earth’s climate to offset global warming

Monday 4 August 2014

Many leading scientific institutions are now looking at proposed ways to engineer the planet’s climate to offset the impacts of global warming. Photograph: NASA/REUTERS

There’s a bit in Alice’s Adventures in Wonderland where things get “curiouser and curiouser” as the heroine tries to reach a garden at the end of a rat-hole sized corridor that she’s just way too big for.

She drinks a potion and eats a cake with no real clue what the consequences might be. She grows to nine feet tall, shrinks to ten inches high and cries literal floods of frustrated tears.

I spent a couple of days at a symposium in Sydney last week that looked at the moral and ethical issues around the concept of geoengineering the Earth’s climate as a “response” to global warming.

No metaphor is ever quite perfect (climate impacts are no ‘wonderland’), but Alice’s curious experiences down the rabbit hole seem to fit the idea of medicating the globe out of a possible catastrophe.

And yes, the fact that in some quarters geoengineering is now on the table shows how the debate over climate change policy is itself becoming “curiouser and curiouser” still.

It’s tempting too to dismiss ideas like pumping sulphate particles into the atmosphere or making clouds whiter as some sort of surrealist science fiction.

But beyond the curiosity lies actions being countenanced and discussed by some of the world’s leading scientific institutions.

What is geoengineering?

Geoengineering – also known as climate engineering or climate modification – comes in as many flavours as might have been on offer at the Mad Hatter’s Tea Party.

Professor Jim Falk, of the Melbourne Sustainable Society Institute at the University of Melbourne, has a list of more than 40 different techniques that have been suggested.

They generally take two approaches.

Carbon Dioxide Removal (CDR) is pretty self-explanatory. Think tree planting, algae farming, increasing the carbon in soils, fertilising the oceans or capturing emissions from power stations. Anything that cuts the amount of CO2 in the atmosphere.

Solar Radiation Management (SRM) techniques are concepts to try and reduce the amount of solar energy reaching the earth. Think pumping sulphate particles into the atmosphere (this mimics major volcanic eruptions that have a cooling effect on the planet), trying to whiten clouds or more benign ideas like painting roofs white.

Geoengineering on the table

In 2008 an Australian Government-backed research group issued a report on the state of play of ocean fertilisation, recording that 12 experiments of various kinds had been carried out, with limited to zero evidence of “success”.

This priming of the “biological pump”, as it’s known, promotes the growth of organisms (phytoplankton) that store carbon and then sink to the bottom of the ocean.

The report raised the prospect that larger scale experiments could interfere with the oceanic food chain, create oxygen-depleted “dead zones” (no fish folks), impact on corals and plants and various other unknowns.

The Royal Society – the world’s oldest scientific institution – released a report in 2009, also reviewing various geoengineering technologies.

In 2011, Australian scientists gathered at a geoengineering symposium organised by the Australian Academy of Science and the Australian Academy of Technological Sciences and Engineering.

The London Protocol – a maritime convention relating to dumping at sea – was amended last year to try and regulate attempts at “ocean fertilisation” – where substances, usually iron, are dumped into the ocean to artificially raise the uptake of carbon dioxide.

The latest major report of the United Nations Intergovernmental Panel on Climate Change also addressed the geoengineering issue in several chapters. The IPCC summarised geoengineering this way:

CDR methods have biogeochemical and technological limitations to their potential on a global scale. There is insufficient knowledge to quantify how much CO2 emissions could be partially offset by CDR on a century timescale. Modelling indicates that SRM methods, if realizable, have the potential to substantially offset a global temperature rise, but they would also modify the global water cycle, and would not reduce ocean acidification. If SRM were terminated for any reason, there is high confidence that global surface temperatures would rise very rapidly to values consistent with the greenhouse gas forcing. CDR and SRM methods carry side effects and long-term consequences on a global scale.

Towards the end of this year, the US National Academy of Sciences will be publishing a major report on the “technical feasibility” of some geoengineering techniques.

Fighting Fire With Fire

The symposium in Sydney was co-hosted by the University of New South Wales and the Sydney Environment Institute at the University of Sydney (for full disclosure here, they paid my travel costs and one night stay).

Dr Matthew Kearnes, one of the organisers of the workshop from UNSW, told me there was “nervousness among many people about even thinking or talking about geoengineering.” He said:

I would not want to dismiss that nervousness, but this is an agenda that’s now out there and it seems to be gathering steam and credibility in some elite establishments.

Internationally geoengineering tends to be framed pretty narrowly as just a case of technical feasibility, cost and efficacy. Could it be done? What would it cost? How quickly would it work?

We wanted to get away from the arguments about the pros and cons and instead think much more carefully about what this tells us about the climate change debate more generally.

The symposium covered a range of frankly exhausting philosophical, social and political considerations – each of them jumbo-sized cans full of worms ready to open.

Professor Stephen Gardiner, of the University of Washington, Seattle, pushed for the wider community to think about the ethical and moral consequences of geoengineering. He drew a parallel between the way, he said, that current fossil fuel combustion takes benefits now at the expense of impacts on future generations. Geoengineering risked making the same mistake.

Clive Hamilton’s book Earthmasters notes “in practice any realistic assessment of how the world works must conclude that geoengineering research is virtually certain to reduce incentives to pursue emission reductions”.

Odd advocates

Curiouser still, some of the world’s think tanks that shout the loudest that human-caused climate change might not even be a thing, or at least not a thing worth worrying about, are happy to countenance geoengineering as a solution to the problem they think is overblown.

For example, in January this year the Copenhagen Consensus Center, a US-based think tank founded by Danish political scientist Bjorn Lomborg, issued a submission to an Australian Senate inquiry looking at overseas aid and development.

Lomborg’s center has for many years argued that cutting greenhouse gas emissions is too expensive and that action on climate change should have a low-priority compared to other issues around the world.

Lomborg himself says human-caused climate change will not turn into an economic negative until near the end of this century.

Yet Lomborg’s submission to the Australian Senate suggested that every dollar spent to “investigat[e] the feasibility of planetary cooling through geoengineering technologies” could yield “$1000 of benefits”, although this, Lomborg wrote, was a “rough estimate”.

But these investigations, Lomborg submitted, “would serve to better understand risks, costs, and benefits, but also act as an important potential insurance against global warming”.

Engineering another excuse

Several academics I’ve spoken with have voiced fears that holding out unproven and potentially disastrous geoengineering technologies as a shield against the impacts of climate change could distract policy makers and the public from the core of the climate change issue: curbing emissions in the first place.

But if the idea of some future nation, group of nations, or even corporation embarking on a major project to modify the Earth’s climate systems leaves you feeling like you’ve fallen down a surreal rabbit hole, then perhaps we should also ask ourselves this.

Since the year 1750, the world has added something in the region of 1,339,000,000,000 tonnes of carbon dioxide (that’s 1.34 trillion tonnes) to the atmosphere from fossil fuel and cement production.

Raising the level of CO2 in the atmosphere by 40 per cent could be seen as accidental geoengineering.
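As a rough cross-check on those figures, cumulative emissions can be converted to an atmospheric concentration change. The sketch below is illustrative only: the conversion factor (about 7.8 Gt CO2 per ppm) and the roughly 45 per cent "airborne fraction" are textbook approximations, not numbers from the article, and fossil fuel and cement alone account for a rise of only about 28 per cent; the 40 per cent observed rise also reflects land-use change emissions.

```python
# Illustrative back-of-envelope conversion of cumulative CO2 emissions
# to an atmospheric concentration rise. Conversion factors are standard
# approximations, not taken from the article.

GT_CO2_PER_PPM = 7.81      # approx. Gt of CO2 per 1 ppm of atmospheric CO2
AIRBORNE_FRACTION = 0.45   # approx. share of emissions remaining in the air

cumulative_gt_co2 = 1339.0  # 1.34 trillion tonnes, the figure quoted above

ppm_rise = cumulative_gt_co2 * AIRBORNE_FRACTION / GT_CO2_PER_PPM
preindustrial_ppm = 280.0
pct_rise = 100.0 * ppm_rise / preindustrial_ppm

print(f"Implied rise: ~{ppm_rise:.0f} ppm (~{pct_rise:.0f}% over preindustrial)")
```

On these assumptions, fossil and cement emissions alone imply a rise of roughly 77 ppm, or about 28 per cent over the preindustrial level of 280 ppm.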

Time to crawl out of the rabbit hole?

‘Dressed’ laser aimed at clouds may be key to inducing rain, lightning (Science Daily)

Date: April 18, 2014

Source: University of Central Florida

Summary: The adage “Everyone complains about the weather but nobody does anything about it” may one day be obsolete if researchers further develop a new technique to aim a high-energy laser beam into clouds to make it rain or trigger lightning. The technique could also be applied in long-distance sensors and spectrometers to identify chemical makeup.

The adage “Everyone complains about the weather but nobody does anything about it,” may one day be obsolete if researchers at the University of Central Florida’s College of Optics & Photonics and the University of Arizona further develop a new technique to aim a high-energy laser beam into clouds to make it rain or trigger lightning. Credit: © Maksim Shebeko / Fotolia

The adage “Everyone complains about the weather but nobody does anything about it” may one day be obsolete if researchers at the University of Central Florida’s College of Optics & Photonics and the University of Arizona further develop a new technique to aim a high-energy laser beam into clouds to make it rain or trigger lightning.

The solution? Surround the beam with a second beam to act as an energy reservoir, sustaining the central beam to greater distances than previously possible. The secondary “dress” beam refuels and helps prevent the dissipation of the high-intensity primary beam, which on its own would break down quickly. A report on the project, “Externally refueled optical filaments,” was recently published in Nature Photonics.

Water condensation and lightning activity in clouds are linked to large amounts of static charged particles. Stimulating those particles with the right kind of laser holds the key to possibly one day summoning a shower when and where it is needed.

Lasers can already travel great distances but “when a laser beam becomes intense enough, it behaves differently than usual — it collapses inward on itself,” said Matthew Mills, a graduate student in the Center for Research and Education in Optics and Lasers (CREOL). “The collapse becomes so intense that electrons in the air’s oxygen and nitrogen are ripped off creating plasma — basically a soup of electrons.”

At that point, the plasma immediately tries to spread the beam back out, causing a struggle between the spreading and collapsing of an ultra-short laser pulse. This struggle is called filamentation, and creates a filament or “light string” that only propagates for a while until the properties of air make the beam disperse.

“Because a filament creates excited electrons in its wake as it moves, it artificially seeds the conditions necessary for rain and lightning to occur,” Mills said. Other researchers have caused “electrical events” in clouds, but not lightning strikes.

But how do you get close enough to direct the beam into the cloud without being blasted to smithereens by lightning?

“What would be nice is to have a sneaky way which allows us to produce an arbitrary long ‘filament extension cable.’ It turns out that if you wrap a large, low intensity, doughnut-like ‘dress’ beam around the filament and slowly move it inward, you can provide this arbitrary extension,” Mills said. “Since we have control over the length of a filament with our method, one could seed the conditions needed for a rainstorm from afar. Ultimately, you could artificially control the rain and lightning over a large expanse with such ideas.”

So far, Mills and fellow graduate student Ali Miri have been able to extend the pulse from 10 inches to about 7 feet. And they’re working to extend the filament even farther.

“This work could ultimately lead to ultra-long optically induced filaments or plasma channels that are otherwise impossible to establish under normal conditions,” said professor Demetrios Christodoulides, who is working with the graduate students on the project.

“In principle such dressed filaments could propagate for more than 50 meters or so, thus enabling a number of applications. This family of optical filaments may one day be used to selectively guide microwave signals along very long plasma channels, perhaps for hundreds of meters.”

The technique could also be applied in long-distance sensors and spectrometers to identify chemical makeup. Development of the technology was supported by a $7.5 million grant from the Department of Defense.

Journal Reference:

  1. Maik Scheller, Matthew S. Mills, Mohammad-Ali Miri, Weibo Cheng, Jerome V. Moloney, Miroslav Kolesik, Pavel Polynkin, Demetrios N. Christodoulides. Externally refuelled optical filaments. Nature Photonics, 2014; 8 (4): 297. DOI: 10.1038/nphoton.2014.47

Climate Engineering: What Do the Public Think? (Science Daily)

Jan. 12, 2014 — Members of the public have a negative view of climate engineering, the deliberate large-scale manipulation of the environment to counteract climate change, according to a new study.

The results come from researchers at the University of Southampton and Massey University (New Zealand), who have undertaken the first systematic large-scale evaluation of the public reaction to climate engineering.

The work is published in Nature Climate Change this week (12 January 2014).

Some scientists think that climate engineering approaches will be required to combat the inexorable rise in atmospheric CO2 due to the burning of fossil fuels. Climate engineering could involve techniques that reduce the amount of CO2 in the atmosphere or approaches that slow temperature rise by reducing the amount of sunlight reaching the Earth’s surface.

Co-author Professor Damon Teagle of the University of Southampton said: “Because even the concept of climate engineering is highly controversial, there is pressing need to consult the public and understand their concerns before policy decisions are made.”

Lead author, Professor Malcolm Wright of Massey University, said: “Previous attempts to engage the public with climate engineering have been exploratory and small scale. In our study, we have drawn on commercial methods used to evaluate brands and new product concepts to develop a comparative approach for evaluating the public reaction to a variety of climate engineering concepts.”

The results show that the public has strong negative views towards climate engineering. Where there are positive reactions, they favour approaches that reduce carbon dioxide over those that reflect sunlight.

“It was a striking result and a very clear pattern,” said Professor Wright. “Interventions such as putting mirrors in space or fine particles into the stratosphere are not well received. More natural processes of cloud brightening or enhanced weathering are less likely to raise objections, but the public react best to creating biochar (making charcoal from vegetation to lock in CO2) or capturing carbon directly from the air.”

Nonetheless, even the most well-regarded techniques still have a net negative perception.

The work consulted large representative samples in both Australia and New Zealand. Co-author Pam Feetham said: “The responses are remarkably consistent from both countries, with surprisingly few variations except for a slight tendency for older respondents to view climate engineering more favourably.”

Professor Wright noted that giving the public a voice so early in technological development was unusual, but increasingly necessary. “If these techniques are developed the public must be consulted. Our methods can be employed to evaluate the responses in other countries and reapplied in the future to measure how public opinion changes as these potential new technologies are discussed and developed,” he said.

Journal Reference:

  1. Malcolm J. Wright, Damon A. H. Teagle, Pamela M. Feetham. A quantitative evaluation of the public response to climate engineering. Nature Climate Change, 2014; DOI: 10.1038/nclimate2087

Our singularity future: should we hack the climate? (Singularity Hub)

Posted: 01/8/14 8:31 AM

Even the most adamant techno-optimists among us must admit that new technologies can introduce hidden dangers: Fire, as the adage goes, can cook the dinner, but it can also burn the village down.

The most powerful example of unforeseen disadvantages stemming from technology is climate change. Should we attempt to fix a problem caused by technology, using more novel technology to hack the climate? The question has spurred heated debate.

Those in favor point to failed efforts to curb carbon dioxide emissions and insist we need other options. What if a poorly understood climatic tipping point tips and the weather becomes dangerous overnight; how will slowing emissions help us then?

“If you look at the projections for how much the Earth’s air temperature is supposed to warm over the next century, it is frightening. We should at least know the options,” said Rob Wood, a University of Washington climatologist who edited a recent special issue of the journal Climatic Change devoted to geoengineering.

Wood’s view is gaining support, as the predictions about the effects of climate change continue to grow more dire, and the weather plays its part to a tee.

But big, important questions need answers before geoengineering projects take off. Critics point to science’s flimsy understanding of the complex systems that drive the weather. And even supporters lament the lack of any experimental framework to contain disparate experiments on how to affect it.

“Proposed projects have been protested or canceled, and calls for a governance framework abound,” Lisa Dilling and Rachel Hauser wrote in a paper that appears in the special issue. “Some have argued, even, that it is difficult if not impossible to answer some research questions in geoengineering at the necessary scale without actually implementing geoengineering itself.”

Most proposed methods of geoengineering derive from pretty basic science, but questions surround how to deploy them at a planetary scale and how to measure desired and undesired effects on complex weather and ocean cycles. Research projects that would shed light on those questions would be big enough themselves potentially to affect neighboring populations, raising ethical questions as well.

Earlier efforts to test fertilizing the ocean with iron, to feed algae that would suck carbon dioxide from the air, and to spray the pollutant sulfur dioxide, which reflects solar radiation, into the atmosphere were mired in controversy. A reputable UK project abandoned its plans to test its findings in the field.

But refinements on those earlier approaches are percolating. They include efforts both to remove previously emitted carbon dioxide from the atmosphere and to reduce the portion of the sun’s radiation that enters the atmosphere.

One method of carbon dioxide removal (or CDR) would expose large quantities of carbon-reactive minerals to the air and then store the resulting compounds underground; another would use large CO2 vacuums to suck the greenhouse gas directly from the air into underground storage.

Solar radiation management (or SRM) methods include everything from painting roofs white to seeding the clouds with salt crystals to make them more reflective and mimicking the climate-cooling effects of volcanic eruptions by spraying  sulfur compounds into the atmosphere.

The inevitable impact of geoengineering research on the wider population has led many scientists to compare geoengineering to genetic research. The comparison to genetic research also hints at the huge benefits geoengineering could have if it successfully wards off the most savage effects of climate change.

As with genetic research, principles have been developed to shape the ethics of the research. Still, the principles remain vague, according to a 2012 Nature editorial, and flawed, according to a philosophy-of-science take in the recent journal issue. Neither the U.S. government nor international treaties have addressed geoengineering per se, though many treaties would influence its testing and implementation.

The hottest research now explores how long climate-hacks would take to work, lining up their timelines with the slow easing of global warming that would result from dramatically lowered carbon dioxide emissions, and how to weigh the costs of geoengineering projects and accommodate public debate.

Proceeding with caution won’t get fast answers, but it seems a wise way to address an issue as thorny as readjusting the global thermostat.

Geoengineering Approaches to Reduce Climate Change Unlikely to Succeed (Science Daily)

Dec. 5, 2013 — Reducing the amount of sunlight reaching the planet’s surface by geoengineering may not undo climate change after all. Two German researchers used a simple energy balance analysis to explain how Earth’s water cycle responds differently to heating by sunlight than it does to warming due to a stronger atmospheric greenhouse effect. Further, they show that this difference implies that reflecting sunlight to reduce temperatures may have unwanted effects on Earth’s rainfall patterns.

Heavy rainfall events can be more common in a warmer world. (Credit: Annett Junginger, distributed via imaggeo.egu.eu)

The results are now published in Earth System Dynamics, an open access journal of the European Geosciences Union (EGU).

Global warming alters Earth’s water cycle since more water evaporates into the air as temperatures increase. Increased evaporation can dry out some regions while at the same time bringing more rain to other areas due to the excess moisture in the atmosphere. The more water evaporates per degree of warming, the stronger the influence of increasing temperature on the water cycle. But the new study shows the water cycle does not react the same way to different types of warming.

Axel Kleidon and Maik Renner of the Max Planck Institute for Biogeochemistry in Jena, Germany, used a simple energy balance model to determine how sensitive the water cycle is to an increase in surface temperature due to a stronger greenhouse effect and to an increase in solar radiation. They predicted the response of the water cycle for the two cases and found that, in the former, evaporation increases by 2% per degree of warming while in the latter this number reaches 3%. This prediction confirmed results of much more complex climate models.
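To see what those sensitivities mean in practice, the toy calculation below compounds each per-degree figure over a hypothetical amount of warming. The 2% and 3% values come from the study; the multiplicative compounding and the 3-degree example are illustrative assumptions, not part of the authors' model.

```python
# Illustrative arithmetic: compound the per-degree evaporation sensitivities
# reported in the study over a given warming.
# 2% per degree for greenhouse warming, 3% per degree for solar-driven warming.

def evaporation_change(warming_k: float, sensitivity_per_k: float) -> float:
    """Fractional change in evaporation for a given warming, assuming the
    per-degree sensitivity compounds multiplicatively."""
    return (1.0 + sensitivity_per_k) ** warming_k - 1.0

warming = 3.0  # degrees of surface warming (example value only)
greenhouse = evaporation_change(warming, 0.02)
solar = evaporation_change(warming, 0.03)

print(f"Greenhouse warming of {warming} K: ~{100 * greenhouse:.1f}% more evaporation")
print(f"Solar-driven warming of {warming} K: ~{100 * solar:.1f}% more evaporation")
```

For the same 3 degrees of warming, the solar case yields roughly half again as much extra evaporation as the greenhouse case, which is the asymmetry the authors exploit.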

“These different responses to surface heating are easy to explain,” says Kleidon, who uses a pot on the kitchen stove as an analogy. “The temperature in the pot is increased by putting on a lid or by turning up the heat — but these two cases differ by how much energy flows through the pot,” he says. A stronger greenhouse effect puts a thicker ‘lid’ over Earth’s surface but, if there is no additional sunlight (if we don’t turn up the heat on the stove), extra evaporation takes place solely due to the increase in temperature. Turning up the heat by increasing solar radiation, on the other hand, enhances the energy flow through Earth’s surface because of the need to balance the greater energy input with stronger cooling fluxes from the surface. As a result, there is more evaporation and a stronger effect on the water cycle.

In the new Earth System Dynamics study the authors also show how these findings can have profound consequences for geoengineering. Many geoengineering approaches aim to reduce global warming by reducing the amount of sunlight reaching Earth’s surface (or, in the pot analogy, reduce the heat from the stove). But when Kleidon and Renner applied their results to such a geoengineering scenario, they found out that simultaneous changes in the water cycle and the atmosphere cannot be compensated for at the same time. Therefore, reflecting sunlight by geoengineering is unlikely to restore the planet’s original climate.

“It’s like putting a lid on the pot and turning down the heat at the same time,” explains Kleidon. “While in the kitchen you can reduce your energy bill by doing so, in the Earth system this slows down the water cycle with wide-ranging potential consequences,” he says.

Kleidon and Renner’s insight comes from looking at the processes that heat and cool Earth’s surface and how they change when the surface warms. Evaporation from the surface plays a key role, but the researchers also took into account how the evaporated water is transported into the atmosphere. They combined simple energy balance considerations with a physical assumption for the way water vapour is transported, and separated the contributions of surface heating from solar radiation and from increased greenhouse gases in the atmosphere to obtain the two sensitivities. One of the referees for the paper commented: “it is a stunning result that such a simple analysis yields the same results as the climate models.”

Journal Reference:

  1. A. Kleidon, M. Renner. A simple explanation for the sensitivity of the hydrologic cycle to global climate change. Earth System Dynamics Discussions, 2013; 4 (2): 853. DOI: 10.5194/esdd-4-853-2013

Geoengineering the Climate Could Reduce Vital Rains (Science Daily)

Oct. 31, 2013 — Although a significant build-up in greenhouse gases in the atmosphere would alter worldwide precipitation patterns, a widely discussed technological approach to reduce future global warming would also interfere with rainfall and snowfall, new research shows.

Rice field in Bali. (Credit: © pcruciatti / Fotolia)

The international study, led by scientists at the National Center for Atmospheric Research (NCAR), finds that global warming caused by a massive increase in greenhouse gases would spur a nearly 7 percent average increase in precipitation compared to preindustrial conditions.

But trying to resolve the problem through “geoengineering” could result in monsoonal rains in North America, East Asia, and other regions dropping by 5-7 percent compared to preindustrial conditions. Globally, average precipitation could decrease by about 4.5 percent.

“Geoengineering the planet doesn’t cure the problem,” says NCAR scientist Simone Tilmes, lead author of the new study. “Even if one of these techniques could keep global temperatures approximately balanced, precipitation would not return to preindustrial conditions.”

As concerns have mounted about climate change, scientists have studied geoengineering approaches to reduce future warming. Some of these would capture carbon dioxide before it enters the atmosphere. Others would attempt to essentially shade the atmosphere by injecting sulfate particles into the stratosphere or launching mirrors into orbit with the goal of reducing global surface temperatures.

The new study focuses on the second set of approaches, those that would shade the planet. The authors warn, however, that Earth’s climate would not return to its preindustrial state even if the warming itself were successfully mitigated.

“It’s very much a pick-your-poison type of problem,” says NCAR scientist John Fasullo, a co-author. “If you don’t like warming, you can reduce the amount of sunlight reaching the surface and cool the climate. But if you do that, large reductions in rainfall are unavoidable. There’s no win-win option here.”

The study appears in an online issue of the Journal of Geophysical Research: Atmospheres, published this week by the American Geophysical Union. An international team of scientists from NCAR and 14 other organizations wrote the study, which was funded in part by the National Science Foundation (NSF), NCAR’s sponsor. The team used, among other tools, the NCAR-based Community Earth System Model, which is funded by NSF and the Department of Energy.

Future carbon dioxide, with or without geoengineering

The research team turned to 12 of the world’s leading climate models to simulate global precipitation patterns if the atmospheric level of carbon dioxide, a leading greenhouse gas, reached four times the level of the preindustrial era. They then simulated the effect of reduced incoming solar radiation on the global precipitation patterns.

The scientists chose the artificial scenario of a quadrupling of carbon dioxide levels, which is on the high side of projections for the end of this century, in order to clearly draw out the potential impacts of geoengineering.
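For a sense of scale, the amount of dimming needed to offset quadrupled CO2 can be estimated from a standard simplified forcing formula. None of the numbers below come from the study itself: the CO2 forcing expression (5.35 ln(C/C0) W/m²) and a planetary albedo of 0.3 are textbook approximations, and full model experiments of this kind find somewhat larger reductions, roughly 3.5 to 5 per cent, once feedbacks are included.

```python
import math

# Back-of-envelope estimate of the fractional solar dimming that would
# balance the radiative forcing from a quadrupling of CO2.

S0 = 1361.0    # solar constant, W/m^2 (approximate)
ALBEDO = 0.30  # planetary albedo (approximate)

forcing_4x = 5.35 * math.log(4.0)             # ~7.4 W/m^2 for 4x CO2
absorbed_solar = (S0 / 4.0) * (1.0 - ALBEDO)  # ~238 W/m^2 global-mean absorbed sunlight

fraction = forcing_4x / absorbed_solar
print(f"Solar dimming needed: ~{100 * fraction:.1f}% of incoming sunlight")
```

The simple balance gives a reduction of about 3 per cent of incoming sunlight, which is the order of magnitude the shading scenarios in the study imply.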

In line with other research, they found that an increase in carbon dioxide levels would significantly increase global average precipitation, although there would likely be significant regional variations and even prolonged droughts in some areas.

Much of the reason for the increased rainfall and snowfall has to do with greater evaporation, which would pump more moisture into the atmosphere as a result of more heat being trapped near the surface.

The team then took the research one step further, examining what would happen if a geoengineering approach partially reflected incoming solar radiation high in the atmosphere.

The researchers found that precipitation amounts and frequency, especially for heavy rain events, would decrease significantly. The effects were greater over land than over the ocean, and particularly pronounced during months of heavy, monsoonal rains. Monsoonal rains in the model simulations dropped by an average of 7 percent in North America, 6 percent in East Asia and South America, and 5 percent in South Africa. In India, however, the decrease was just 2 percent. Heavy precipitation further dropped in Western Europe and North America in summer.

A drier atmosphere

The researchers found two primary reasons for the reduced precipitation.

One reason has to do with evaporation. As Earth is shaded and less solar heat reaches the surface, less water vapor is pumped into the atmosphere through evaporation.

The other reason has to do with plants. With more carbon dioxide in the atmosphere, plants partially close their stomata, the openings that allow them to take in carbon dioxide while releasing oxygen and water into the atmosphere. Partially shut stomata release less water, so the cooled atmosphere would also become even drier over land.

Tilmes stresses that the authors did not address such questions as how certain crops would respond to a combination of higher carbon dioxide and reduced rainfall.

“More research could show both the positive and negative consequences for society of such changes in the environment,” she says. “What we do know is that our climate system is very complex, that human activity is making Earth warmer, and that any technological fix we might try to shade the planet could have unforeseen consequences.”

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Journal Reference:

  1. Simone Tilmes, John Fasullo, Jean-Francois Lamarque, Daniel R. Marsh, Michael Mills, Kari Alterskjaer, Helene Muri, Jón E. Kristjánsson, Olivier Boucher, Michael Schulz, Jason N. S. Cole, Charles L. Curry, Andy Jones, Jim Haywood, Peter J. Irvine, Duoying Ji, John C. Moore, Diana B. Karam, Ben Kravitz, Philip J. Rasch, Balwinder Singh, Jin-Ho Yoon, Ulrike Niemeier, Hauke Schmidt, Alan Robock, Shuting Yang, Shingo Watanabe. The hydrological impact of geoengineering in the Geoengineering Model Intercomparison Project (GeoMIP). Journal of Geophysical Research: Atmospheres, 2013; 118 (19): 11,036. DOI: 10.1002/jgrd.50868

Transgenic Mosquitoes in the Skies of the Sertão (Agência Pública)

Health

10/10/2013 – 10:36 a.m.

By the Agência Pública newsroom


The traps are devices installed in the homes of some residents in the experiment area. The ovitraps, as they are called, serve as breeding sites for the females. Photo: Coletivo Nigéria

With the promise of reducing dengue, a biofactory of transgenic insects has already released 18 million Aedes aegypti mosquitoes in the interior of Bahia. Read the story and watch the video.

Early on a Thursday evening in September, the bus station in Juazeiro, Bahia, was a picture of desolation. In the dimly lit hall there were a stall specializing in beef broth, a snack bar with a long counter lined with savory pastries, cookies and potato chips, and a single ticket window – with unsettling clouds of mosquitoes hovering over the heads of those waiting to buy tickets to small towns or northeastern capitals.

Set on the banks of the São Francisco River, on the border between Pernambuco and Bahia, Juazeiro was once a city crossed by streams, tributaries of one of the country's largest rivers. Today it has more than 200,000 inhabitants, forms the largest urban agglomeration in the northeastern semi-arid region together with Petrolina – the two adding up to half a million people – and is infested with muriçocas (or pernilongos, if you prefer: common mosquitoes). The watercourses that once drained small springs have become open-air sewers, extensive breeding grounds for the insect, traditionally fought with insecticide and electric rackets, or with closed windows and air conditioning for the better-off.

But Juazeiro's residents are not just swatting muriçocas this early spring. The city is the testing ground for a new scientific technique that uses transgenic Aedes aegypti to fight dengue, a disease transmitted by the species. Developed by the British biotechnology company Oxitec, the method consists essentially of inserting a lethal gene into male mosquitoes which, released into the environment in large numbers, mate with wild females and produce offspring programmed to die. If the experiment works, the premature death of the larvae progressively reduces the species' population.

The technique is the newest weapon against a disease that not only resists but advances despite the control methods used so far. The World Health Organization estimates there may be 50 to 100 million dengue cases per year worldwide. In Brazil the disease is endemic, with annual epidemics in several cities, mainly the large capitals. In 2012, between January 1 and February 16 alone, more than 70,000 cases were recorded in the country. In 2013, over the same period, the number nearly tripled, reaching 204,000 cases. So far this year, 400 people have died of dengue in Brazil.

In Juazeiro, the British-patented method is being tested by the social organization Moscamed, which has been breeding and releasing the transgenic mosquitoes into the open air since 2011. At the biofactory set up in the municipality, which can produce up to 4 million mosquitoes per week, the insect's entire production chain is carried out – except for the genetic modification itself, performed at Oxitec's laboratories in Oxford. Transgenic larvae were imported by Moscamed and have been bred in the institution's laboratories ever since.

The tests have been funded from the start by the Bahia State Health Department – with institutional support from Juazeiro's municipal health department – and last July were extended to the municipality of Jacobina, at the northern edge of the Chapada Diamantina. In that mountain town of roughly 80,000 inhabitants, Moscamed is testing the technique's ability to "suppress" (the word scientists use for exterminating an entire mosquito population) Aedes aegypti across a whole city, since in Juazeiro the strategy proved effective but has so far been limited to two neighborhoods.

"The 2011 and 2012 results showed that [the technique] really worked well. At the invitation of, and funded by, the Bahia state government, we decided to move forward and go to Jacobina. No longer as a pilot, but as a test to really eliminate the [mosquito] population," says Aldo Malavasi, retired professor at the Genetics Department of the University of São Paulo's (USP) Biosciences Institute and current president of Moscamed. USP is also part of the project.

Malavasi has worked in the region since 2006, when Moscamed was created to fight an agricultural pest, the fruit fly, with a similar approach – the Sterile Insect Technique (SIT). The logic is the same: produce sterile insects to mate with wild females and thus gradually shrink the population. The difference lies in how the insects are sterilized: radiation instead of genetic modification. SIT has been widely used since the 1970s, mainly against species considered threats to agriculture. The problem is that until now the technology was not suited to mosquitoes like Aedes aegypti, which did not withstand radiation well.

The communication plan

The first field releases of the transgenic Aedes took place in the Cayman Islands between late 2009 and 2010. The British territory in the Caribbean, made up of three islands south of Cuba, proved to be not only a tax haven (there are more companies registered on the islands than its 50,000 inhabitants) but also a convenient site for releasing transgenic mosquitoes, given the absence of biosafety laws. The Cayman Islands are not a signatory to the Cartagena Protocol, the main international document on the subject, nor are they covered by the Aarhus Convention – approved by the European Union and to which the United Kingdom is a party – which deals with access to information, participation and justice in environmental decision-making.

Instead of prior publication of, and public consultation on, the risks involved in the experiment, as those international agreements would require, the roughly 3 million mosquitoes released into the Cayman Islands' tropical climate went out into the world without any debate or public consultation. Authorization was granted solely by the islands' Department of Agriculture. Oxitec's local partner in the tests, the Mosquito Research & Control Unit, posted a promotional video on the subject only in October 2010, and even then without mentioning the mosquitoes' transgenic nature. The video was released exactly one month before Oxitec itself presented the results of the experiments at the annual meeting of the American Society of Tropical Medicine and Hygiene, in the United States.

The scientific community was surprised by the news that the world's first releases of genetically modified insects had already taken place without even the specialists in the field knowing about it. The surprise extended to the result: according to Oxitec's data, the experiments had achieved an 80% reduction in the Aedes aegypti population in the Cayman Islands. For the company, the number confirmed that the lab-created technique could indeed be effective. Since then, new field tests have been arranged in other countries – notably underdeveloped or developing ones, with tropical climates and long-standing dengue problems.

After postponing similar tests in 2006 following protests, Malaysia became the second country to release the transgenic mosquitoes, between December 2010 and January 2011. Six thousand mosquitoes were released in an uninhabited area of the country. That number, much smaller than in the Cayman Islands, is almost insignificant next to the quantity released in Juazeiro, Bahia, starting in February 2011. Together with Jacobina more recently, the city has since become the world's largest testing ground of its kind, with more than 18 million mosquitoes already released, according to Moscamed's figures.

"Oxitec made serious mistakes, both in Malaysia and in the Cayman Islands. Unlike what they did, we carried out extensive work in what we call public communication, with total transparency, discussion with the community, and visits to every house. Extraordinary work was done here," Aldo Malavasi says by way of comparison.

In a telephone interview, he was keen to stress Moscamed's independence from Oxitec and the different nature of the two institutions. Created in 2006, Moscamed is a social organization – non-profit, therefore – that joined the transgenic Aedes aegypti tests in order to verify whether or not the technique works against dengue. According to Malavasi, they accepted no funding from Oxitec, precisely to guarantee impartiality in evaluating the technique. "We don't want their money, because our goal is to help the Brazilian government," he sums up.

For the sake of transparency, the program was named "Projeto Aedes Transgênico" (PAT, Transgenic Aedes Project), putting the thorny word right in the name. Another semantic decision was to avoid the term "sterile," common in the British company's discourse but technically incorrect, since the mosquitoes do breed – they just produce offspring programmed to die at the larval stage. A jingle put the complex system into popular language, set to the rhythm of forró pé-de-serra. And the carnival bloc "Papa Mosquito" took to the streets of Juazeiro during the 2011 Carnival.

At the institutional level, besides funding from the state health department, the program also gained the support of the Juazeiro Municipal Health Department. "At first there was resistance, because people also didn't want traps left in their homes, but over time they understood the project and we had good popular acceptance," says public-health nurse Mário Machado, the department's director of Health Promotion and Surveillance.

The traps Machado mentions are simple devices installed in the homes of some residents in the experiment area. The ovitraps, as they are called, serve as breeding sites for the females. This makes it possible to collect the eggs and check whether they were fertilized by transgenic or wild males. That check is possible because the genetically modified mosquitoes carry, besides the lethal gene, a fragment of jellyfish DNA that gives them a fluorescent marker visible under the microscope.

In this way it was possible to verify that the reduction of the wild Aedes aegypti population reached, according to Moscamed, 96% in Mandacaru – an agricultural settlement a few kilometers from Juazeiro's commercial center which, owing to its geographic isolation and popular acceptance, became the ideal site for the releases. Despite that figure, Moscamed continues releasing there. Because of the mosquito's short life (the female lives about 35 days), the releases must continue in order to keep the wild population low. Currently, once a week a car leaves the organization's headquarters with 50,000 mosquitoes distributed by the thousands in plastic pots to be opened in the streets of Mandacaru.

"Today the greatest acceptance is in Mandacaru. The receptiveness was such that Moscamed doesn't want to leave anymore," Mário Machado emphasizes.

The same did not happen in the Itaberaba neighborhood, the first to receive the mosquitoes in early 2011. Not even its historically high rate of Aedes aegypti infestation made the peripheral Juazeiro neighborhood, next door to Moscamed's headquarters, welcome the experiment. Mário Machado estimates at "around 20%" the share of the population that opposed the tests and put an end to the releases.

"However much we try to inform people, going from house to house, bar to bar, some people don't believe it: 'No, you're lying to us, this mosquito is biting us,'" he says, resigned.

After a year without releases, the mosquito seems to have left few memories there. Walking through the neighborhood, we could hardly find anyone who knew what we were talking about. Nonetheless, Itaberaba's name traveled the world when Oxitec announced that the first field experiment in Brazil had achieved an 80% reduction in the wild mosquito population.

A field supervisor at Moscamed, biologist Luiza Garziera was one of those who went door to door explaining the process, at times sidestepping scientific language to make herself understood. "I would say that we would be releasing these mosquitoes, that we released only the male, which doesn't bite. Only the female bites. And that when these males 'date' – because sometimes we can't say 'copulate,' people won't understand. So when these males date the female, their little children end up dying."

This is one of the most important details of the novel technique. By releasing only males, at a rate of 10 transgenic mosquitoes to 1 wild one, Moscamed immerses people in a cloud of mosquitoes but guarantees they will not be bitten. That is because only the female feeds on human blood, which supplies the proteins she needs to produce eggs.
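The arithmetic behind that 10-to-1 ratio can be sketched with a toy model (all numbers here are hypothetical illustrations, not the Moscamed/Oxitec figures): if mating is random, a wild female finds a wild male only 1 time in 11, so each generation the wild population is multiplied by its natural growth factor times 1/11.

```python
# Illustrative sketch only: sustained release of transgenic males at a
# fixed ratio suppresses a wild population. Assumes random mating and a
# constant per-generation growth factor r0; both values are hypothetical.

def suppressed_population(n0, release_ratio, r0, generations):
    """Return the wild-population size at each generation.

    A wild female mates with a wild male with probability
    1 / (1 + release_ratio); offspring fathered by transgenic males die
    as larvae, so only that fraction of matings yields adults.
    """
    wild_fraction = 1.0 / (1.0 + release_ratio)
    pops = [float(n0)]
    for _ in range(generations):
        pops.append(pops[-1] * r0 * wild_fraction)
    return pops

pops = suppressed_population(n0=10_000, release_ratio=10, r0=5, generations=4)
# With r0 = 5 and a 10:1 ratio, each generation shrinks to 5/11 of the
# previous one: roughly 10000, 4545, 2066, 939, 427.
```

Under these assumptions the population collapses geometrically, which is why the releases must be sustained: stop them, and the same growth factor rebuilds the population just as fast.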

The technology fits together convincingly, even didactically – except perhaps for the "genetic modification" itself, which demands higher flights of imagination. Even so, ignorance about the subject still prevails among a considerable share of the residents interviewed for this report. At best, people know it is about exterminating the dengue mosquito, which is naturally a good thing. Beyond that, they have only vaguely heard of it, or venture a guess involving the – genuinely and widely hated – muriçoca.

Assessing the risks

Despite Moscamed's communication campaign, the British NGO GeneWatch points to a series of problems in the Brazilian process. Chief among them: the risk-assessment report on the experiment was not made available to the public before the releases began. On the contrary, at the request of those responsible for the Transgenic Aedes Program, the file submitted to the National Technical Biosafety Commission (CTNBio, the body charged with authorizing such experiments) was ruled confidential.

"We think Oxitec should have the fully informed consent of the local population, which means people need to agree to the experiment. But for that they also need to be informed about the risks, just as you would be if you were being used to test a new cancer drug or any other kind of treatment," said Helen Wallace, the NGO's executive director, in a Skype interview.

A specialist in the risks and ethics involved in this kind of experiment, Wallace published this year the report Genetically Modified Mosquitoes: Ongoing Concerns, which lists in 13 chapters what she considers potential risks not weighed before the transgenic mosquito releases were authorized. The document also points to flaws in Oxitec's conduct of the experiments.

For example, two years after the Cayman Islands releases, only the results of one small test had appeared in a scientific publication. In early 2011 the company submitted the results of the islands' largest experiment to the journal Science, but the article was not published. Only in September of last year did the text appear in another journal, Nature Biotechnology, published as "correspondence" – meaning it was not reviewed by other scientists, only checked by the publication's own editor.

For Helen Wallace, the absence of critical peer review puts Oxitec's experiment under suspicion. Even so, according to the GeneWatch document, analysis of the article suggests the company had to raise the release ratio of transgenic mosquitoes and concentrate them in a small area to achieve the expected results. The same reportedly happened in Brazil, in Itaberaba. The results of the Brazilian test have likewise not yet been published by Moscamed. Project manager Danilo Carvalho said one article has already been submitted to a journal and another is in the final stage of writing.

Another risk the document points to lies in the widespread use of the antibiotic tetracycline. The drug counteracts the lethal gene and guarantees, in the laboratory, the survival of the genetically modified mosquito, which otherwise would not reach adulthood. This is the vital difference between the fate of the lab-bred mosquitoes and that of their offspring, produced in the environment by wild females – without the antibiotic, the offspring are condemned to premature death.

Tetracycline is commonly used in the livestock and aquaculture industries, which discharge large quantities of the substance into the environment through their effluents. The antibiotic is also widely used in human and veterinary medicine. In other words, genetically modified eggs and larvae could come into contact with the antibiotic even in uncontrolled environments and thus survive. Over time, the transgenic mosquitoes' resistance to the lethal gene could neutralize its effect, and we would end up with a new genetically modified species adapted to the environment.

Oxitec treats this hypothesis with skepticism, downplaying the chance of it happening in the real world. Yet a confidential document made public shows that the hypothesis proved, by chance, real in tests run by a research partner of the company. Puzzled by a 15% survival rate among larvae raised without tetracycline – far higher than the usual 3% found in the company's experiments – Oxitec's scientists discovered that the cat food their partners were feeding the mosquitoes retained traces of the antibiotic, which is routinely used to treat chickens destined for animal feed.

The GeneWatch report draws attention to the antibiotic's common presence in human and animal waste, as well as in domestic sewage systems such as septic tanks. This would constitute a potential risk, since several studies have found that Aedes aegypti can breed in contaminated water – although that is still not the most common situation, nor does it yet occur in Juazeiro, according to the municipal health department.

There are also concerns about the release rate of transgenic females. The separation of the pupae (the last stage before adulthood) is done manually, aided by a device that sorts the sexes by size (the female is slightly larger). Up to 3% of females can slip through this process, gaining their freedom and increasing the risks involved. Finally, the experiments have not yet verified whether the reduction in the mosquito population translates directly into reduced dengue transmission.

All the criticisms are rebutted by Oxitec and Moscamed, which say they maintain rigorous quality control – such as constant monitoring of the female release rate and of larval survival without tetracycline. Any sign of mutation in the mosquito would thus be detected in time to suspend the program. After roughly a month, all the released insects would be dead. According to the institutions responsible, the mosquitoes also do not pass on the modified genes even if a stray female bites a human being.

Transgenic mosquito for sale

Last July, after the success of the field tests in Juazeiro, Oxitec filed an application for a commercial license with the National Technical Biosafety Commission (CTNBio). Since late 2012 the British company has had a corporate tax registration (CNPJ) in Brazil and keeps an employee in São Paulo. More recently, with the promising results of the Juazeiro experiments, it rented a warehouse in Campinas and is building what will be its Brazilian headquarters. Brazil now represents its most likely and imminent market, which keeps the company's global director of business development, Glen Slade, shuttling between Oxford and São Paulo.

"Oxitec has been working since 2009 in partnership with USP and Moscamed, which are good partners and gave us the opportunity to start projects in Brazil. But we have just sent our commercial dossier to CTNBio and hope to obtain registration in the future, so we need to grow our team in the country. Clearly we are investing in Brazil. It is a very important country," Slade said in a Skype interview from Oxitec's headquarters in Oxford, England.

The biotechnology company is a spin-out of the British university – meaning Oxitec emerged from the laboratories of one of the world's most prestigious universities. Founded in 2002, it has since raised private investment and money from non-profit foundations, such as the Bill & Melinda Gates Foundation, to fund its research. According to Slade, more than R$ 50 million has been spent over the past decade refining and testing the technology.

The executive expects the bureaucratic process for the commercial license to conclude as early as next year, when Oxitec's Brazilian headquarters will be ready, including a new biofactory. Already in contact with several Brazilian municipalities, he prefers not to name names. Nor the price of the service, which will probably be offered in annual mosquito-control packages, with the budget depending on the city's population.

"Right now it's hard to give a price. As with all new products, the production cost is higher at the start than we would like. I think the price will be very reasonable relative to the benefits and to other mosquito-control approaches, but it's very hard to say today. Besides, the price will change with the scale of the project. Small projects are not very efficient, but if we have the opportunity to control mosquitoes across all of Rio de Janeiro, we can work at large scale and the price will come down," he suggests.

The company also intends to set up new biofactories in cities that take on large projects, which will reduce costs in the long run, since the releases must be maintained indefinitely to prevent the wild mosquitoes' return. The reproductive speed of Aedes aegypti is a concern: if the project stops, the species can rebuild its population within a few weeks.

"The company's plan is to secure repeat payments for releasing these mosquitoes every year. If their technology works and really reduces the incidence of dengue, you won't be able to suspend the releases and you'll be locked into the system. One of the biggest long-term concerns is that if things start to go wrong, or even become less effective, you could really end up with a worse situation over many years," Helen Wallace argues.

The risk would range from reduced immunity to the disease among the population to the dismantling of other public dengue-control policies, such as teams of health agents. Although both Moscamed and Juazeiro's health department emphasize the complementary nature of the technique, which would not replace other control methods, conflicts over the allocation of resources for the area are plausible. Today, according to Mário Machado of the health department, Juazeiro spends on average R$ 300,000 per month controlling endemic diseases, of which dengue is the main one.

The department is negotiating with Moscamed to expand the experiment to the whole municipality, or even to the entire metropolitan region formed by Juazeiro and Petrolina – a test that would cover half a million people – in order to assess the technique's effectiveness in large populations. In any case, and despite the experiments' progress, neither the Brazilian social organization nor the British company has presented price estimates for a possible commercial release.

"Yesterday we were doing the first studies, to analyze their price and ours. Because they know how much their program costs – it isn't cheap – but they don't disclose it," Mário Machado said.

In a report in the British newspaper The Observer last July, Oxitec estimated the cost of the technique at "less than" £6 per person per year. By a simple calculation – just multiplying that figure by the British currency's current exchange rate against the real, and ignoring the many other variables involved – the project in a city of 150,000 inhabitants would cost roughly R$ 3.2 million per year.
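That back-of-envelope figure can be reproduced directly. The exchange rate below (about 3.55 reais per pound, in line with late-2013 levels) is an assumption, and every other variable is ignored, as the article itself notes:

```python
# Rough reproduction of the article's estimate: £6 per person per year,
# for a city of 150,000 people, converted to reais.
cost_per_person_gbp = 6.0
population = 150_000
gbp_to_brl = 3.55          # assumed late-2013 exchange rate, not from the article

annual_cost_brl = cost_per_person_gbp * population * gbp_to_brl
print(f"R$ {annual_cost_brl:,.0f} per year")  # about R$ 3.2 million
```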

Considering how many small and mid-sized Brazilian municipalities have endemic dengue, the size of the emerging market becomes clear – even setting aside, for now, the country's large urban centers, which would exceed the technique's current capacity. Yet this is only one slice of the business. Oxitec also has a range of other transgenic insects, aimed at controlling agricultural pests, which should find open ground in Brazil, one of the world's agribusiness giants.

Awaiting CTNBio authorization, Moscamed is already preparing to test the transgenic fruit fly, which follows the same logic as the Aedes aegypti. Beyond that, Oxitec has four other genetically modified species that may one day be tested in Brazil, starting with Juazeiro and the São Francisco Valley. The region is one of the country's largest producers of fresh fruit for export: 90% of all the grapes and mangoes Brazil exports leave from here. It is production that requires a ceaseless fight against pests. Along the main avenues of Juazeiro and Petrolina, farm-supply and pesticide shops follow one another, their signs rotating the logos of the industry's multinationals.

"We have no concrete plans [beyond the fruit fly], but of course we would very much like the opportunity to run trials with those products too. Brazil has a very large agricultural industry. But right now our number one priority is the dengue mosquito. Once this project is well resourced, we will try to add agricultural projects," Slade commented.

He and several of his senior colleagues at the company previously worked at one of the agribusiness giants, Syngenta. For Helen Wallace, this is one of the facts revealing the transgenic Aedes aegypti's role as pioneer of a whole new market in genetically modified insects: "We think Syngenta is mainly interested in agricultural pests. One of the plans we know of is the proposal to use genetically modified agricultural pests together with transgenic seeds in order to increase crops' resistance to pests."

"There is no relationship between Oxitec and Syngenta of that kind. Perhaps in the future we may have the chance to work together. I personally am interested in pursuing projects we could do with Syngenta, BASF or other large agricultural companies," Glen Slade clarifies.

In 2011, the pesticide industry took in R$ 14.1 billion in Brazil. The world's largest market of its kind, the country may in the coming years inaugurate a new technological stage in pest control – as in public health, with the transgenic Aedes aegypti, which seems to have a promising commercial future. It remains to be seen, however, how the technique will coexist with the dengue vaccines now in final testing – one developed by a French laboratory, the other by the Butantan Institute in São Paulo. The vaccines are expected to reach the public in 2015. The transgenic mosquito, perhaps as early as next year.

Among the lines of transgenic mosquitoes, a homegrown version may also emerge. As confirmed by professor Margareth de Lara Capurro-Guimarães, of USP's Parasitology Department and coordinator of the Transgenic Aedes Program, the university is already studying a transgenic muriçoca. Another possible technological solution to a public-health problem in Juazeiro, Bahia – a city where, according to a 2011 survey by the National Sanitation Information System (SNIS), the sewage network serves only 67% of the urban population.

* Originally published on the Agência Pública website.

(Agência Pública)

Climate Change’s Silver Bullet? Our Interview With One Of The World’s Top Geoengineering Scholars (Climate Progress)

BY ARI PHILLIPS ON SEPTEMBER 6, 2013 AT 1:10 PM

MELBOURNE, Australia — Since coming to Australia almost two months ago, I've heard about Clive Hamilton in the process of reporting just about every story I've done. Then I picked up his new book Earthmasters: The Dawn of the Age of Climate Engineering, and now I see what all the fuss is about.

In all of the debates over how to address climate change, climate engineering — or geoengineering — is among the most contentious. It involves large-scale manipulation of the Earth’s climate using grand technological interventions, such as fertilizing the oceans with iron to absorb carbon dioxide or releasing sulfur into the atmosphere to reduce radiation. While its proponents call geoengineering a silver bullet for our climate woes, its skeptics are far more critical. Joe Romm, for one, likens geoengineering to a dangerous course of chemotherapy and radiation to treat a condition curable through diet and exercise — or, in this case, emissions reduction.

According to the cover of Hamilton’s new book, “The potential risks are enormous. It is messing with nature on a scale we’ve never seen before, and it’s attracting a flood of interest from scientists, venture capitalists and oil companies.”

Hamilton is an Australian author and public intellectual. Until 2008 he was the Executive Director of The Australia Institute, a progressive think tank that he founded in 1993. Now he's Professor of Public Ethics at the Centre for Applied Philosophy and Public Ethics, a joint center of Charles Sturt University and the University of Melbourne.

His books include Requiem for a Species: Why we resist the truth about climate change, Scorcher: The dirty truth about climate change, and Growth Fetish, amongst others.

Hamilton’s next book will be about the Anthropocene — a new geologic era in which human activities have had a significant impact on the Earth’s ecosystems. He took some time to talk with me about this new era, the future of geoengineering and what it all means for humanity. This interview has been edited for clarity and length.

How has the environmental community responded to your book on geoengineering?


I remember back in the late 1990s, around Kyoto, there was a great deal of resistance amongst environmentalists and climate activists, including myself, against any talk of adaptation. It was seen as a capitulation, a kind of defeatism — we ought not to be talking about adaptation because that means mitigation has failed. Eventually I think we all came around to the view that some climate change is going to happen and therefore adaptation has to be considered. It’s better to have a seat at the table, as it were, when adaptation is being discussed.

I think we’re in the same stage now with geoengineering. Most environmentalists don’t want to know about it. Most climate activists don’t want to talk about it. There is a sense that in doing so you are conceding that it could well be possible that geoengineering will be necessary because the world community will continue to fail, perhaps even more egregiously, at responding to scientific warnings.

But I wrote the book because I became aware in writing my previous book that the genie was out of the bottle: geoengineering was going to grow in importance. Therefore, climate campaigners and environmental groups sooner or later are going to have to engage with the issue. It’s a question of whether they start now or leave it for another five years, at which point the lobby backing geoengineering will be much more powerful and will have had an opportunity to frame it more inflexibly in the media and in the broader public mind.

Did you come across any big surprises while writing the book?

There were a couple of big surprises. One was the extent of the geoengineering lobby and the links between the scientists and the investors. I developed a much stronger sense of the likelihood of a powerful geoengineering constituency emerging, which would, if it were not countered by a skeptical community of thinkers and campaigners, essentially take control of the whole agenda. Plotting those links and laying them out is something I go into in quite a lot of detail. At the same time it stimulated me to think about the military-industrial complex, the famous lobby group that held such sway in the U.S. in the middle of the 20th century.

One thing I noticed while doing this research and looking at the scientists involved was the density of the linkages with the Lawrence Livermore National Laboratory. So I investigated further, and it's really quite astonishing the extent to which many, if not most, prominent scientific researchers in geoengineering in the U.S. worked at Livermore or have close links with people there now or people who used to work there.

Then when I read Hugh Gusterson’s book on Livermore and its role in the cold war and nuclear weapons development, I started to think much more carefully about the type of mindset that is especially drawn to geoengineering as a technological response to global warming. I think it’s quite alarming in its implications. That led me to think further about the geostrategic implications of climate engineering, which is something that’s received almost no attention, but we do know that people in the military and related strategic communities are starting to think about geoengineering and what it would mean for international relations and conflict.

What about the potential for financial gain?

A noble desire to save the world from climate change will attract less noble intentions. That’s just the way of the world. Never let a good crisis go to waste. Already we’re seeing it with Canadian oil sands billionaire Murray Edwards investing in geoengineering technologies.

I spend quite a bit of time talking about Bill Gates in the book. Earlier this year, I was talking about the scientific entrepreneurial lobby group that was emerging during a debate with Peter Singer (another Australian philosopher) and I mentioned Bill Gates. Singer said, “Well what’s wrong with Bill Gates? He’s well motivated. He does a lot of good charity work. If you’re going to have millionaires investing in geoengineering then Bill Gates would be one of the first.”

I made the point that yes, Bill Gates is now in philanthropic mode, and The Bill and Melinda Gates Foundation does praiseworthy work, but it’s not Bill Gates’ motives I’m worried about; it’s his worldview. That Silicon Valley, ‘we’ve-got-an-app-for-that’ kind of understanding. Joe Romm (Editor of Climate Progress) has been very critical of Gates for his dissing of renewable energy technology. Gates described solar energy as cute.

So Gates is drawn to big, new, shiny technological responses that clever people dream up. You know, brainstorming over pizza and coke. That’s where he comes from and that’s how he thinks. It’s one thing to think about computers and software that way, but it’s a completely different matter to think about the earth as a whole as in need of a snazzy new app that will solve the problem. I think that’s an extremely dangerous way to understand it for all sorts of reasons.

Perhaps the most important of which is that the climate change problem is not a technological one. It’s a social and political one. The more people focus on techno-fixes, the more they distract us from the real problems, which are the social and political difficulties of responding to climate change. We have the technology and have had it for many years. So arguing that the blockage is the absence of technology is extremely unhelpful and plays into the hands of the fossil fuel lobby.

In the book you say the slippery slope to a techno-fix promises a substitute for the slippery slope to revolution.

Revolutions take all different forms of course, from the industrial revolution to Tahrir Square to the cultural revolution of the 60s and 70s, and it’s more along the lines of the latter that I was thinking of, and into which new forms of climate activism can feed. There’s often a sort of terror in environmental groups that if they do something radical or outrageous, they’ll alienate mom and dad in the suburbs.

But social movements that have radically changed the way our world works — think of the women’s movement — have frequently started out being rancorous and difficult and attracting the derision of the conservative press and politicians. And indeed, mystifying and often alienating people living in the suburbs or high-rises. If people have to be shocked and outraged before they come around to seeing that some fundamental transformation is necessary, then so be it. I think that there’s a level of fear and complacency and unwillingness to change on the part of our societies, and some kind of circuit breaker is necessary.

You talk about the acceptance of the “solution” of geoengineering even by people who don’t seem to think climate change is a problem in the first place.

That’s one of the, on the face of it, mystifying aspects of the geoengineering debate: why conservative think tanks like The American Enterprise Institute, The Cato Institute and even The Heartland Institute, which have for years worked hard to deny climate science and block all measures to reduce carbon emissions, have come out in favor of geoengineering.

What it shows us is that the debate over climate change and the role of the deniers is not about the science. They want to make it about the science because that gives it an air of legitimacy, but it’s really about fundamental cultural and political values. So if geoengineering is the solution then they’re happy to concede that there’s a problem because geoengineering is a big, technological, macho, system-justifying response to climate change. And that’s the kind of response that fits with their political orientation.

How does the “American Way of Life” factor into all of this?

Already we’re seeing what authorities in the U.S. need to do in order to protect people from massive hurricanes and monster wildfires and frequent floods. As the effects of climate change become even more severe we’ll see nations like the U.S. that are in a position to adapt start spending billions of dollars doing so.

And of course, the more the climate deniers persuade politicians that hurricanes and wildfires aren’t due to climate change, the less responsive authorities are likely to be and the more people will die, in effect. Eventually it will be impossible to continue to pretend that mitigation is not the first best option. It might take five years, it might take ten, let’s hope it doesn’t take twenty.

The sort of tragedy of all this is that if the world had become serious ten or so years ago, the cost would have been vastly smaller than it’s going to be. But you know that’s in a perfect world where human beings are rational and take reasonable measures to protect themselves from the warnings of scientists. But we now know that the enlightenment conception of human beings as rational creatures who assess the evidence and take measures to protect themselves from harm has now collapsed before us. We can no longer maintain that belief.

That feeds into this idea of the anthropocene and the ethical implications of living in a world with climate change.

I am writing a book about the anthropocene because it seems to me that when human beings become so powerful that they transform the fundamental cycles and processes that govern the evolution of the earth itself that we’re entering into an era, or we’ve reached an event, that’s as significant as the industrial revolution, or even the process of civilization itself.

It causes us to rethink pretty much everything. It certainly causes us to rethink what the relationship of human beings is to the planet, but in a harder way, what is a human being. The sort of modern conception of what a human being is is an isolated ego existing inside a body. And most of us think that’s just what we are. But in fact that’s a very recent and culturally specific understanding of what a human being is. And it’s the conception of a human being that’s consistent with an advanced consumer society.

Collectively though, we are the kind of creatures, like certain types of microbes, that can completely transform the nature of the planet on which we live. If this is so, then it causes us to rethink who we are and what the place of this strange, clever creature is on planet earth.

We can no longer think of the Earth as the passive and unresponsive backdrop to the human drama where we play out our parts in a kind of Shakespearean play and not worry about the backdrop. We now find that the backdrop, the stage scenery, has entered into the play and is disrupting the whole proceedings.

Something very profound has happened. Human history, which we think of as only being a few thousand years old and is the history of human actions, has converged with geologic history, which we always thought of as operating in a very distinct domain having nothing to do with us. But now we find that our history affects the history of the earth.

If there is no more human history distinct from earth history, then what does that mean?

Geoengineering: Can We Save the Planet by Messing with Nature? (Democracy Now!)

Video: http://www.democracynow.org/2013/5/20/geoengineering_can_we_save_the_planet

Clive Hamilton, professor of public ethics at Charles Sturt University in Canberra, Australia. He is the author of the new book, Earthmasters: The Dawn of the Age of Climate Engineering.

Carbon Dioxide Removal Can Lower Costs of Climate Protection (Science Daily)

Apr. 12, 2013 — Directly removing CO2 from the air has the potential to alter the costs of climate change mitigation. It could allow prolonged greenhouse-gas emissions from sectors like transport that are difficult, and thus expensive, to turn away from fossil fuels. And it may help to constrain the financial burden on future generations, a study now published by the Potsdam Institute for Climate Impact Research (PIK) shows. It focuses on the use of biomass for energy generation, combined with carbon capture and storage (CCS). According to the analysis, carbon dioxide removal could be used under certain requirements to alleviate the most costly components of mitigation, but it would not replace the bulk of actual emissions reductions.

Directly removing CO2 from the air has the potential to alter the costs of climate change mitigation. It could allow prolonged greenhouse-gas emissions from sectors like transport that are difficult, and thus expensive, to turn away from fossil fuels. And it may help to constrain the financial burden on future generations, a new study shows. It focuses on the use of biomass for energy generation, combined with carbon capture and storage. (Credit: © Jürgen Fälchle / Fotolia)

“Carbon dioxide removal from the atmosphere allows us to separate emissions control from the time and location of the actual emissions. This flexibility can be important for climate protection,” says lead author Elmar Kriegler. “You don’t have to prevent emissions in every factory or truck, but could for instance plant grasses that suck CO2 out of the air to grow — and later get processed in bioenergy plants where the CO2 gets stored underground.”

In economic terms, this flexibility lowers costs by compensating for the emissions that would be most costly to eliminate. “This means that a phase-out of global emissions by the end of the century — which we would need to hold the 2 degree line adopted by the international community — does not necessarily require eliminating each and every source of emissions,” says Kriegler. “Decisions whether and how to protect future generations from the risks of climate change have to be made today, but the burden of achieving these targets will increase over time. The costs for future generations can be substantially reduced if carbon dioxide removal technologies become available in the long run.”

Balancing the financial burden across generations

The study now published is the first to quantify this. If bioenergy plus CCS is available, aggregate mitigation costs over the 21st century might be halved. In the absence of such a carbon dioxide removal strategy, costs for future generations rise significantly, up to a quadrupling of mitigation costs in the period of 2070 to 2090. The calculation was carried out using a computer simulation of the economic system, energy markets, and climate, covering a range of scenarios.

Options for carbon dioxide removal from the atmosphere include afforestation and chemical approaches like direct air capture of CO2 from the atmosphere or reactions of CO2 with minerals to form carbonates. But the use of biomass for energy generation combined with carbon capture and storage is less costly than chemical options, as long as sufficient biomass feedstock is available, the scientists point out.

Serious concerns about large-scale biomass use combined with CCS

“Of course, there are serious concerns about the sustainability of large-scale biomass use for energy,” says co-author Ottmar Edenhofer, chief economist of PIK. “We therefore considered the bioenergy with CCS option only as an example of the role that carbon dioxide removal could play for climate change mitigation.” The exploitation of bioenergy can conflict with land use for food production or ecosystem protection. To account for sustainability concerns, the study restricts bioenergy production to a medium level that could be realized mostly on abandoned agricultural land.

Still, global population growth and changing dietary habits, associated with an increased demand for land, as well as improvements of agricultural productivity, associated with a decreased demand for land, are important uncertainties here. Furthermore, CCS technology is not yet available for industrial-scale use and, due to environmental concerns, is controversial in countries like Germany. Yet in this study it is assumed that it will become available in the near future.

“CO2 removal from the atmosphere could enable humankind to keep the window of opportunity open for low-stabilization targets despite a likely delay in international cooperation, but only under certain requirements,” says Edenhofer. “The risks of scaling up bioenergy use need to be better understood, and safety concerns about CCS have to be thoroughly investigated. Still, carbon dioxide removal technologies are no science fiction and need to be further explored.” In no way should they be seen as a pretext to neglect emissions reductions now, notes Edenhofer. “By far the biggest share of climate change mitigation has to come from a large effort to reduce greenhouse-gas emissions globally.”

Journal Reference:

  1. Elmar Kriegler, Ottmar Edenhofer, Lena Reuster, Gunnar Luderer, David Klein. Is atmospheric carbon dioxide removal a game changer for climate change mitigation? Climatic Change, 2013; DOI: 10.1007/s10584-012-0681-4

Shading Earth: Delivering Solar Geoengineering Materials to Combat Global Warming May Be Feasible and Affordable (Science Daily)

ScienceDaily (Aug. 29, 2012) — A cost analysis of the technologies needed to transport materials into the stratosphere to reduce the amount of sunlight hitting Earth and therefore reduce the effects of global climate change has shown that they are both feasible and affordable.

A cost analysis of the technologies needed to transport materials into the stratosphere to reduce the amount of sunlight hitting Earth and therefore reduce the effects of global climate change has shown that they are both feasible and affordable. (Credit: © mozZz / Fotolia)

Published August 31, 2012, in IOP Publishing’s journal Environmental Research Letters, the study has shown that the basic technology currently exists and could be assembled and implemented in a number of different forms for less than USD $5 billion a year.

Put into context, the cost of reducing carbon dioxide emissions is currently estimated to be between 0.2 and 2.5 per cent of GDP in the year 2030, which is equivalent to roughly USD $200 to $2000 billion.
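As a rough arithmetic check on those figures (a sketch using only the numbers quoted above, not a calculation from the article itself), the dollar range implies an assumed world GDP in 2030 of roughly $80 to $100 trillion, and the $5 billion per year delivery cost is a small fraction of even the cheapest mitigation estimate:

```python
# Back out the implied 2030 world GDP from the article's figures:
# mitigation costs of 0.2-2.5% of GDP correspond to $200-$2000 billion.
implied_gdp_high = 200e9 / 0.002    # 0.2% of GDP = $200 billion -> $100 trillion
implied_gdp_low = 2000e9 / 0.025    # 2.5% of GDP = $2000 billion -> $80 trillion
print(implied_gdp_low, implied_gdp_high)

# The ~$5 billion/year SRM delivery cost is then about 2.5% of even
# the lowest mitigation-cost estimate quoted above.
srm_fraction = 5e9 / 200e9
print(srm_fraction)
```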

Solar radiation management (SRM) looks to induce effects similar to those observed after volcanic eruptions; however, the authors state that it is not a preferred strategy and that such a claim could only be made after thorough investigation of the implications, risks and costs involved.

The authors caution that reducing incident sunlight does nothing at all to reduce greenhouse gas concentrations in the atmosphere, nor the resulting increase in the acid content of the oceans. They note that other research has shown that the effects of solar radiation management are not uniform, and would cause different temperature and precipitation changes in different countries.

Co-author of the study, Professor Jay Apt, said: “As economists are beginning to explore the role of several types of geoengineering, it is important that a cost analysis of SRM is carried out. The basic feasibility of SRM with current technology is still being disputed and some political scientists and policy makers are concerned about unilateral action.”

In the study, the researchers, from Aurora Flight Sciences, Harvard University and Carnegie Mellon University, performed an engineering cost analysis on six systems capable of delivering 1-5 million metric tonnes of material to altitudes of 18-30 km: existing aircraft, a new airplane designed to perform at altitudes up to 30 km, a new hybrid airship, rockets, guns and suspended pipes carrying gas or slurry to inject the particles into the atmosphere.

Based on existing research into solar radiation management, the researchers performed their cost analyses for systems that could deliver around one million tonnes of aerosols each year at an altitude between 18 and 25 km and between a latitude range of 30°N and 30°S.

The study concluded that using aircraft is easily within the current capabilities of aerospace engineering, manufacturing and operations. The development of new, specialized aircraft appeared to be the cheapest option, with costs of around $1 to $2 billion a year; existing aircraft would be more expensive as they are not optimized for high altitudes and would need considerable and expensive modifications to do so.

Guns and rockets appeared to be capable of delivering materials at high altitudes but the costs associated with these are much higher than those of airplanes and airships due to their lack of reusability.

Although completely theoretical at this point in time, a large gas pipe, rising to 20 km in the sky and suspended by helium-filled floating platforms, would offer the lowest recurring cost-per-kilogram of particles delivered but the costs of research into the materials required, the development of the pipe and the testing to ensure safety, would be high; the whole system carries a large uncertainty.

Professor Apt continued: “We hope our study will help other scientists looking at more novel methods for dispersing particles and help them to explore methods with increased efficiency and reduced environmental risk.”

The researchers make it clear that they have not sought to address the science of aerosols in the stratosphere, nor issues of risk, effectiveness or governance that will add to the costs of solar radiation management geoengineering.

Journal Reference:

  1. Justin McClellan, David W. Keith, Jay Apt. Cost analysis of stratospheric albedo modification delivery systems. Environmental Research Letters, 2012; 7 (3): 034019. DOI: 10.1088/1748-9326/7/3/034019

Cloud Brightening to Control Global Warming? Geoengineers Propose an Experiment (Science Daily)

A conceptualized image of an unmanned, wind-powered, remotely controlled ship that could be used to implement cloud brightening. (Credit: John McNeill)

ScienceDaily (Aug. 20, 2012) — Even though it sounds like science fiction, researchers are taking a second look at a controversial idea that uses futuristic ships to shoot salt water high into the sky over the oceans, creating clouds that reflect sunlight and thus counter global warming.

University of Washington atmospheric physicist Rob Wood describes a possible way to run an experiment to test the concept on a small scale in a comprehensive paper published this month in the journal Philosophical Transactions of the Royal Society.

The point of the paper — which includes updates on the latest study into what kind of ship would be best to spray the salt water into the sky, how large the water droplets should be and the potential climatological impacts — is to encourage more scientists to consider the idea of marine cloud brightening and even poke holes in it. In the paper, he and a colleague detail an experiment to test the concept.

“What we’re trying to do is make the case that this is a beneficial experiment to do,” Wood said. With enough interest in cloud brightening from the scientific community, funding for an experiment may become possible, he said.

The theory behind so-called marine cloud brightening is that adding particles, in this case sea salt, to the sky over the ocean would form large, long-lived clouds. Clouds appear when water forms around particles. Since there is a limited amount of water in the air, adding more particles creates more, but smaller, droplets.

“It turns out that a greater number of smaller drops has a greater surface area, so it means the clouds reflect a greater amount of light back into space,” Wood said. That creates a cooling effect on Earth.

Marine cloud brightening is part of a broader concept known as geoengineering which encompasses efforts to use technology to manipulate the environment. Brightening, like other geoengineering proposals, is controversial for its ethical and political ramifications and the uncertainty around its impact. But those aren’t reasons not to study it, Wood said.

“I would rather that responsible scientists test the idea than groups that might have a vested interest in proving its success,” he said. The danger with private organizations experimenting with geoengineering is that “there is an assumption that it’s got to work,” he said.

Wood and his colleagues propose trying a small-scale experiment to test feasibility and begin to study effects. The test should start by deploying sprayers on a ship or barge to ensure that they can inject enough particles of the targeted size to the appropriate elevation, Wood and a colleague wrote in the report. An airplane equipped with sensors would study the physical and chemical characteristics of the particles and how they disperse.

The next step would be to use additional airplanes to study how the cloud develops and how long it remains. The final phase of the experiment would send out five to 10 ships spread out across a 100 kilometer, or 62 mile, stretch. The resulting clouds would be large enough so that scientists could use satellites to examine them and their ability to reflect light.

Wood said there is very little chance of long-term effects from such an experiment. Based on studies of pollutants, which emit particles that cause a similar reaction in clouds, scientists know that the impact of adding particles to clouds lasts only a few days.

Still, such an experiment would be unusual in the world of climate science, where scientists observe rather than actually try to change the atmosphere.

Wood notes that running the experiment would advance knowledge around how particles like pollutants impact the climate, although the main reason to do it would be to test the geoengineering idea.

A phenomenon that inspired marine cloud brightening is ship trails: clouds that form behind the paths of ships crossing the ocean, similar to the trails that airplanes leave across the sky. Ship trails form around particles released from burning fuel.

But in some cases ship trails make clouds darker. “We don’t really know why that is,” Wood said.

Despite increasing interest from scientists like Wood, there is still strong resistance to cloud brightening.

“It’s a quick-fix idea when really what we need to do is move toward a low-carbon emission economy, which is turning out to be a long process,” Wood said. “I think we ought to know about the possibilities, just in case.”

The authors of the paper are treading cautiously.

“We stress that there would be no justification for deployment of [marine cloud brightening] unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favor of such action,” they wrote in the paper’s summary.

There are 25 authors on the paper, including scientists from University of Leeds, University of Edinburgh and the Pacific Northwest National Laboratory. The lead author is John Latham of the National Center for Atmospheric Research and the University of Manchester, who pioneered the idea of marine cloud brightening.

Wood’s research was supported by the UW College of the Environment Institute.

Journal Reference:

  1. J. Latham, K. Bower, T. Choularton, H. Coe, P. Connolly, G. Cooper, T. Craft, J. Foster, A. Gadian, L. Galbraith, H. Iacovides, D. Johnston, B. Launder, B. Leslie, J. Meyer, A. Neukermans, B. Ormond, B. Parkes, P. Rasch, J. Rush, S. Salter, T. Stevenson, H. Wang, Q. Wang, R. Wood. Marine cloud brightening. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2012; 370 (1974): 4217. DOI: 10.1098/rsta.2012.0086