Tag archive: Meteorology

Taking stock of the first week of COP19 (Vitae Civilis)

Environment
18/11/2013 – 09h10

by Délcio Rodrigues and Silvia Dias*


By the end of the first week of COP19, a sense of déjà vu is inevitable. Once again, the Filipino negotiator delivered the most moving speech. Once again, Germanwatch reports that poor countries are the most vulnerable to extreme climate events. Once again, in fact, an extreme weather event is claiming thousands of victims while the conference takes place. Once again, we hear that we are living through the hottest years in the planet's recent history, that the concentration of greenhouse gases in the atmosphere has already reached alarming levels, that the right course would be to leave fossil fuel reserves untouched…

Even the new IPCC report arrives with a certain taste of old news. Despite its greater detail and greater scientific certainty, the AR5 basically confirms that we are on a trajectory that will exhaust, as early as 2030, all the carbon we can burn this century without dangerously altering the planet's climate. Likewise, the International Energy Agency (IEA) confirms what a strong campaign against fossil fuel subsidies at COP18 had already exposed. According to the IEA, governments spent US$523 billion on fossil fuel subsidies in 2011 – a complete inversion of priorities from a climate change standpoint: for every US$1 in support for renewable energy, another US$6 promote carbon-intensive fuels. Part of these fossil fuel subsidies occur in emerging and developing countries – witness the gasoline subsidies the Brazilian government imposes on Petrobras. But they may matter even more in rich countries. Research by the UK's Overseas Development Institute showed that subsidies for fossil fuel consumption in 11 OECD countries total US$72 billion, or about US$112 per adult inhabitant of these countries.

This economic perversity strangles at birth the technological innovations that could help us avoid the imminent collision between the global economy (and its energy system) and our planet's ecological limits. Recent developments in wind, solar, biofuels, geothermal, tidal power, fuel cells and energy efficiency are expanding the possibilities for building a low-carbon energy future. Besides averting the climate crisis, these technologies could open new investment opportunities, supply affordable energy and sustain growth. But this potential will only be realized if governments actively pursue sustainable industrial policies. The goal of mitigating the climate crisis must be aligned with disincentives for carbon-intensive energy sources, through taxation, and with support for sustainable alternatives.

The end of fossil fuel subsidies must be accompanied by policies that favor the transfer of clean technologies. We cannot overlook the example of China, India and also Brazil, to which multinationals have historically shipped dirty, energy-intensive production platforms. Unfortunately, the negotiations on technology are among the most gridlocked – both in the previous format, established by the Bali Road Map, and now, under the so-called Durban Platform. Meanwhile, we have learned through WikiLeaks about the Trans-Pacific Partnership (TPP) provisions on patents and intellectual property – an agreement being negotiated in secret among the leaders of 12 countries that concentrate 40% of global GDP and one third of global trade, and which seeks to impose more aggressive measures against intellectual property infringement.

The gap between what science recommends and what governments are promoting persists, regardless of the format of the climate negotiations. We moved from the two tracks established in Bali to the Durban Platform, but the financial commitments and more aggressive mitigation targets never came. In the first week of COP19, negotiators' speeches revived archaic positions that obstruct the process. Yes, we already knew this would not be a conference of great results. But the fact is that the bad guys decided to be truly bad, under the complacent leadership of a presidency that is not embarrassed to flaunt its support for coal and other fossil fuels. So much so that Russia gave up on blocking the process, saving its complaints about the UNFCCC process for another occasion.

That other occasion may be COP20 in Peru, where hopes for more productive negotiations now turn. Before that, however, comes Ban Ki-moon's summit, to which the countries' leaders have been invited. The goal is to generate the political will that was missing in Copenhagen and to try to define targets before the home stretch of the agreement, in Paris. That meeting should be preceded and followed by several intersessional meetings so that delegates can advance in stitching the agreement together and so that critical items, such as mitigation targets and financing, begin to take more concrete shape.

In other words, a consistent schedule of meetings and a commitment to present targets next year are the best outcome we can expect from a conference that risks going down in history as the COP of coal.

Délcio Rodrigues is a climate change specialist at Vitae Civilis. Silvia Dias, a member of Vitae Civilis's Deliberative Council, has followed the climate negotiations since 2009.

Climate change pledges: rich nations face fury over moves to renege (The Guardian)

Typhoon Haiyan raises fear over global warming threat as Philippines leads attack on eve of key talks

 in Warsaw

The Observer, Sunday 17 November 2013

Typhoon Haiyan

Survivors of Typhoon Haiyan form a queue to receive relief goods at a devastated coastal area in Leyte. Photograph: Dondi Tawatao/Getty Images

Developing nations have launched an impassioned attack on the failure of the world’s richest countries to live up to their climate change pledges in the wake of the disaster in the Philippines.

With more than 3,600 people now believed to have been killed by Typhoon Haiyan, moves by several major economies to backtrack on commitments over carbon emissions have put the world’s poorest and most wealthy states on a collision course, on the eve of crucial high-level talks at a summit of world powers.

Yeb Sano, the Philippines’ lead negotiator at the UN climate change summit being held this weekend in Warsaw, spoke of a major breakdown in relations overshadowing the crucial talks, which are due to pave the way for a 2015 deal to bring down global emissions.

The diplomat, on the sixth day of a hunger strike in solidarity with those affected by Haiyan, including his own family, told the Observer: “We are very concerned. Public announcements from some countries about lowering targets are not conducive to building trust. We must acknowledge the new climate reality and put forward a new system to help us manage the risks and deal with the losses to which we cannot adjust.”

Munjurul Hannan Khan, representing the world’s 47 least affluent countries, said: “They are behaving irrationally and unacceptably. The way they are talking to the most vulnerable countries is not acceptable. Today the poor are suffering from climate change. But tomorrow the rich countries will be. It starts with us but it goes to them.”

Recent decisions by the governments of Australia, Japan and Canada to downgrade their efforts over climate change have caused panic among those states most affected by global warming, who fear others will follow as they rearrange their priorities during the downturn.

In the last few days, Japan has announced it will backtrack on its climate pledge, cutting its 2020 emissions reduction target from 25% to 3.8%, on the basis that it had to close its nuclear reactors after the 2011 earthquake and tsunami.

Australia, which is not sending a minister to this weekend’s talks, signalled it may weaken its targets and is repealing domestic carbon laws following the election of a conservative government.

Canada has pulled out of the Kyoto accord, which committed major industrial economies to reducing their annual CO2 emissions to below 1990 levels.

China’s lead negotiator at the Warsaw talks, Su Wei, said: “I do not have any words to describe my dismay at Japan’s decision.” He criticised Europe for showing a lack of ambition to cut emissions further, adding: “They talk about ratcheting up ambition, but rather they would have to ratchet up ambition from zero ambition.”

When the highest-level talks start at the summit on Monday, due to be attended by representatives from 195 countries, including energy secretary Ed Davey, the developing world will seek confirmation from states such as Britain that they will not follow the path of Japan and others. David Cameron’s comments this weekend in which he backed carbon emission cuts and suggested that there was growing evidence of a link between manmade climate change and disasters such as Typhoon Haiyan, will inevitably be used to pressure others to offer similar assurances.

The developing world also wants the rich western nations to commit to establishing a compensation scheme for future extreme weather events, as the impact of global warming is increasingly felt. And they want firm signals that rich countries intend to find at least $100bn a year by 2020 to help them to adapt their countries to severe climate extremes.

China and 132 nations that are part of the G77 bloc of developing countries have expressed dismay that rich countries had refused to discuss a proposal for scientists to calculate emissions since the start of the Industrial Revolution.

Ambassador Jose Antonio Marcondes de Carvalho of Brazil, who initially proposed the talks, said: “We were shocked, very much surprised by their rejection and dismissal. It is puzzling. We need to understand why they have rejected it.

“Developing countries are doing vastly more to reduce their emissions than Annexe 1 [rich] countries.”

Members of the Disaster Emergencies Committee, which co-ordinates British aid efforts, also warned leaders that the disaster offers a glimpse of the future if urgent action is not taken.

Aid agencies including Christian Aid, Cafod, Care International, Oxfam and Tearfund said ministers meeting in the Polish capital must act urgently because climate change is likely to make such extreme weather events more common in the future, putting millions more lives at risk.

A Climate-Change Victory (Slate)

If global warming is slowing, thank the Montreal Protocol.



No CFCs, please. (Photo by iStock)

Climate deniers like to point to the so-called global warming “hiatus” as evidence that humans aren’t changing the climate. But according to a new study, exactly the opposite is true: The recent slowdown in global temperature increases is partially the result of one of the few successful international crackdowns on greenhouse gases.

Back in 1987, more than 40 countries, including the United States, signed the Montreal Protocol, an agreement to phase out the use of ozone-depleting gases like chlorofluorocarbons. (Today the protocol has nearly 200 signatories.) According to the Environmental Protection Agency, CFC emissions are down 90 percent since the protocol, a drop that the agency calls “one of the largest reductions to date in global greenhouse gas emissions.” That’s a blessing for the ozone layer, but also for the climate. CFCs are potent heat-trapping gases, and a new analysis published in Nature Geoscience finds that slashing them has been a major driver of the much-discussed slowdown in global warming.

Without the protocol, environmental economist Francisco Estrada of the Universidad Nacional Autónoma de México reports, global temperatures today would be about a tenth of a degree Celsius higher than they are. That’s roughly an eighth of the total warming documented since 1880.

Estrada and his co-authors compared global temperature and greenhouse gas emissions records over the last century and found that breaks in the steady upward march of both coincided closely. At times when emissions leveled off or dropped, such as during the Great Depression, the trend was mirrored in temperatures; likewise for when emissions climbed.

“With these breaks, what’s interesting is that when they’re common that’s pretty indicative of causation,” said Pierre Perron, a Boston University economist who developed the custom-built statistical tests used in the study.
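Perron's published break tests are considerably more involved, but the core idea — locating a date where a trend changes — can be illustrated with a simple grid search: for each candidate break point, fit separate linear trends to the two segments and keep the candidate that minimizes the combined squared error. A minimal sketch in Python, on synthetic data (not the study's actual method or data):

```python
import numpy as np

def find_break(t, y, min_seg=5):
    """Grid-search a single structural break: try every candidate
    break index, fit a linear trend to each segment, and return the
    index with the smallest combined sum of squared residuals."""
    best_idx, best_sse = None, np.inf
    for k in range(min_seg, len(t) - min_seg):
        sse = 0.0
        for seg_t, seg_y in ((t[:k], y[:k]), (t[k:], y[k:])):
            coef = np.polyfit(seg_t, seg_y, 1)          # slope, intercept
            sse += np.sum((seg_y - np.polyval(coef, seg_t)) ** 2)
        if sse < best_sse:
            best_idx, best_sse = k, sse
    return best_idx

# Synthetic series: a warming trend that flattens after index 60
rng = np.random.default_rng(0)
t = np.arange(100.0)
y = np.where(t < 60, 0.02 * t, 0.02 * 60) + rng.normal(0, 0.02, 100)
print(find_break(t, y))   # close to 60
```

Applying the same search to two series (emissions and temperature) and checking whether the estimated break dates coincide is the intuition behind the "common breaks" argument quoted above.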

The findings put a new spin on investigation into the cause of the recent “hiatus.” Scientists have suggested that several temporary natural phenomena, including the deep ocean sucking up more heat, are responsible for this slowdown. Estrada says his findings show that a recent reduction in heat-trapping CFCs as a result of the Montreal Protocol has also played an important role.

“Paradoxically, the recent decrease in warming, presented by global warming skeptics as proof that humankind cannot affect the climate system, is shown to have a direct human origin,” Estrada writes in the study.

The chart below, from a column accompanying the study, illustrates that impact. The solid blue line shows the amount of warming relative to pre-industrial levels attributed to CFCs and other gases regulated by the Montreal Protocol; the dashed blue line is an extrapolation of what the level would be without the agreement. Green represents warming from methane; Estrada suggests that leveling out may be the result of improved farming practices in Asia. The diamonds are annual global temperature averages, with the red line fitted to them. The dashed red line represents Estrada’s projection of where global temperature would be without these recent mitigation efforts.


Courtesy of Francisco Estrada via Mother Jones

Estrada said his study doesn’t undermine the commonly accepted view among climate scientists that the global warming effect of greenhouse gases can take years or decades to fully manifest. Even if we cut off all emissions today, we’d still very likely see warming into the future, thanks to the long shelf life of carbon dioxide, the principal climate-change culprit. The study doesn’t let CO2 off the hook: The reduction in warming would likely have been even greater if CO2 had leveled off as much as CFCs and methane. Instead, Estrada said, it has increased 20 percent since the protocol was signed.

Still, the study makes clear that efforts to reduce greenhouse gas emissions—like a recent international plan to phase out hydrofluorocarbons, a group of cousin chemicals to CFCs that are used in air conditioners and refrigerators, and the Obama administration’s move this year to impose strict new limits on emissions from power plants—can have a big payoff.

“The Montreal Protocol was really successful,” Estrada said. And as policymakers and climate scientists gather in Warsaw, Poland, for the latest U.N. climate summit next week, “this shows that international agreements can really work.”

Scientists Eye Longer-Term Forecasts of U.S. Heat Waves (Science Daily)

Oct. 27, 2013 — Scientists have fingerprinted a distinctive atmospheric wave pattern high above the Northern Hemisphere that can foreshadow the emergence of summertime heat waves in the United States more than two weeks in advance.

This map of air flow a few miles above ground level in the Northern Hemisphere shows the type of wavenumber-5 pattern associated with US drought. This pattern includes alternating troughs (blue contours) and ridges (red contours), with an “H” symbol (for high pressure) shown at the center of each of the five ridges. High pressure tends to cause sinking air and suppress precipitation, which can allow a heat wave to develop and intensify over land areas. (Credit: Image courtesy Haiyan Teng.)

The new research, led by scientists at the National Center for Atmospheric Research (NCAR), could potentially enable forecasts of the likelihood of U.S. heat waves 15-20 days out, giving society more time to prepare for these often-deadly events.

The research team discerned the pattern by analyzing a 12,000-year simulation of the atmosphere over the Northern Hemisphere. During those times when a distinctive “wavenumber-5” pattern emerged, a major summertime heat wave became more likely to subsequently build over the United States.

“It may be useful to monitor the atmosphere, looking for this pattern, if we find that it precedes heat waves in a predictable way,” says NCAR scientist Haiyan Teng, the lead author. “This gives us a potential source to predict heat waves beyond the typical range of weather forecasts.”

The wavenumber-5 pattern refers to a sequence of alternating high- and low-pressure systems (five of each) that form a ring circling the northern midlatitudes, several miles above the surface. This pattern can lend itself to slow-moving weather features, raising the odds for stagnant conditions often associated with prolonged heat spells.
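In spectral terms, "wavenumber-5" is the fifth Fourier harmonic of the flow around a latitude circle. As an illustrative sketch (not the study's code), the amplitude of that harmonic can be extracted from gridded height anomalies with a Fourier transform along longitude:

```python
import numpy as np

def wave_amplitude(z_anom, k=5):
    """Amplitude of zonal wavenumber k in a 1-D field sampled at
    evenly spaced longitudes around a full latitude circle."""
    n = len(z_anom)
    spec = np.fft.rfft(z_anom)
    # Multiply by 2/n to convert the complex Fourier coefficient into
    # the amplitude of the real cosine wave at wavenumber k.
    return 2.0 * np.abs(spec[k]) / n

# Synthetic height anomaly: a 120 m wavenumber-5 ridge/trough pattern
lons = np.deg2rad(np.arange(0, 360, 2.5))          # 144 grid points
z = 120.0 * np.cos(5 * lons + 0.7)
print(round(wave_amplitude(z), 1))
```

Monitoring how this amplitude evolves over time is one simple way to quantify when the pattern is "more amplified," as discussed below.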

The study is being published next week in Nature Geoscience. It was funded by the U.S. Department of Energy, NASA, and the National Science Foundation (NSF), which is NCAR’s sponsor. NASA scientists helped guide the project and are involved in broader research in this area.

Predicting a lethal event

Heat waves are among the most deadly weather phenomena on Earth. A 2006 heat wave across much of the United States and Canada was blamed for more than 600 deaths in California alone, and a prolonged heat wave in Europe in 2003 may have killed more than 50,000 people.

To see if heat waves can be triggered by certain large-scale atmospheric circulation patterns, the scientists looked at data from relatively modern records dating back to 1948. They focused on summertime events in the United States in which daily temperatures reached the top 2.5 percent of weather readings for that date across roughly 10 percent or more of the contiguous United States. However, since such extremes are rare by definition, the researchers could identify only 17 events that met such criteria — not enough to tease out a reliable signal amid the noise of other atmospheric behavior.
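The event criterion described above — daily temperatures in the top 2.5% for that calendar date over roughly 10% or more of the area — can be sketched as a percentile-threshold computation. The array shape and the station-count proxy for area below are assumptions for illustration, not the study's method:

```python
import numpy as np

def heat_wave_days(temps, frac_area=0.10, pct=97.5):
    """Flag days meeting a simplified version of the criterion.

    temps: array of shape (years, days_per_year, stations) of daily
    temperatures. For each calendar date, the per-station threshold is
    the `pct` percentile across years; a day qualifies when at least
    `frac_area` of the stations exceed their threshold for that date.
    """
    # Threshold per (calendar date, station), computed across years
    thresh = np.percentile(temps, pct, axis=0)          # (days, stations)
    exceed = temps > thresh[None, :, :]                 # (years, days, stations)
    return exceed.mean(axis=2) >= frac_area             # (years, days)

# Tiny synthetic demo: 30 years x 90 summer days x 50 stations
rng = np.random.default_rng(1)
temps = rng.normal(30, 3, size=(30, 90, 50))
temps[7, 40:45, :] += 12.0          # implant a broad 5-day heat wave
flags = heat_wave_days(temps)
print(flags[7, 40:45].all())
```

With only 17 qualifying events in the observational record, flags this sparse are exactly why the researchers turned to a long model simulation for statistics.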

The group then turned to an idealized simulation of the atmosphere spanning 12,000 years. The simulation had been created a couple of years before with a version of the NCAR-based Community Earth System Model, which is funded by NSF and the Department of Energy.

By analyzing more than 5,900 U.S. heat waves simulated in the computer model, they determined that the heat waves tended to be preceded by a wavenumber-5 pattern. This pattern is not caused by particular oceanic conditions or heating of Earth’s surface, but instead arises from naturally varying conditions of the atmosphere. It was associated with an atmospheric phenomenon known as a Rossby wave train that encircles the Northern Hemisphere along the jet stream.

During the 20 days leading up to a heat wave in the model results, the five ridges and five troughs that make up a wavenumber-5 pattern tended to propagate very slowly westward around the globe, moving against the flow of the jet stream itself. Eventually, a high-pressure ridge moved from the North Atlantic into the United States, shutting down rainfall and setting the stage for a heat wave to emerge.

When wavenumber-5 patterns in the model were more amplified, U.S. heat waves became more likely to form 15 days later. In some cases, the probability of a heat wave was more than quadruple what would be expected by chance.

In follow-up work, the research team turned again to actual U.S. heat waves since 1948. They found that some historical heat waves were indeed preceded by the large-scale circulation pattern characteristic of a wavenumber-5 event.

Extending forecasts beyond 10 days

The research finding suggests that scientists are making progress on a key meteorological goal: forecasting the likelihood of extreme events more than 10 days in advance. At present, there is very limited skill in such long-term forecasts.

Previous research on extending weather forecasts has focused on conditions in the tropics. For example, scientists have found that El Niño and La Niña, the periodic warming and cooling of surface waters in the central and eastern tropical Pacific Ocean, are correlated with a higher probability of wet or dry conditions in different regions around the globe. In contrast, the wavenumber-5 pattern does not rely on conditions in the tropics. However, the study does not exclude the possibility that tropical rainfall could act to stimulate or strengthen the pattern.

Now that the new study has connected a planetary wave pattern to a particular type of extreme weather event, Teng and her colleagues will continue searching for other circulation patterns that may presage extreme weather events.

“There may be sources of predictability that we are not yet aware of,” she says. “This brings us hope that the likelihood of extreme weather events that are damaging to society can be predicted further in advance.”

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this release are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Journal Reference:

  1. Haiyan Teng, Grant Branstator, Hailan Wang, Gerald A. Meehl, Warren M. Washington. Probability of US heat waves affected by a subseasonal planetary wave pattern. Nature Geoscience, 2013; DOI: 10.1038/ngeo1988

Will U.S. Hurricane Forecasting Models Catch Up to Europe’s? (National Geographic)


A satellite view of Hurricane Sandy on October 28, 2012.

Photograph by Robert Simmon, NASA Earth Observatory and NASA/NOAA GOES Project Science team

Willie Drye

for National Geographic

Published October 27, 2013

If there was a bright spot amid Hurricane Sandy’s massive devastation, including 148 deaths, at least $68 billion in damages, and the destruction of thousands of homes, it was the accuracy of the forecasts predicting where the storm would go.

Six days before Sandy came ashore one year ago this week—while the storm was still building in the Bahamas—forecasters predicted it would make landfall somewhere between New Jersey and New York City on October 29.

They were right.

Sandy, which had by then weakened from a Category 2 hurricane to an unusually potent Category 1, came ashore just south of Atlantic City, a few miles from where forecasters said it would, on the third to last day of October.

“They were really, really excellent forecasts,” said University of Miami meteorologist Brian McNoldy. “We knew a week ahead of time that something awful was going to happen around New York and New Jersey.”

That knowledge gave emergency management officials in the Northeast plenty of time to prepare, issuing evacuation orders for hundreds of thousands of residents in New Jersey and New York.

Even those who ignored the order used the forecasts to make preparations, boarding up buildings, stocking up on food and water, and buying gasoline-powered generators.

But there’s an important qualification about the excellent forecasts that anticipated Sandy’s course: The best came from a European hurricane prediction program.

The six-day-out landfall forecast arrived courtesy of a computer model run by the European Centre for Medium-Range Weather Forecasts (ECMWF), which is based in England.

Most of the other models in use at the National Hurricane Center in Miami, including the U.S. Global Forecast System (GFS), didn’t start forecasting a U.S. landfall until four days before the storm came ashore. At the six-day-out mark, that model and others at the National Hurricane Center had Sandy veering away from the Atlantic Coast, staying far out at sea.

“The European model just outperformed the American model on Sandy,” says Kerry Emanuel, a meteorologist at Massachusetts Institute of Technology.

Now, U.S. weather forecasting programmers are working to close the gap between the U.S. Global Forecast System and the European model.

There’s more at stake than simple pride. “It’s to our advantage to have two excellent models instead of just one,” says McNoldy. “The more skilled models you have running, the more you know about the possibilities for a hurricane’s track.”

And, of course, the more lives you can save.

Data, Data, Data

The computer programs that meteorologists rely on to predict the courses of storms draw on lots of data.

U.S. forecasting computers and their European counterparts rely on radar that provides information on cloud formations and the rotation of a storm, on orbiting satellites that show precisely where a storm is, and on hurricane-hunter aircraft that fly into storms to collect wind speeds, barometric pressure readings, and water temperatures.

Hundreds of buoys deployed along the Atlantic and Gulf coasts, meanwhile, relay information about the heights of waves being produced by the storm.

All this data is fed into computers at the National Centers for Environmental Prediction at Camp Springs, Maryland, which use it to run the forecast models. Those computers, linked to others at the National Hurricane Center, translate the computer models into official forecasts.

The forecasters use data from all computer models—including the ECMWF—to make their forecasts four times daily.

Forecasts produced by various models often diverge, leaving plenty of room for interpretation by human forecasters.

“Usually, it’s kind of a subjective process as far as making a human forecast out of all the different computer runs,” says McNoldy. “The art is in the interpretation of all of the computer models’ outputs.”

There are two big reasons why the European model is usually more accurate than U.S. models. First, the ECMWF model is a more sophisticated program that incorporates more data.

Second, the European computers that run the program are more powerful than their U.S. counterparts and are able to do more calculations more quickly.

“They don’t have any top-secret things,” McNoldy said. “Because of their (computer) hardware, they can implement more sophisticated code.”

A consortium of European nations began developing the ECMWF in 1976, and the model has been fueled by a series of progressively more powerful supercomputers in England. It got a boost when the European Union was formed in 1993 and member states started contributing taxes for more improvements.

The ECMWF and the GFS are the two primary models that most forecasters look at, said Michael Laca, producer of TropMet, a website that focuses on hurricanes and other severe weather events.

Laca said that forecasts and other data from the ECMWF are provided to forecasters in the U.S. and elsewhere who pay for the information.

“The GFS, on the other hand, is freely available to everyone, and is funded—or defunded—solely through (U.S.) government appropriations,” Laca said.

And since funding for U.S. research and development is subject to funding debates in Congress, U.S. forecasters are “in a hard position to keep pace with the ECMWF from a research and hardware perspective,” Laca said.

Hurricane Sandy wasn’t the first or last hurricane for which the ECMWF was the most accurate forecast model. It has consistently outperformed the GFS and four other U.S. and Canadian forecasting models.

Greg Nordstrom, who teaches meteorology at Mississippi State University in Starkville, said the European model provided much more accurate forecasts for Hurricane Isaac in August 2012 and for Tropical Storm Karen earlier this year.

“This doesn’t mean the GFS doesn’t beat the Euro from time to time,” he says.  “But, overall, the Euro is king of the global models.”

McNoldy says the European Union’s generous funding of research and development of their model has put it ahead of the American version. “Basically, it’s a matter of resources,” he says. “If we want to catch up, we will. It’s important that we have the best forecasting in the world.”

European developers who work on forecasting software have also benefited from better cooperation between government and academic researchers, says MIT’s Emanuel.

“If you talk to (the National Oceanic and Atmospheric Administration), they would deny that, but there’s no real spirit of cooperation (in the U.S.),” he says. “It’s a cultural problem that will not get fixed by throwing more money at the problem.”

Catching Up Amid Chaos

American computer models’ accuracy in forecasting hurricane tracks has improved dramatically since the 1970s. The average margin of error for a three-day forecast of a hurricane’s track has dropped from 500 miles in 1972 to 115 miles in 2012.

And NOAA is in the middle of a ten-year program intended to dramatically improve the forecasting of hurricanes’ tracks and their likelihood to intensify, or become stronger before landfall.

One of the project’s centerpieces is the Hurricane Weather Research and Forecasting model, or HWRF. In development since 2007, it’s similar to the ECMWF in that it will incorporate more data into its forecasting, including data from the GFS model.

Predicting the likelihood that a hurricane will intensify is difficult. For a hurricane to gain strength, it needs humid air, seawater heated to at least 80ºF, and little wind shear to disrupt its circulation.

In 2005, Hurricane Wilma encountered those perfect conditions and in just 30 hours strengthened from a tropical storm with peak winds of about 70 miles per hour to the most powerful Atlantic hurricane on record, with winds exceeding 175 miles per hour.

But hurricanes are as delicate as they are powerful. Seemingly small environmental changes, like passing over water that’s slightly cooler than 80ºF or ingesting drier air, can rapidly weaken a storm. And the environment is constantly changing.

“Over the next five years, there may be some big breakthrough to help improve intensification forecasting,” McNoldy said. “But we’re still working against the basic chaos in the atmosphere.”

He thinks it will take at least five to ten years for the U.S. to catch up with the European model.

MIT’s Emanuel says three factors will determine whether more accurate intensification forecasting is in the offing: the development of more powerful computers that can accommodate more data, a better understanding of hurricane intensity, and whether researchers reach a point at which no further improvements to intensification forecasting are possible.

Emanuel calls that point the “prediction horizon” and says it may have already been reached: “Our level of ignorance is still too high to know.”

Predictions and Responses

Assuming we’ve not yet hit that point, better predictions could dramatically improve our ability to weather hurricanes.

The more advance warning, the more time there is for those who do choose to heed evacuation orders. Earlier forecasting would also allow emergency management officials more time to provide transportation for poor, elderly, and disabled people unable to flee on their own.

More accurate forecasts would also reduce evacuation expenses.

Estimates of the cost of evacuating coastal areas before a hurricane vary considerably, but it has been calculated that it costs $1 million for every mile of coastline evacuated. That includes lost commerce, lost wages and salaries for those who leave, and the costs of the evacuation itself, such as travel and shelter.

Better forecasts could reduce the size of evacuation areas and save money.
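To make that back-of-the-envelope arithmetic concrete, here is a minimal sketch. The $1 million-per-mile figure is the article’s; the evacuation-zone widths are made-up numbers chosen purely for illustration.

```python
# Rough illustration of evacuation cost vs. forecast precision.
# The ~$1 million per evacuated mile of coastline comes from the article;
# the zone widths below are hypothetical.
COST_PER_MILE = 1_000_000  # USD: lost commerce, forgone wages, travel, shelter

def evacuation_cost(miles_of_coastline: float) -> float:
    """Estimated total cost of evacuating a stretch of coastline."""
    return miles_of_coastline * COST_PER_MILE

# Suppose a vague forecast forces a 300-mile evacuation, a sharper one 180 miles.
wide, narrow = evacuation_cost(300), evacuation_cost(180)
print(f"Savings from the sharper forecast: ${wide - narrow:,.0f}")
```

Even a modest narrowing of the warned area translates into nine-figure savings at the article’s quoted rate.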

They would also allow officials to get a jump on hurricane response. The Federal Emergency Management Agency tries to stockpile relief supplies far enough away from an expected hurricane landfall to avoid damage from the storm, but near enough that the supplies can quickly be moved to affected areas afterwards.

More reliable landfall forecasts would help FEMA position recovery supplies closer to where they’ll be needed.

Whatever improvements are made, McNoldy warns, forecasting will never be foolproof. However dependable they become, he said, “models will always be imperfect.”

Terrestrial Ecosystems at Risk of Major Shifts as Temperatures Increase (Science Daily)

Oct. 8, 2013 — Over 80% of the world’s ice-free land is at risk of profound ecosystem transformation by 2100, a new study reveals. “Essentially, we would be leaving the world as we know it,” says Sebastian Ostberg of the Potsdam Institute for Climate Impact Research, Germany. Ostberg and collaborators studied the critical impacts of climate change on landscapes and have now published their results in Earth System Dynamics, an open access journal of the European Geosciences Union (EGU).

This image shows simulated ecosystem change by 2100, depending on the degree of global temperature increase: 2 degrees Celsius (upper image) or 5 degrees Celsius (lower image) above preindustrial levels. The parameter Γ (Gamma) measures how far apart a future ecosystem under climate change would be from the present state. Blue colours (lower Γ) depict areas of moderate change; yellow to red areas (higher Γ) show major change. The maps show the median value of the Γ parameter across all climate models, meaning at least half of the models agree on major change in the yellow to red areas, and at least half of the models are below the threshold for major change in the blue areas. (Credit: Ostberg et al., 2013)

The researchers state in the article that “nearly no area of the world is free” from the risk of climate change transforming landscapes substantially, unless mitigation limits warming to around 2 degrees Celsius above preindustrial levels.

Ecosystem changes could include boreal forests being transformed into temperate savannas, trees growing in the freezing Arctic tundra or even a dieback of some of the world’s rainforests. Such profound transformations of land ecosystems have the potential to affect food and water security, and hence impact human well-being just like sea level rise and direct damage from extreme weather events.

The new Earth System Dynamics study indicates that up to 86% of the remaining natural land ecosystems worldwide could be at risk of major change in a business-as-usual scenario (see note). This assumes that the global mean temperature will be 4 to 5 degrees warmer at the end of this century than in pre-industrial times — given many countries’ reluctance to commit to binding emissions cuts, such warming is not out of the question by 2100.

“The research shows there is a large difference in the risk of major ecosystem change depending on whether humankind continues with business as usual or if we opt for effective climate change mitigation,” Ostberg points out.

But even if the warming is limited to 2 degrees, some 20% of land ecosystems — particularly those at high altitudes and high latitudes — are at risk of moderate or major transformation, the team reveals.

The researchers studied over 150 climate scenarios, looking at ecosystem changes in nearly 20 different climate models for various degrees of global warming. “Our study is the most comprehensive and internally consistent analysis of the risk of major ecosystem change from climate change at the global scale,” says Wolfgang Lucht, also an author of the study and co-chair of the research domain Earth System Analysis at the Potsdam Institute for Climate Impact Research.

Few previous studies have looked into the global impact of rising temperatures on ecosystems because of how complex and interlinked these systems are. “Comprehensive theories and computer models of such complex systems and their dynamics up to the global scale do not exist.”

To get around this problem, the team measured simultaneous changes in the biogeochemistry of terrestrial vegetation and the relative abundance of different vegetation species. “Any significant change in the underlying biogeochemistry presents an ecological adaptation challenge, fundamentally destabilising our natural systems,” explains Ostberg.

The researchers defined a parameter to measure how far apart a future ecosystem under climate change would be from the present state. The parameter encompasses changes in variables such as the vegetation structure (from trees to grass, for example), the carbon stored in the soils and vegetation, and freshwater availability. “Our indicator of ecosystem change is able to measure the combined effect of changes in many ecosystem processes, instead of looking only at a single process,” says Ostberg.
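The paper’s actual Γ metric has its own published definition; the sketch below is only a toy illustration of the general idea of collapsing change across several ecosystem variables into a single distance-like number. The variable names mirror those mentioned in the article, but the values and the simple root-mean-square of relative changes are assumptions, not the authors’ formula.

```python
import math

# Toy combined ecosystem-change indicator: how far a simulated future state
# sits from the present state across several variables at once.
# Variables echo the article (vegetation structure, carbon stocks, freshwater),
# but numbers and metric are illustrative -- NOT the paper's Gamma definition.
present = {"tree_cover": 0.70, "carbon_stock": 120.0, "freshwater": 800.0}
future  = {"tree_cover": 0.45, "carbon_stock": 95.0,  "freshwater": 650.0}

def change_indicator(now: dict, later: dict) -> float:
    """Root-mean-square of relative changes across all variables (0 = no change)."""
    rel = [(later[k] - now[k]) / now[k] for k in now]
    return math.sqrt(sum(r * r for r in rel) / len(rel))

gamma_like = change_indicator(present, future)
print(f"combined change: {gamma_like:.3f}")
```

The point of such an aggregate is the one Ostberg makes: it registers the combined effect of many simultaneous ecosystem changes rather than tracking any single process.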

He hopes the new results can help inform the ongoing negotiations on climate mitigation targets, “as well as planning adaptation to unavoidable change.”

Note

Even though 86% of land ecosystems are at risk if global temperature increases by 5 degrees Celsius by 2100, it is unlikely that all of these areas will actually be affected, since that would require the worst-case scenario from every climate model to come true.

Journal Reference:

  1. S. Ostberg, W. Lucht, S. Schaphoff, D. Gerten. Critical impacts of global warming on land ecosystems. Earth System Dynamics, 2013; 4 (2): 347. DOI: 10.5194/esd-4-347-2013

Insects can predict storms and windstorms (Fapesp)

The animals change their sexual behavior when atmospheric pressure drops, a phenomenon common before rain and strong winds, research at Esalq indicates (photo: José Maurício Simões Bento/Esalq)

03/10/2013

By Elton Alisson

Agência FAPESP – In India and Japan there is a popular saying that “ants carrying eggs uphill means rain on the way.” In Brazil, another proverb holds that “when the air gets humid, termites and ants leave their nests to mate.”

A study published in the October 2 issue of the journal PLoS One – carried out by researchers at the Escola Superior de Agricultura “Luiz de Queiroz” (Esalq) of the Universidade de São Paulo (USP) in Piracicaba (SP), in partnership with colleagues from the Universidade Estadual do Centro-Oeste (Unicentro) in Guarapuava (PR) and from the University of Western Ontario, Canada – has shown that insects anticipate changes in the weather and signal them through changes in behavior.

The researchers observed that beetles of the species Diabrotica speciosa – known popularly in Brazil as “brasileirinho” or “patriota” for their green color and yellow spots – as well as potato aphids (Macrosiphum euphorbiae) and true armyworm moths (Pseudaletia unipuncta) can detect drops in atmospheric pressure, which in most cases signal imminent rain. When they sense such a drop, they modify their sexual behavior, becoming less inclined to court and mate.

“We have demonstrated that insects can in fact detect changes in the weather through falling atmospheric pressure, anticipate them, and seek shelter from bad conditions such as storms and high winds,” José Maurício Simões Bento, a professor in Esalq’s Department of Entomology and Acarology and one of the study’s authors, told Agência FAPESP.

“These animals are certainly better prepared to face the sudden changes in weather that will probably occur more frequently and more intensely around the world in the coming years as a result of global climate change,” said Bento, one of the principal investigators of the Instituto Nacional de Ciência e Tecnologia de Semioquímicos na Agricultura – one of the INCTs funded by FAPESP and the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq).

To carry out the study, the researchers selected three insect species – the “brasileirinho” beetle, the potato aphid, and the true armyworm moth – which belong to quite distinct orders and vary significantly in body mass and morphology.

The “brasileirinho” beetle has a more robust build and a hard cuticle, making it more resistant to severe weather such as heavy rain and strong winds. The potato aphid, by contrast, is more fragile and less resistant to extreme weather events.

Since there was already evidence that insects adjust flight- and feeding-related behaviors to changes in wind speed, the researchers decided to assess the effect of atmospheric conditions specifically on the courtship and mating behavior of these three species under natural or experimentally manipulated changes in atmospheric pressure.

Experiments under natural conditions (without pressure manipulation) and under controlled laboratory conditions revealed that, on detecting a sharp drop in atmospheric pressure, females reduce or simply stop displaying a behavior known as “calling,” in which they release pheromones to attract males for mating.

Males, in turn, show less sexual interest, stop responding to the females’ signals, and look for shelter from the change in weather likely to arrive in the following hours. Once the bad weather has passed, the insects resume courtship and mating.

“This momentary loss of interest in mating hours before a storm represents an adaptive capacity that both reduces the likelihood of injury and death among these animals – tiny organisms highly vulnerable to adverse weather such as storms, heavy rain, and strong winds – and ensures the reproduction and perpetuation of the species,” said Bento.

Experiments

The experiments under natural conditions were carried out at Esalq and at the INCT de Semioquímicos na Agricultura, both in Piracicaba. The researchers used a Y-shaped olfactometer, placing a female at one of the two smaller arms and leaving the other as an empty control. An air current was then passed through the system so that the pheromone was carried to the main arm, where the male was placed.

Depending on the direction the male took when the pheromone-laden air current was released – toward the arm holding the female or toward the control – it was possible to assess whether or not he was attracted by the pheromone the female emitted.
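A two-arm choice test of this kind is typically analyzed by counting how many males chose the pheromone arm and asking how surprising that count would be under 50/50 chance. The sketch below illustrates that standard binomial reasoning with made-up counts; the paper’s actual statistics differ.

```python
from math import comb

# Toy analysis of a Y-olfactometer choice test: count males choosing the
# pheromone arm vs. the control arm, then compute a one-sided binomial
# p-value against chance (p = 0.5). Counts here are hypothetical, not the
# study's data.
def binomial_p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

chose_pheromone, total = 17, 20  # hypothetical trial counts
print(f"one-sided p-value: {binomial_p_at_least(chose_pheromone, total):.4f}")
```

A count like 17 of 20 would be very unlikely under pure chance, which is the kind of evidence that lets the researchers call the males “attracted.”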

In the experiments with the “brasileirinho” beetle, the researchers found that under stable or rising atmospheric pressure the insect walked normally toward the tube releasing the air current carrying the female’s pheromone. Under falling pressure, however, it moved less and showed less interest in approaching the female.

The group also observed that males kept in direct contact with females under falling atmospheric pressure made no effort to mate. According to Bento, this behavior can be explained by the insect’s sense of mortal danger.

“It is as if, faced with imminent danger, these animals put survival first – because that is what guarantees the perpetuation of the species – and left mating for later, as an activity that can be resumed once the bad weather has passed,” he said.

The researchers also assessed how often potato aphid and true armyworm males responded to the “calling” of their respective females under different atmospheric conditions. The results were similar to those obtained with the “brasileirinho” beetle.

The behavior of the insects studied was significantly affected by changes in air pressure, which the researchers tracked through hourly data published on the website of the Instituto Nacional de Meteorologia (Inmet).

Whenever the institute’s data indicated a sharp drop or rise in atmospheric pressure in the Piracicaba region, the researchers started the experiments to check whether the insects changed their calling and copulation behavior, comparing the results with those obtained under stable pressure.

“In this way we were able to verify that the insects’ sexual behavior varies with atmospheric pressure, since all the other conditions – temperature, humidity, and light – were controlled in the experiments,” said Bento.

After documenting this change in the insects’ sexual behavior under natural conditions, the Esalq group partnered with Canadian colleagues from the Department of Biology at the University of Western Ontario to run new behavioral assays in the laboratory – specifically, in a large barometric chamber that allows atmospheric pressure to be controlled, along with temperature, humidity, and light.

According to Bento, the tests conducted under controlled pressure confirmed the observations made under natural conditions.

A widespread phenomenon

According to the researcher, the fact that all three insect species analyzed in the study modified their sexual behavior in response to changes in air pressure suggests that the phenomenon may extend to insect species in general, and that these animals are adapted to cope with bad weather.

“There were already scientific studies suggesting that animals change their behavior in response to changes in the weather, but they were carried out on a single species and could not be generalized to the rest of the group,” said Bento.

“Our study showed that, in the case of insects, this phenomenon appears to extend to other species,” he added.

The Esalq group is now investigating the mechanisms insects use to detect changes in atmospheric pressure and to produce the adaptive behavior of interrupting mating when they sense a change in the weather.

The article Weather forecasting by insects: modified sexual behaviour in response to atmospheric pressure changes (doi: 10.1371/journal.pone.0075004), by Bento and colleagues, can be read in PLoS One at http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0075004.

*   *   *

Insects Modify Mating Behavior in Anticipation of Storms (Science Daily)

Oct. 2, 2013 — Insects modify their calling and courtship behavior in response to changes in air pressure, according to results published October 2 in the open-access journal PLOS ONE by Ana Cristina Pellegrino and José Maurício Bento, University of São Paulo, and colleagues from other institutions. The bugs’ ability to predict adverse weather conditions may help them modify their mating behavior during high winds and rain, reducing risk of injury or even death.

Researchers studied mating behavior changes in the cucurbit beetle, the true armyworm moth, and the potato aphid under falling, stable, and increasing air pressure conditions. When researchers measured the male beetles’ response to female sex pheromones under the different conditions, they found a significant decrease in pheromone response when air pressure fell compared to stable or increasing pressure. Furthermore, 63% of males started copulating faster in the presence of females during dropping atmospheric pressure, a condition associated with heavy rain and wind. By contrast, under stable or rising air pressure conditions, all males showed full courtship behavior.

Additionally, how often female armyworm moths and potato aphids showed mate-attracting behavior was measured under the three atmospheric conditions. The female armyworms’ calling was reduced during decreasing air pressure, but the potato aphid showed reduced calling during both decreasing and increasing air pressure, two conditions that can occur with high winds. In both cases, reduced calling went hand-in-hand with reduced mating behavior. Bento explains, “The results presented show that three very different insect species all modify aspects of their sexual behaviour in response to changing barometric pressure. However, there is a great deal of interspecific variability in their responses that can be related to differences in size, flight ability and the diel periodicity of mating.”

Journal Reference:

  1. Ana Cristina Pellegrino, Maria Fernanda Gomes Villalba Peñaflor, Cristiane Nardi, Wayne Bezner-Kerr, Christopher G. Guglielmo, José Maurício Simões Bento, Jeremy N. McNeil. Weather Forecasting by Insects: Modified Sexual Behaviour in Response to Atmospheric Pressure Changes. PLoS ONE, 2013; 8 (10): e75004. DOI: 10.1371/journal.pone.0075004

Layoffs at Inpe threaten weather forecasting, union says (Portal G1, via Agência Ambiente Brasil)

JC e-mail 4825, October 2, 2013.

If the contracts of 71 employees are suspended, the Centro de Previsão do Tempo e Estudos Climáticos will lose more than a third of its workforce

With 10 days to go before the court-ordered deadline for dismissing staff hired irregularly by the Instituto Nacional de Pesquisas Espaciais (Inpe), an agency of the Ministério da Ciência, Tecnologia e Inovação (MCTI), the federal government has still not authorized the job openings needed for a public hiring exam. If the contracts of 71 employees are suspended as of October 11, the Centro de Previsão do Tempo e Estudos Climáticos (Cptec) in Cachoeira Paulista (SP) will lose more than a third of its workforce – the center is the most advanced in numerical weather and climate prediction in Latin America.

The Sindicato dos Servidores Públicos Federais da área de Ciência e Tecnologia no setor Aerospacial (SindCT) is pressing Inpe’s management and the science and planning ministries to act quickly, in an attempt to have the court decision temporarily suspended. According to the union, weather forecasting could be compromised. Inpe’s management denies this and is awaiting a ruling on the appeal filed with the Tribunal Regional Federal (TRF-SP) requesting more time to resolve the irregular contracts.

Inpe was notified of the annulment of the contracts on August 27, when the ruling set a 45-day deadline for the dismissals – it expires on October 10. The lawsuit brought by the Ministério Público Federal (MPF) challenges 111 emergency hirings made in 2010. According to Leonel Perondi, Inpe’s director, a public hiring exam held in 2012 made it possible to replace part of the temporary staff, but the situation remains irregular for a group of 71 employees, 52 of whom work in weather forecasting. Another nine work in the Laboratório de Combustão e Propulsão, where fuels for satellites are tested. Cptec currently has 146 employees in all.

Most of the staff facing dismissal are meteorologists, systems-analysis engineers, and computer technicians. Some of them operate Tupã, the supercomputer that cost R$ 50 million and sharpened forecasts of extreme weather events such as major storms. According to Inpe, the machine is the only one of its kind in the country and, besides Cptec, also supplies information to the Instituto Nacional de Meteorologia (Inmet).

Without court intervention, the temporary contracts would have expired between 2014 and 2015. According to the institute, no employee has been dismissed so far.

Threat of a walkout – The SindCT is demanding quick action. “Half of the court deadline for the dismissals has already passed and the openings have not been authorized. That would be the only way to try to suspend the court decision. Weather forecasting will be compromised; we do not have the staff to operate Tupã, and this will have consequences for the whole country, above all for agriculture,” said Fernando Morais, the union’s vice-president.

The union said it plans to stage a walkout next Friday (the 4th) involving the 71 employees facing dismissal, in an attempt to get the federal government’s attention. “We cannot wait any longer. I went to Brasília to speak with the authorities and try to intervene, but if the openings are authorized, the exam notice must be published the very next day,” Morais told G1.

According to Inpe’s director, the institute has sent the Ministério do Planejamento, Orçamento e Gestão two ministerial requests for hiring exams since November of last year but has received no reply. He travels to Brasília this Tuesday (the 1st) to try to negotiate the authorization of the openings.

Damage – The data supplied by the Tupã supercomputer feed daily and long-range weather forecasts. The information is used in agriculture, by city governments and the Defesa Civil – which receive natural-disaster alerts through the Centro Nacional de Monitoramento e Alertas de Desastres Naturais (Cemaden) – and by civil aviation.

The supercomputer cannot be switched off. “This machine was not built to be shut down, which is why it is hooked up to generators. A shutdown could mean up to three months of work by specialized technicians trying to recover it. That will involve costs,” said the SindCT vice-president.

He also stressed that filling the threatened positions through a public exam would have no financial impact on the agency. “These people’s salaries are already on Inpe’s payroll; nothing changes,” he said.

According to Perondi, shutting down the supercomputer is out of the question. He hopes the appeal will persuade the court to extend the deadline and says he has, in parallel, been studying measures to keep the forecasting service running.

“I believe we will be able to sign a conduct-adjustment agreement and keep these professionals at work until the exam is held. Halting the forecasts is not on the table – these are data for critical areas of the country – and we will study alternatives to cover any shortfall of staff,” he told G1. Perondi believes the ideal would be to extend the deadline for regularizing the contracts by at least a year. Inpe has received a series of supporting requests from bodies such as the Operador Nacional do Sistema Elétrico (ONS) and the Defesa Civil; the documents were attached to the appeal.

Accusation – The irregular hirings at Inpe took place under the administration of Gilberto Câmara. The SindCT accuses the former director of ignoring legal advice and pushing the irregular hirings through. “There was a legal opinion advising against these contracts. He created an imbroglio, never fought for the positions, never asked for a hiring exam, and now we have this problem,” Morais charged.

On assignment in Germany, Câmara contacted the reporters this Tuesday (the 1st) and rejected the accusations. According to him, such hirings predated his administration, and the outsourced contracts questioned by the Ministério Público, made in 2010, were backed by an opinion from the Advocacia Geral da União (AGU). He also sent G1 copies of official letters requesting new positions for the agency in 2007 and 2009.

The former director also questioned the hiring model imposed on Inpe, which he considers a loss for the country. “The country needs the service Inpe provides, not more civil servants. The best arrangement would be to create a social organization (OS), as several institutes that carry out government missions have done. It makes no sense to hire a civil servant for life. In my view, the public interest is served as long as Inpe has qualified people in their posts,” he argued.

Inpe’s last public hiring exam, held in 2012, filled just over 100 positions, including the regularization of 40 professionals whose contracts were at risk of suspension. The previous one had been held in 2004.

Based on the 1,063 employees active in 2010, Inpe is projected to lose 36% of its workforce to retirement by 2015. The average age of its staff is 52.

Understanding the case – According to prosecutor Fernando Lacerda Dias, author of the suit that led to the annulment of the contracts, Inpe used legal maneuvers prohibited by law to hire staff. “The contracts were made under a law that authorizes such hirings without a public exam. But that legislation is specifically for temporary hiring, not for routine functions, which is what they used it for,” he told G1 on the 5th.

Over the years, Inpe took on new functions in its field, but the growth in responsibilities was not matched by staff renewal, creating a shortage of personnel.

In an attempt to solve the problem, the agency began hiring outsourced workers under decree 2.271/97. The procedure is considered illegal because outsourced labor cannot be used to carry out a public agency’s core activities.

In 2006, the federal government committed to gradually replacing the irregular outsourced workers with staff hired through public exams, but in 2009, as the existing contracts neared expiration, Inpe attempted a new round of outsourced hiring.

The outsourcing was rejected by Inpe’s internal legal advisers. With no public exam authorized, the agency changed the legal form of the hirings, holding a selection process to take on 111 new temporary employees under law 8.745/93. This maneuver was also ruled illegal.

The other side – The Minister of Science and Technology, Marco Antônio Raupp, was contacted and said through his press office that the ministry had requested the authorization of positions for the exam. The press office initially recommended that G1 contact the Ministério do Planejamento for information on the authorization of the openings.

The Ministério do Planejamento said by e-mail that in 2013 it authorized the filling of 832 positions for the Ministério de Ciência, Tecnologia e Inovação – which is responsible for distributing them among its research institutes – and that additional authorizations will depend on further talks with the MCTI.

Informed of the Planning Ministry’s reply this Thursday (the 26th), the press office of the Ministry of Science and Technology had not said, as of Monday afternoon (the 30th), where the authorized positions went, nor whether any are available to Inpe.

(Portal G1, via Agência Ambiente Brasil)

Climate sceptics claim warming pause backs their view (BBC)

26 September 2013 Last updated at 00:47 GMT

By Matt McGrath, Environment correspondent, BBC News, The Netherlands

Hurricane Sandy: An increased likelihood of extreme weather events is one predicted outcome of global warming, but some dispute the scale of expected effects

In the run-up to a key global warming report, those sceptical of mainstream opinion on climate change claim they are “winning” the argument.

They say a slowing of temperature rises in the past 15 years means the threat from climate change is exaggerated.

But a leading member of the UN’s panel on climate change said the views of sceptics were “wishful thinking”.

The pause in warming was a distraction, he said, from the growing scientific certainty about long-term impacts.

Prof Jean-Pascal van Ypersele spoke to BBC News ahead of the release of a six-yearly status report into global warming by the UN panel known as the Intergovernmental Panel on Climate Change, or IPCC.

Scientists and government representatives are currently meeting in Stockholm, Sweden, going through the dense, 31-page summary of the state of the physical science behind climate change.

When it is released on Friday, the report is likely to state with even greater certainty than before that the present-day, rapid warming of the planet is man-made.

Netherlands flood rescue simulation: Climate change could profoundly impact the Netherlands, but sceptics remain influential there

But climate sceptics have focused their attention on the references to a pause or hiatus in the increase in global temperatures since 1998.

The sceptics believe that this slowdown is the most solid evidence yet that mainstream computer simulations of the Earth’s climate – climate models – are wrong.

These computer simulations are used to study the dynamics of the Earth’s climate and make projections about future temperature change.

“The sceptics now have a feeling of being on the winning side of the debate thanks to the pause,” said Marcel Crok, a Dutch author who accepts the idea that human activities warm the planet, but is sceptical about the scale of the effect.

“You are now starting to see a normalisation of climate science. Suddenly mainstream researchers, who all agree that greenhouse gases play a huge role, start to disagree about the cause of the pause.

“For me this is a relief, it is finally opening up again and this is good.”

The view that the sceptics have positively impacted the IPCC is supported by Prof Arthur Petersen, who is a member of the Dutch government team currently examining the report.

“The sceptics are good for the IPCC and the whole process is really flourishing because of this interaction over the past decades,” he told BBC News.

“Our best climate researchers are typically very close to really solid, sceptical scientists. In this sense scepticism is not necessarily a negative term.”

Others disagree.

Bart Verheggen is an atmospheric scientist and blogger who supports the mainstream view of global warming. He said that sceptics have discouraged an open scientific debate.

Dutch writer Marcel Crok is sceptical about the sensitivity of the atmosphere to carbon emissions

“When scientists start to notice that their science is being distorted in public by these people who say they are the champions of the scientific method, that could make mainstream researchers more defensive.

“Scientists probably think twice now about writing things down. They probably think twice about how this could be twisted by contrarians.”

Sensitive debate

In 2007, the IPCC defined the range for what’s termed “equilibrium climate sensitivity”. This term refers to the long-term warming expected to follow a doubling of CO2 concentrations in the atmosphere.
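Under the standard textbook approximation (an idealization used for illustration here, not a formula from the IPCC report itself), equilibrium warming scales with the logarithm of the CO2 concentration, ΔT = S · log2(C/C0), so a sensitivity figure S translates directly into projected warming at a given concentration:

```python
import math

# Equilibrium warming from the standard logarithmic approximation
# dT = S * log2(C / C0), where S is equilibrium climate sensitivity
# (warming per doubling of CO2) and C0 a preindustrial baseline.
# The sensitivity values below are the ranges quoted in this article.
def equilibrium_warming(sensitivity_c: float, co2_ppm: float,
                        baseline_ppm: float = 280.0) -> float:
    return sensitivity_c * math.log2(co2_ppm / baseline_ppm)

# At a doubling (560 ppm), warming equals the sensitivity by definition.
for s in (1.5, 3.0, 4.5):  # low end, 2007 best estimate, high end (deg C)
    print(f"S = {s}C -> {equilibrium_warming(s, 560):.1f}C at 560 ppm")
```

This is why the argument over whether S sits near 1.5C or 3C matters so much: every projection at a given CO2 level scales with it.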

The panel’s last report said temperatures were likely to rise within the range of 2C to 4.5C with a best estimate of 3C.

The new report is believed to indicate a range of 1.5C to 4.5C with no best estimate indicated.

Although that might not seem like much of a change, many sceptics believe it exposes a critical flaw.

“In the last year, we have seen several studies showing that climate sensitivity is actually much less than we thought for the last 30 years,” said Marcel Crok.

“And these studies indicate that our real climate shows a sensitivity of between 1.5C and 2C, but the climate models on which these future doom scenarios are based warm up three degrees under a doubling of CO2.”

But other researchers who are familiar with the text believe that the sceptics are reading too much into a single figure.

“Some of what the sceptics are saying is either wishful thinking or totally dishonest,” Prof van Ypersele, who is vice-chair of the IPCC, told BBC News.

“It is just a change in a lower border [of the range of temperature rise]. Even if this turns out to be the real sensitivity, instead of making the challenge extremely, extremely, extremely difficult to meet, it is only making it extremely, extremely difficult.

“Is that such a big change?”

Prof van Ypersele points out that many other aspects of the forthcoming report are likely to give greater certainty to the scale of impacts of a warming world. The predictions for sea level rise are likely to be considerably strengthened from 2007. There is also likely to be a clearer understanding of the state of sea ice.

Who are the climate sceptics?

Although there are only a small number of mainstream scientists who reject the established view on global warming, they are supported by a larger group of well resourced bloggers and citizen scientists who pore through climate literature and data looking for evidence of flaws in the hypothesis.

There are many different shades of opinion in the sceptical orbit. Some such as the group Principia Scientific reject the “myth” of greenhouse gas warming.

There are also political sceptics, such as some members of the Republican party in the US, who argue that climate science is a hoax or a conspiracy.

But there are also sceptical bloggers such as Anthony Watts and Andrew Montford who accept the basic science that adding carbon to the atmosphere can affect the temperature. They contest mainstream findings on the sensitivity of the climate to carbon and the future impacts on temperature.

The Dutch approach to sceptics

With around 20% of the country below sea level, the Dutch have a keen interest in anything that might affect their environment, such as climate change.

But scepticism about the human influence on global warming has been growing in the Netherlands, according to research from the OECD.

In a country where consensus is a key word, the government has taken a more inclusive approach to climate dissenters. To that end, they have funded Marcel Crok to carry out a sceptical analysis of the IPCC report.

In an effort to build bridges between sceptics and the mainstream they are also funding an initiative called climatedialogue.org which serves as a platform for debate on the science of global warming.


IPCC climate report: humans ‘dominant cause’ of warming – and other articles (BBC)

27 September 2013 Last updated at 09:12 GMT

By Matt McGrath, Environment correspondent, BBC News, Stockholm

Climate change “threatens our planet, our only home”, warns Thomas Stocker, IPCC co-chair

A landmark report says scientists are 95% certain that humans are the “dominant cause” of global warming since the 1950s.

The report by the UN’s climate panel details the physical evidence behind climate change.

On the ground, in the air, in the oceans, global warming is “unequivocal”, it explained.

It adds that a pause in warming over the past 15 years is too short to reflect long-term trends.

The panel warns that continued emissions of greenhouse gases will cause further warming and changes in all aspects of the climate system.

To contain these changes will require “substantial and sustained reductions of greenhouse gas emissions”.


Projections are based on assumptions about how much greenhouse gas might be released

After a week of intense negotiations in the Swedish capital, the summary for policymakers on the physical science of global warming has finally been released.

The first part of an IPCC trilogy, due over the next 12 months, this dense, 36-page document is considered the most comprehensive statement on our understanding of the mechanics of a warming planet.

It states baldly that, since the 1950s, many of the observed changes in the climate system are “unprecedented over decades to millennia”.

Each of the last three decades has been successively warmer at the Earth’s surface, and warmer than any period since 1850, and probably warmer than any time in the past 1,400 years.

“Our assessment of the science finds that the atmosphere and ocean have warmed, the amount of snow and ice has diminished, the global mean sea level has risen and that concentrations of greenhouse gases have increased,” said Qin Dahe, co-chair of IPCC working group one, who produced the report.

Speaking at a news conference in the Swedish capital, Prof Thomas Stocker, another co-chair, said that climate change “challenges the two primary resources of humans and ecosystems, land and water. In short, it threatens our planet, our only home”.

Since 1950, the report’s authors say, humanity is clearly responsible for more than half of the observed increase in temperatures.

Dr Rajendra Pachauri said he was confident the report would convince the public on global climate change

But a so-called pause in the increase in temperatures in the period since 1998 is downplayed in the report. The scientists point out that this period began with a very hot El Nino year.

“Trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends,” the report says.
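The point in that quote can be seen with a toy calculation (synthetic numbers invented for illustration; not real temperature data): a series with a fixed long-term trend plus one hot starting year yields very different fitted slopes depending on the window.

```python
# Sketch (synthetic data): why trends fitted to short records are sensitive
# to the start date. We build a series with a steady long-term trend of
# 0.012 C/yr plus a one-off warm spike in 1998 (a strong El Nino year),
# then fit least-squares slopes over two windows.

def linear_trend(years, temps):
    """Ordinary least-squares slope, in C per year."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    cov = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    var = sum((y - mean_y) ** 2 for y in years)
    return cov / var

years = list(range(1951, 2013))
temps = [0.012 * (y - 1951) + (0.2 if y == 1998 else 0.0) for y in years]

long_term = linear_trend(years, temps)                      # 1951-2012
recent = [(y, t) for y, t in zip(years, temps) if y >= 1998]
short_term = linear_trend([y for y, _ in recent], [t for _, t in recent])

# Starting the window on the hot 1998 spike drags the fitted slope down,
# even though the underlying trend never changed.
print(round(long_term, 3), round(short_term, 3))
```

Here the short 1998-2012 window understates the true 0.012 C/yr trend by almost half, mirroring the 0.05-versus-0.12 degrees-per-decade contrast in the report.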

Prof Stocker added: “I’m afraid there is not a lot of public literature that allows us to delve deeper at the required depth of this emerging scientific question.

“For example, there are not sufficient observations of the uptake of heat, particularly into the deep ocean, that would be one of the possible mechanisms to explain this warming hiatus.”

“Likewise we have insufficient data to adequately assess the forcing over the last 10-15 years to establish a relationship between the causes of the warming.”

However, the report does alter a key figure from the 2007 study. The temperature range given for a doubling of CO2 in the atmosphere, called equilibrium climate sensitivity, was 2.0C to 4.5C in that report.

In the latest document, the range has been changed to 1.5C to 4.5C. The scientists say this reflects improved understanding, better temperature records and new estimates for the factors driving up temperatures.

In the summary for policymakers, the scientists say that sea level rise will proceed at a faster rate than we have experienced over the past 40 years. Waters are expected to rise, the document says, by between 26cm (at the low end) and 82cm (at the high end), depending on the greenhouse emissions path this century.

The scientists say ocean warming dominates the increase in energy stored in the climate system, accounting for 90% of energy accumulated between 1971 and 2010.

For the future, the report states that warming is projected to continue under all scenarios. Model simulations indicate that global surface temperature change by the end of the 21st Century is likely to exceed 1.5 degrees Celsius, relative to 1850.

Prof Sir Brian Hoskins, from Imperial College London, told BBC News: “We are performing a very dangerous experiment with our planet, and I don’t want my grandchildren to suffer the consequences of that experiment.”

What is the IPCC?

In its own words, the IPCC is there “to provide the world with a clear scientific view on the current state of knowledge in climate change and its potential environmental and socio-economic impacts”.

The offspring of two UN bodies, the World Meteorological Organization and the United Nations Environment Programme, it has issued four heavyweight assessment reports to date on the state of the climate.

These are commissioned by the governments of 195 countries, essentially the entire world. These reports are critical in informing the climate policies adopted by these governments.

The IPCC itself is a small organisation, run from Geneva with a full-time staff of 12. All the scientists involved with it contribute on a voluntary basis.

Document: IPCC Summary for Policymakers (PDF)


Carbon dioxide is reshaping the planet (IPS)

Environment

30/9/2013 – 08:08

By Stephen Leahy, IPS


The melting of the Orizaba glacier in Mexico is another consequence of global warming. Photo: Maurício Ramos/IPS

Nantes, France, 30/9/2013 – Greenland may end up becoming green, because most of its enormous ice sheet is doomed to melt, the Intergovernmental Panel on Climate Change (IPCC) reported on the 27th. The 36-page summary presented in Stockholm by the IPCC, which operates under the United Nations (UN), includes a warning that there is a 20% chance that Greenland’s ice will begin an irreversible melt with only 0.2 degrees of additional warming.

That much additional warming is now a certainty. The complete melting of Greenland’s ice, however, would take a thousand years. “The new report is another wake-up call, telling us that we are in serious trouble and heading toward dangerous degrees of climate change,” said David Cadman, president of ICLEI, the only network of sustainable cities operating worldwide, involving 1,200 local governments.

“The IPCC will be attacked by interests from the fossil fuel industry and its defenders. They will try to scare the public, claiming that taking action puts jobs and the economy at risk. That is not true. Quite the opposite,” Cadman told IPS. The Summary for Policymakers of the Working Group I report – The Physical Science Basis – clearly establishes that human beings are warming the planet, and confirms studies dating back to 1997. Since the 1950s, each decade has been warmer than the previous one, the document says.

The full text will be released today; it is the first of the four volumes (those of the three working groups plus the synthesis) of the IPCC’s Fifth Assessment Report, known as AR5. In the Northern Hemisphere, “temperatures between 1983 and 2012 were the warmest of the last 1,400 years,” noted Thomas Stocker, co-chair of Working Group I.

Responding to news reports in recent days about a “pause in warming”, Stocker said that the climate system is dynamic: it is likely that in recent years more heat has gone into the oceans and that the rate of increase in surface temperatures has slowed slightly. More than a hundred years ago, researchers demonstrated that carbon dioxide traps the Sun’s heat.

The burning of fossil fuels, deforestation, agriculture and other human activities emit this gas into the atmosphere, where it remains almost forever. Greater volumes of carbon dioxide, in turn, add more heat, acting like another layer of insulation. More than 90% of this additional thermal energy is absorbed by the oceans, according to the Summary. This explains why surface temperatures have not yet exceeded the global average rise of 0.8 degrees.

The Summary notes that the shrinkage of Arctic sea ice over the last three decades was the largest of the past 1,450 years. Although the melt in the last boreal summer was smaller than last year’s, the Arctic remains on track to be ice-free in the summer season by 2050, well before what earlier reports projected.

The IPCC report, a review of the available scientific research, was based on the work of 259 authors from 39 countries and received 54,677 comments. The previous assessment report, AR4, dates from 2007. The IPCC does not conduct its own research and is overseen by 110 governments, which spent the last four days approving the text of the summary. “Every word in those 36 pages was debated. Some paragraphs were discussed for more than an hour,” Stocker said at a press conference in Stockholm. “No other scientific report has ever undergone such critical scrutiny,” he stressed.

The cautiously worded Summary details and confirms the observed impacts: rising temperatures, shifting rainfall patterns and climate extremes. It also confirms that these and other impacts will worsen as carbon dioxide emissions increase. Current emissions of the gas are tracking the worst-case scenario.

The AR5 summary says Greenland’s ice sheet lost an average of 177 billion tonnes of ice per year between 1992 and 2001. More recent studies show that ice loss has increased substantially since then. According to the report, there is a one-in-five chance that Greenland’s ice sheet will melt completely if global temperatures rise by between 0.8 degrees and just over one degree, as now seems inevitable.

One reason is that temperatures in the Arctic are rising almost three times faster than the global average. A rise of about four degrees in global average temperature would trigger a melt in Greenland that would raise sea levels by up to seven metres. Even so, AR5 says sea levels are not expected to rise by more than one metre this century.

Other scientists, among them James Hansen, former director of NASA’s Goddard Institute for Space Studies, believe that the accelerated melting observed in the Arctic, Greenland, Antarctica and the glaciers is a sign that seas could rise by several metres this century unless emissions are reduced.

Even before the new IPCC report was made public, “climate change deniers” attacked and misrepresented it, trying to portray its conclusions as radical or extremist, said Charles Greene, professor of earth and atmospheric sciences at Cornell University in New York. Greene was referring to an orchestrated propaganda effort by fossil fuel industry players and organisations seeking to convince the public that global warming is not a matter of extreme urgency. “In fact, the IPCC has a long history of underestimating the impacts” of climate change, Greene said.

Although global action against warming is at a standstill, some cities are already cutting their carbon emissions. ICLEI members have committed to a 20% reduction by 2020 and 80% by 2050. Most governments are taking no initiative, which clearly reveals the power and influence of the fossil fuel sector, Cadman noted. “Cities could do up to ten times more, but they simply don’t have the money,” he stressed.

Envolverde/IPS

IPCC maintains that global warming is unequivocal (IPS)

Environment

30/9/2013 – 08:01

By Fabíola Ortiz, IPS


Rising sea levels could flood several areas of Recife, in northeastern Brazil. Photo: Alejandro Arigón/IPS

Rio de Janeiro, Brazil, 30/9/2013 – Amid rumours that global warming has stalled over the last 15 years, the new report from the Intergovernmental Panel on Climate Change (IPCC) indicates that the last three decades were each successively warmer than any other since 1850. The Summary for Policymakers of the Working Group I report – The Physical Science Basis – was released on the 27th in Stockholm, Sweden.

The full, unedited text will be released today and constitutes the first of the four volumes of the IPCC’s Fifth Assessment Report. The warming is “unequivocal”, the IPCC states. “The atmosphere and ocean are warming, the amount of snow and ice is shrinking, sea levels are rising and greenhouse gas concentrations are increasing,” the study notes.

For Brazilian climate expert Carlos Nobre, one of the lead authors of the Fourth Assessment Report, the new report gives “no reason for optimism”. “Each of the last three decades was successively warmer than any other since 1850. In the Northern Hemisphere, the period 1983-2012 probably represents the warmest 30 years of the last 1,400,” says the new summary. Data on “average land and ocean surface temperatures, calculated as a linear trend, show a warming of 0.85 degrees over the period 1880-2012,” Nobre added.

Regarding the supposed pause in warming, the IPCC states that “the warming rate of the last 15 years (1998-2012) – which was 0.05 degrees per decade and began with a powerful El Niño (the warm phase of the Southern Oscillation) – is lower than that calculated for 1951-2012, which was 0.12 degrees per decade”.

However, it argues, “owing to natural variability, trends based on short records are very sensitive to the start and end dates and do not, in general, reflect long-term climate trends”. In short, the document says, “it is virtually certain (99% to 100% certainty) that the troposphere has warmed since the mid-20th century”.

Nobre told IPS that “the summary looks in more detail at what is changing and reduces the uncertainties with improved scientific knowledge”. It also confirms that changes in the climate stem mainly from human actions, noted Nobre, secretary of research and development at Brazil’s Ministry of Science and Technology.

Humanity will have to decide to leave behind much of its fossil fuels – responsible for the emissions that warm the atmosphere – and switch to renewable forms of energy, Nobre warned. Technically it is possible; what is missing is a conscious choice by all countries, he stressed. “This transition has a cost, but one that keeps getting smaller than what was estimated 15 years ago. The problem is not the technology, but the political decision,” he added.

For Carlos Rittl, coordinator of the climate change and energy programme at WWF-Brasil (World Wide Fund for Nature), “although warming has shown an apparent stabilisation in average temperature, the hottest years on record occurred in the last decade. That does not leave us in a comfortable position.”

The IPCC report, a review of the available scientific research, was based on the work of 259 authors from 39 countries and received 54,677 comments. Its release came amid a renewed media wave of doubts and rumours about the existence of global warming.

Based on different greenhouse gas emission pathways, the summary projects for the end of this century a temperature rise of more than 1.5 degrees relative to the 1850-1900 period in all scenarios except the one with the lowest gas concentrations. It also states that the increase will probably exceed two degrees by 2100 in the two highest-emission scenarios. The previous report, from 2007, projected an increase of two degrees in the best case and up to six in the worst.

After the failure of the intergovernmental conference in Copenhagen in December 2009, when countries were unable to reach an agreement to reduce climate pollution, questioning of the IPCC intensified, particularly over an erroneous estimate of the melting of Himalayan glaciers. That information “was used irresponsibly by those who deny global warming,” Rittl warned.

Six years later, there is more and better scientific evidence to estimate, for example, how much melting ice contributes to sea level rise. By the end of 2100, the average sea level rise will range between 24 centimetres and 63 centimetres, under the best and worst atmospheric pollution scenarios. Rainfall “will increase in the wettest regions and decrease in those where rain is scarce,” explained Rittl, who holds a doctorate in ecology.

In Brazil, the examples are the arid Northeast region and the wetter South and Southeast. Rainfall will increase by between 1% and 3% in southern zones, depending on the speed of warming, while the arid areas will face more severe drought patterns. All the trends confirmed by the report are “alarming”, Rittl said. “We humans are responsible for these changes, which will worsen the current scenario in which hundreds of millions of people already suffer shortages of water, food and adequate conditions for survival,” he stressed.

The first volume of the IPCC’s Fifth Assessment Report comes out two months before the 19th Conference of the Parties to the United Nations Framework Convention on Climate Change takes place in Warsaw, Poland. That meeting should see a global effort to secure the transition to a low-carbon economy, Nobre said. “This report is a reality check,” he stressed.

In his view, Brazil is one of the “few good examples”, having cut its greenhouse gas emissions by 38.4% between 2005 and 2010 thanks to the drop in Amazon deforestation. “Brazil has adopted voluntary commitments, but globally there is no ambitious agreement,” Nobre explained. The longer it takes to “adopt concrete actions, the harder and more unlikely it will be to achieve a sustainable trajectory of accommodation and adaptation to climate change,” he emphasised.

Rittl believes governments should treat climate change as a national challenge for development, social inclusion and poverty reduction. “The risks and opportunities must be handled very responsibly,” he concluded.

Envolverde/IPS

Ahead of IPCC report, fossil-fuel groups organize climate denial campaign (Grist)

By John Upton, 20 Sep 2013 11:38 AM


Watch out: A tsunami of stupidity is due to crash over the world next Friday.

That’s when the Intergovernmental Panel on Climate Change will release a summary of its big new climate assessment report, the first since 2007. But that’s not the stupid part.

A global campaign funded by fossil-fuel interests has been steadily building to discredit the report. That’s where the stupidity comes in. From The Guardian:

Organisations that dismiss the science behind climate change and oppose curbs on greenhouse gas pollution have made a big push to cloud the release of the IPCC report, the result of six years of work by hundreds of scientists.

Those efforts this week extended to promoting the fiction of a recovery in the decline of Arctic sea ice.

The IPCC assessments, produced every six or seven years, are seen as the definitive report on climate change, incorporating the key findings from some 9,000 articles published in scientific journals.

Climate deniers are pushing their messages out through blogs as well as old-fashioned outlets like the Daily Mail. From Skeptical Science:

Like the way a picnic on a sunny afternoon in August tends to attract lots of annoying wasps, major events on the climate change timeline tend to see certain contrarian figures and organisations dialling up the rhetorical output.

This is frustrating yet it has over the years become quite predictable: arguing with some climate change contrarians is similar to attempting debate with a well-trained parrot. Imagine: the parrot has memorised some twenty statements that it can squawk out at random. Thus it will follow up on ‘no warming since 1997’ with ‘in the 1970s they said there’d be an ice-age’ and so on. Another piece in the UK-based Daily Mail’s Sunday edition of September 8th 2013, written by a figure familiar to Skeptical Science readers, Mail and Vanity Fair journalist David Rose, gives a classic example of such parroting. There’s another in the UK’s Daily Telegraph along remarkably similar lines (it could even be the same parrot).

Meanwhile, the Koch-funded Heartland Institute has already published a compilation of lies in its own preemptive report. Its “experts” are so desperate to get their climate-denying messages out that they would probably turn up at your kid’s birthday party if you asked them nicely enough.

The news here is predictable but not terrible. Scientists and activists are doing better jobs of organizing communication strategies in the face of anti-science blather. From Inside Climate News:

Dozens of prominent scientists involved with drafting IPCC reports formed a Climate Science Rapid Response Team that punches back against misleading claims about climate research.

Kevin Trenberth is part of that team as well as a climate scientist at the National Center for Atmospheric Research and an author and editor on the forthcoming IPCC report. He explained that nearly every time there is a scientific paper linking man-made carbon dioxide emissions to climate change, the “denial-sphere” immediately responds with accusations that the research is wrong.

“The scientists get nasty emails. Certain websites comment. … So a bunch of us formed this rapid response team to deflate these arguments.” The group has been very busy in recent weeks.

Although those who make a living by denying climate science are screaming as loudly as ever, most people are getting better at tuning out their histrionics. Again from the Inside Climate News article:

Cindy Baxter, a longtime climate campaigner, said she thinks climate skeptics “are getting more shrill, but getting less notice,” because Americans are more convinced that global warming is real.

“Hurricane Sandy. Droughts. Flooding. Wildfires. People are feeling the effects of climate change. That makes it harder to deny,” said Baxter, who is also a co-author of Greenpeace’s new report Dealing in Doubt, which chronicles the history of climate skeptic campaigns. Polls say a larger majority of Americans from both parties see recent waves of deadly weather as a sign of climate change.

Unfortunately, this denier campaign will be around for a while. The IPCC report due out next Friday is just the first of four scheduled to be released over the coming year. Maybe after the final one comes out, the denial pushers will finally potter off to find new work, perhaps as baby-seal clubbers, orphan poisoners, or textbook reviewers for the Texas State Board of Education.

John Upton is a science fan and green news boffin who tweets, posts articles to Facebook, and blogs about ecology. He welcomes reader questions, tips, and incoherent rants: johnupton@gmail.com.

Fifth IPCC report: reactions

The document released this Friday (27/09) states that the planet’s temperature may rise by almost 5 °C this century, which could raise ocean levels by up to 82 centimetres (photo of the Arctic Ocean: NASA)

Special features

Fifth IPCC report shows climate change intensifying

27/09/2013

By Karina Toledo, from London

Agência FAPESP – If greenhouse gas emissions keep growing at current rates over the coming years, the planet’s temperature could rise by up to 4.8 degrees Celsius this century – which could result in a sea level rise of up to 82 centimetres and cause major damage in most of the world’s coastal regions.

The warning comes from the scientists of the United Nations (UN) Intergovernmental Panel on Climate Change (IPCC), who released the first part of their Fifth Assessment Report (AR5) on 27 September in Stockholm, Sweden. Based on a review of thousands of studies carried out over the last five years, the document presents the scientific basis of global climate change.

According to Paulo Artaxo, professor at the Physics Institute of the University of São Paulo (USP) and one of the six Brazilians who took part in preparing the report, four different scenarios of greenhouse gas concentrations that could occur by the year 2100 were simulated – the so-called Representative Concentration Pathways (RCPs).

“To predict the temperature increase, two basic ingredients are needed: a climate model and an emissions scenario. The fourth report (released in 2007) also simulated four scenarios, but it considered only the amount of greenhouse gases emitted. In this fifth report we used a more complete system, which takes into account the impacts of those emissions – that is, how much the radiation balance of the Earth system will change,” explained Artaxo, who is in London for FAPESP Week London, where he took part in a panel on climate change.

The radiation balance compares the amount of solar energy entering and leaving our planet, indicating how much is stored in the Earth system according to the concentrations of greenhouse gases, emitted aerosol particles and other climate agents.

The most optimistic scenario projects that the Earth system will store an additional 2.6 watts per square metre (W/m2). In that case, the rise in the Earth’s temperature could vary between 0.3 °C and 1.7 °C from 2010 to 2100, and sea levels could rise between 26 and 55 centimetres over the course of this century.

“For that scenario to happen, greenhouse gas concentrations would have to be stabilised within the next 10 years, together with action to remove them from the atmosphere. Even so, the models indicate an additional increase of almost 2 °C in temperature – on top of the 0.9 °C our planet has already warmed since the year 1750,” Artaxo said.

The second scenario (RCP4.5) projects a storage of 4.5 W/m2. In that case, the temperature rise would be between 1.1 °C and 2.6 °C, and sea levels would rise between 32 and 63 centimetres. In the third scenario, 6.0 W/m2, the temperature increase ranges from 1.4 °C to 3.1 °C and sea levels would rise between 33 and 63 centimetres.

The worst scenario, in which emissions keep growing at an accelerated pace, projects an additional 8.5 W/m2 of stored energy. In that situation, according to the IPCC, the Earth’s surface could warm by between 2.6 °C and 4.8 °C over this century, pushing ocean levels up by between 45 and 82 centimetres.
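For reference, the four scenarios described above can be collected into one structure (the figures are exactly those given in the article; the code itself is only an illustrative sketch for comparing them side by side):

```python
# The article's four RCP scenarios: extra radiative forcing (W/m2),
# projected warming range (C, 2010-2100) and sea level rise range (cm).
RCP_SCENARIOS = {
    "RCP2.6": (2.6, (0.3, 1.7), (26, 55)),
    "RCP4.5": (4.5, (1.1, 2.6), (32, 63)),
    "RCP6.0": (6.0, (1.4, 3.1), (33, 63)),
    "RCP8.5": (8.5, (2.6, 4.8), (45, 82)),
}

# Look up the worst-case pathway and print its ranges.
forcing, (t_lo, t_hi), (s_lo, s_hi) = RCP_SCENARIOS["RCP8.5"]
print(f"RCP8.5: +{forcing} W/m2, {t_lo}-{t_hi} C, {s_lo}-{s_hi} cm")
```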

“Ocean levels have already risen by an average of 20 centimetres between 1900 and 2012. If they rise another 60 centimetres, combined with the tides, the result will be severe erosion in coastal areas around the world. Rivers such as the Amazon, for example, will suffer a strong backflow of salt water, which affects the entire local ecosystem,” Artaxo said.

According to the IPCC’s AR5 report, in all scenarios it is very likely (90% probability) that the rate of sea level rise during the 21st century will exceed that observed between 1971 and 2010. Thermal expansion caused by rising temperatures, together with melting glaciers, would be the main causes.

Ocean warming, the report says, will continue for centuries even if greenhouse gas emissions decline or remain constant. The Arctic region is the one that will warm most strongly, according to the IPCC.

According to Artaxo, the warming of ocean waters has other relevant consequences as well, which were not properly considered in earlier climate models. As the ocean warms, it loses capacity to absorb carbon dioxide (CO2) from the atmosphere. If current emission levels are maintained, concentrations of the gas in the atmosphere could therefore accelerate.

“In the previous report, the chapters devoted to the role of the oceans in climate change lacked experimental data. But in recent years there has been enormous progress in climate science. In this fifth report, thanks to satellite measurements and observations from buoy networks – such as those of the Pirata Project, which FAPESP funds in the South Atlantic – confidence about the oceans’ impact on the climate has improved greatly,” Artaxo said.

Ocean acidification

In all the scenarios in the IPCC’s fifth report, CO2 concentrations will be higher in 2100 than they are today, as a result of the cumulative increase in emissions during the 20th and 21st centuries. Part of the CO2 emitted by human activity will continue to be absorbed by the oceans, so it is “virtually certain” (99% probability) that ocean acidification will increase. In the best of the scenarios – RCP2.6 – the drop in pH will be between 0.06 and 0.07. In the worst case – RCP8.5 – between 0.30 and 0.32.

“A água do mar é alcalina, com pH em torno de 8,12. Mas quando absorve CO2 ocorre a formação de compostos ácidos. Esses ácidos dissolvem a carcaça de parte dos microrganismos marinhos, que é feita geralmente de carbonato de cálcio. A maioria da biota marinha sofrerá alterações profundas, o que afeta também toda a cadeia alimentar”, afirmou Artaxo.
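Because pH is a base-10 logarithmic scale, those seemingly small drops translate into large relative increases in hydrogen-ion concentration. A minimal sketch of the arithmetic (the scenario values are the AR5 figures quoted above):

```python
def h_increase(delta_ph: float) -> float:
    """Relative increase in hydrogen-ion concentration for a given pH drop.

    pH is -log10([H+]), so a drop of delta_ph multiplies [H+] by 10**delta_ph.
    """
    return 10 ** delta_ph

# Best-case scenario (RCP2.6): pH falls by roughly 0.06-0.07
best = h_increase(0.065)   # ~1.16x, i.e. roughly a 16% rise in [H+]

# Worst-case scenario (RCP8.5): pH falls by roughly 0.30-0.32
worst = h_increase(0.31)   # ~2.04x, i.e. roughly a doubling of [H+]

print(f"RCP2.6: [H+] x{best:.2f};  RCP8.5: [H+] x{worst:.2f}")
```

In other words, the “worst-case” drop of about 0.3 pH units corresponds to roughly twice as many hydrogen ions in surface seawater.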

Analyzing the changes that have already occurred, the IPCC scientists state that the last three decades were the warmest of any since 1850, and the first decade of the 21st century the warmest of all. The period from 1983 to 2012 was “very likely” (90% probability) the warmest of the last 800 years, and there is roughly 60% probability that it was the warmest of the last 1,400 years.

The IPCC nevertheless acknowledges a decline in the planet’s rate of warming over the last 15 years: from 0.12 °C per decade (over the period 1951-2012) to 0.05 °C per decade (over 1998-2012 alone).
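Those per-decade rates are easier to compare when converted into cumulative warming over each full period. A minimal sketch, using only the rates and periods quoted above:

```python
def total_warming(rate_per_decade: float, start_year: int, end_year: int) -> float:
    """Cumulative warming implied by a constant decadal trend over a period."""
    decades = (end_year - start_year) / 10
    return rate_per_decade * decades

# Long-term trend: 0.12 deg C/decade over 1951-2012 (61 years)
long_term = total_warming(0.12, 1951, 2012)  # ~0.73 deg C in total

# Recent slower trend: 0.05 deg C/decade over 1998-2012 (14 years)
recent = total_warming(0.05, 1998, 2012)     # ~0.07 deg C in total

print(f"1951-2012: ~{long_term:.2f} C;  1998-2012: ~{recent:.2f} C")
```

So the slowdown amounts to about 0.07 °C of warming over 14 years, against a backdrop of roughly 0.7 °C accumulated since mid-century.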

According to Artaxo, the phenomenon is due to two main factors: greater absorption of heat by deep waters (below 700 meters) and a higher frequency of La Niña events, which alter the rate of heat transfer from the atmosphere to the oceans. “The process is well understood and documented in prestigious scientific journals. Even so, the planet continues to warm significantly,” he said.

There is 90% certainty that the number of cold days and nights has decreased while warm days and nights have increased globally, and about 60% certainty that heat waves have also increased. The report says there is strong evidence of ice melt, especially in the Arctic, with 90% certainty that the ice cover shrank at a rate of between 3.5% and 4.1% per decade between 1979 and 2012.

Atmospheric CO2 concentrations have already risen by more than 20% since 1958, when systematic measurements began, and by about 40% since 1750. According to the IPCC, the increase is the result of human activity, mainly the burning of fossil fuels and deforestation, with a small contribution from the cement industry.
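As a rough consistency check, those percentages can be turned back into concentrations. The baseline values below are not from the article itself: they are the commonly cited figures of roughly 278 ppm for the pre-industrial era and roughly 315 ppm for 1958, the start of the Mauna Loa record.

```python
PPM_1750 = 278.0   # assumed pre-industrial baseline (not stated in the article)
PPM_1958 = 315.0   # assumed 1958 Mauna Loa value (not stated in the article)

since_1750 = PPM_1750 * 1.40   # "about 40%" higher  -> ~389 ppm
since_1958 = PPM_1958 * 1.20   # "more than 20%" higher -> ~378 ppm (a lower bound)

print(f"implied present-day level: ~{since_1750:.0f} ppm (vs >{since_1958:.0f} ppm)")
```

Both routes point to the 380-400 ppm range, consistent with the concentrations being measured around the time of the report.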

The scientists have “very high confidence” (nine chances in ten) that the mean concentrations of CO2, methane and nitrous oxide over the last century are the highest of the last 22,000 years. Changes in solar irradiance and volcanic activity, by contrast, contributed only a small fraction of the observed climate change. It is “extremely likely” (95% certainty) that human influence on the climate caused more than half of the temperature increase observed between 1951 and 2010.

“The effects of climate change are already being felt; this is not something for the future. The increase in heat waves, in the frequency of hurricanes, floods and severe storms, and in abrupt swings between hot and cold days is probably related to the fact that the climate system is being altered,” said Artaxo.

Persistent impact

In the IPCC’s assessment, many aspects of climate change will persist for many centuries even if greenhouse gas emissions cease. It is “very likely” (90% certainty) that more than 20% of the CO2 emitted will remain in the atmosphere for over a thousand years after emissions stop, the report states.

“What we are altering is not the climate of the next decade or even of the end of this century. Several published simulations show high CO2 concentrations persisting until the year 3000, because the processes that remove CO2 from the atmosphere are very slow,” Artaxo noted.

For the USP professor, the impacts are significant and severe, but not catastrophic. “It is certain that many coastal regions will suffer heavy erosion and millions of people will have to be relocated from where they live today. But of course it is not the end of the world. The question is: how will we adapt, who will govern this global system, and where will the resources come from for developing countries to build barriers against the sea, like those already being expanded in the Netherlands? The earlier this is planned, the smaller the socioeconomic impacts will be,” he said.

The impacts of, and forms of adaptation to, the new climate reality will be the subject of the second part of the IPCC’s fifth report, scheduled for release in January 2014. That document had contributions from seven Brazilian scientists. Another 13 Brazilians took part in preparing the third part of AR5, which discusses ways of mitigating climate change and is due out in March.

Overall, the number of scientists from developing countries, particularly Brazil, has grown within the IPCC. “Brazil is currently one of the leading countries in climate change research. Moreover, the IPCC realized that if the focus remained on developed countries alone, important information about what is happening in the tropics could be left out. And that is where the Amazon, a key ecosystem for the planet, is located,” said Artaxo.

On September 9, the Brazilian Panel on Climate Change (PBMC) released the executive summary of its first National Assessment Report (RAN1). The document, modeled on the IPCC report, indicates that in Brazil the temperature increase by 2100 will be between 1 °C and 6 °C relative to that recorded at the end of the 20th century. As a consequence, rainfall should decrease significantly across much of the central, North and Northeast regions of the country, while the South and Southeast should see more precipitation.

“Humanity has never faced a problem remotely as important as climate change, which will affect absolutely every living being on the planet. We have no global governance system to implement emission-reduction and verification measures, so it will take at least a few more decades before the problem even begins to be solved,” Artaxo said.

For the researcher, the most urgent measure is reducing greenhouse gas emissions – a commitment that must be taken on by every nation. “The awareness that we all inhabit the same boat is very strong today, but there are still no global governance mechanisms to steer that boat in the right direction. That is something our generation will have to build,” he concluded.

*   *   *

JC e-mail 4822, September 27, 2013.

IPCC launches new climate change report in Sweden (O Globo)

The document gathers projections of global warming through 2100, pointing to a temperature increase of more than 2 degrees. ‘The human influence on the climate is clear,’ the report says

The UN Intergovernmental Panel on Climate Change (IPCC) projects that global warming by the end of the 21st century will be “likely above” 2 degrees (with at least 66% probability), surpassing the limit specialists consider safe. By 2100, sea level is expected to rise a dangerous 45 to 82 centimeters in the worst scenario (or 26 to 55 centimeters in the best), and Arctic ice could shrink by up to 94% during the local summer.

In addition, the IPCC report – published on Friday morning in Stockholm, Sweden, with the most up-to-date scientific basis on climate change – finds more than 95% certainty that humans caused more than half of the average temperature rise between 1951 and 2010. Over that period, sea level rose 19 centimeters.

The document, the work of 259 scientists and government representatives from 195 countries, including Brazil, stresses that part of the CO2 emitted by humans will continue to be absorbed by the oceans. It is therefore practically certain (99% probability) that ocean acidification will increase, profoundly affecting marine life.

– The change in the planet’s surface temperature should exceed 1.5 degrees and will likely be above 2 degrees – said Thomas Stocker, co-chair of the working group. – It is very likely that heat waves will occur more frequently and last longer. As the Earth warms, we expect currently wet regions to receive more rainfall and arid ones less, although there will be exceptions.

The specialists made four projections for different greenhouse gas emission pathways. All show a temperature increase. The mildest ranges from 0.3 °C to 1.7 °C and would require deep cuts in emissions. In the most pessimistic scenario, warming would be between 2.6 °C and 4.8 °C.

– In my opinion, the report is very good, full of information, all of it very well grounded – commented the Brazilian specialist Suzana Kahn, part of the group of IPCC researchers in Stockholm. – Ultimately, the big gain is the confirmation of what has been said for some time, with much more information on the role of the oceans, clouds and aerosols. This is very important for the scientific community, because it points to areas that need further investigation.

Kahn says that the difficulty of producing projections and analyses for the Southern Hemisphere caused considerable controversy. The big problem is the lack of data:

– I think this was useful in confirming that we were right to create the Brazilian Panel on Climate Change to fill this gap.

According to Carlos Rittl, coordinator of WWF-Brasil’s Climate Change and Energy Program, the IPCC report leaves a clear message: global warming is incontestable, and action against its worst effects must begin now.

– The IPCC report reaffirms some certainties and goes further. It points to the loss of ice mass, rising sea levels, more rain where it already rains heavily, and less moisture in the most arid regions. In Brazil, the semiarid region may become even drier; the South and Southeast, on the other hand, may get more rain than today – Rittl said. – It is important to stress that the report deals with averages. In some regions of the country there could be an increase of six degrees, with peaks above eight. We are not prepared for that. And our emissions put us on a very high-risk trajectory, closer to the worst scenarios.

For Rittl, science is sending a clear message: emissions must fall. That means bringing climate-related issues into political decision-making:

– Global warming is still a marginal topic today. Investments in infrastructure and energy, the Safra plan, all the big investments on the scale of trillions of reais have practically no connection to a low-carbon logic. We are not being very responsible.

Diplomacy and science
US Secretary of State John Kerry stressed that the IPCC report should not be left forgotten in a cupboard, nor read as a political piece produced by politicians: “it is science,” he summed up. According to him, the United States is “deeply committed to combating climate change,” reducing greenhouse gas emissions and investing in efficient forms of energy.

– This is one more wake-up call: those who deny the science or look for excuses to avoid action are playing with fire – Kerry said. – The bottom line of the IPCC report is this: climate change is real, it is happening now, human beings are the cause of this transformation, and only human action can save the world from its worst impacts.

Released roughly every six years, the IPCC reports have come under criticism from specialists. The process of review of the documents by government representatives ends up producing something halfway between diplomacy and science, explains Emilio La Rovere, a researcher at Coppe/UFRJ.

– The result is a middle ground between diplomacy and science – La Rovere said.

One of the most controversial points is the so-called hiatus: periods of around 15 years in which the planet’s average temperature does not rise. This week, for example, Spain’s State Meteorological Agency announced that the last quarter in the Iberian Peninsula was the least warm since 2008. The IPCC’s 2007 report, however, did not mention these pauses in warming, handing an argument to the skeptics.

Limiting global warming
The IPCC’s findings are important for devising strategies to combat climate change. At the Climate Convention in Denmark (COP-15, held in 2009), a target was set of limiting global warming to 2 degrees. For that to happen by 2050, explains Emilio La Rovere of Coppe/UFRJ, emissions would have to be cut by 80% relative to 1990:

– The mathematical models simulating demographic trends, the world economy, and energy supply and demand show that it is really almost impossible to reach this goal.

This is the IPCC’s Fifth Assessment Report, to be released in four parts between September 2013 and November 2014. On Friday, the Working Group I document (on the scientific aspects of climate change) was published. Working Group II (analyzing impacts, adaptation and vulnerability) meets in Yokohama, Japan, from March 25 to 29, 2014. Working Group III (on mitigating the impacts of climate change) is scheduled for April 7 to 11 in Berlin, Germany. Finally, a synthesis report will be produced, with work taking place from October 27 to 31 in Copenhagen, Denmark.

(Cláudio Motta/O Globo)

http://oglobo.globo.com/ciencia/ipcc-lanca-na-suecia-novo-relatorio-sobre-as-mudancas-climaticas-10173671#ixzz2g6D6tmU6

Related coverage from O Globo:

Report highlights how human action aggravates warming
http://oglobo.globo.com/ciencia/relatorio-destaca-como-acao-humana-agrava-aquecimento-1-10171896#ixzz2g6MfZIsl

See also:

Folha de S.Paulo
Report raises its tone of alarm over warming
http://www1.folha.uol.com.br/fsp/saudeciencia/131005-relatorio-sobe-tom-de-alerta-sobre-aquecimento.shtml

Agência Estado
UN blames human activities for global warming
http://www.territorioeldorado.limao.com.br/noticias/not297255.shtm

Zero Hora
Earth’s temperature will rise between 0.3 °C and 4.8 °C this century, IPCC says
http://zerohora.clicrbs.com.br/rs/geral/planeta-ciencia/noticia/2013/09/temperatura-da-terra-subira-entre-0-3c-e-4-8c-neste-seculo-aponta-ipcc-4283083.html

Correio Braziliense
Planet’s temperature will rise between 0.3 and 4.8 °C in the 21st century, IPCC says
http://www.correiobraziliense.com.br/app/noticia/ciencia-e-saude/2013/09/27/interna_ciencia_saude,390388/temperatura-do-planeta-subira-entre-0-3-e-4-8-c-no-seculo-21-diz-ipcc.shtml

 

And now it’s global COOLING! Record return of Arctic ice cap as it grows by 60% in a year (Daily Mail)

  • Almost a million more square miles of ocean covered with ice than in 2012
  • BBC reported in 2007 global warming would leave Arctic ice-free in summer by 2013
  • Publication of UN climate change report suggesting global warming caused by humans pushed back to later this month

By DAVID ROSE

PUBLISHED: 23:37 GMT, 7 September 2013 | UPDATED: 12:01 GMT, 8 September 2013

A chilly Arctic summer has left nearly a million more square miles of ocean covered with ice than at the same time last year – an increase of 60 per cent.

The rebound from 2012’s record low comes six years after the BBC reported that global warming would leave the Arctic ice-free in summer by 2013.

Instead, days before the annual autumn re-freeze is due to begin, an unbroken ice sheet more than half the size of Europe already stretches from the Canadian islands to Russia’s northern shores.


The Northwest Passage from the Atlantic to the Pacific has remained blocked by pack-ice all year. More than 20 yachts that had planned to sail it have been left ice-bound and a cruise ship attempting the route was forced to turn back.

Some eminent scientists now believe the world is heading for a period of cooling that will not end until the middle of this century – a process that would expose computer forecasts of imminent catastrophic warming as dangerously misleading.

The disclosure comes 11 months after The Mail on Sunday triggered intense political and scientific debate by revealing that global warming has ‘paused’ since the beginning of 1997 – an event that the computer models used by climate experts failed to predict.

In March, this newspaper further revealed that temperatures are about to drop below the level that the models forecast with ‘90 per cent certainty’.

The pause – which has now been accepted as real by every major climate research centre – is important, because the models’ predictions of ever-increasing global temperatures have made many of the world’s economies divert billions of pounds into ‘green’ measures to counter climate change.

Those predictions now appear gravely flawed.

THERE WON’T BE ANY ICE AT ALL! HOW THE BBC PREDICTED CHAOS IN 2007

Only six years ago, the BBC reported that the Arctic would be ice-free in summer by 2013, citing a scientist in the US who claimed this was a ‘conservative’ forecast. Perhaps it was their confidence that led more than 20 yachts to try to sail the Northwest Passage from the Atlantic to the Pacific this summer. As of last week, all these vessels were stuck in the ice, some at the eastern end of the passage in Prince Regent Inlet, others further west at Cape Bathurst.

Shipping experts said the only way these vessels were likely to be freed was by the icebreakers of the Canadian coastguard. According to the official Canadian government website, the Northwest Passage has remained ice-bound and impassable all summer.

The BBC’s 2007 report quoted scientist Professor Wieslaw Maslowski, who based his views on super-computer models and the fact that ‘we use a high-resolution regional model for the Arctic Ocean and sea ice’.

He was confident his results were ‘much more realistic’ than other projections, which ‘underestimate the amount of heat delivered to the sea ice’. Also quoted was Cambridge University expert Professor Peter Wadhams, who backed Professor Maslowski, saying his model was ‘more efficient’ than others because it ‘takes account of processes that happen internally in the ice’.

He added: ‘This is not a cycle; not just a fluctuation. In the end, it will all just melt away quite suddenly.’


The continuing furore caused by The Mail on Sunday’s revelations – which will now be amplified by the return of the Arctic ice sheet – has forced the UN’s climate change body to hold a crisis meeting.

The UN Intergovernmental Panel on Climate Change (IPCC) was due in October to start publishing its Fifth Assessment Report – a huge three-volume study issued every six or seven years. It will now hold a pre-summit in Stockholm later this month.

Leaked documents show that governments which support and finance the IPCC are demanding more than 1,500 changes to the report’s ‘summary for policymakers’. They say its current draft does not properly explain the pause.

At the heart of the row lie two questions: the extent to which temperatures will rise with carbon dioxide levels, as well as how much of the warming over the past 150 years – so far, just 0.8C – is down to human greenhouse gas emissions and how much is due to natural variability.

In its draft report, the IPCC says it is ‘95 per cent confident’ that global warming has been caused by humans – up from 90 per cent in 2007.

This claim is already hotly disputed. US climate expert Professor Judith Curry said last night: ‘In fact, the uncertainty is getting bigger. It’s now clear the models are way too sensitive to carbon dioxide. I cannot see any basis for the IPCC increasing its confidence level.’

She pointed to long-term cycles in ocean temperature, which have a huge influence on climate and suggest the world may be approaching a period similar to that from 1965 to 1975, when there was a clear cooling trend. This led some scientists at the time to forecast an imminent ice age.

Professor Anastasios Tsonis, of the University of Wisconsin, was one of the first to investigate the ocean cycles. He said: ‘We are already in a cooling trend, which I think will continue for the next 15 years at least. There is no doubt the warming of the 1980s and 1990s has stopped.

Then: NASA satellite image showing the spread of Arctic sea ice on 27 August 2012.

…And now, much bigger: the same NASA image taken on 15 August 2013.

‘The IPCC claims its models show a pause of 15 years can be expected. But that means that after only a very few years more, they will have to admit they are wrong.’

Others are more cautious. Dr Ed Hawkins, of Reading University, drew the graph published by The Mail on Sunday in March showing how far world temperatures have diverged from computer predictions. He admitted the cycles may have caused some of the recorded warming, but insisted that natural variability alone could not explain all of the temperature rise over the past 150 years.

Nonetheless, the belief that summer Arctic ice is about to disappear remains an IPCC tenet, frequently flung in the face of critics who point to the pause.

Yet there is mounting evidence that Arctic ice levels are cyclical. Data uncovered by climate historians show that there was a massive melt in the 1920s and 1930s, followed by intense re-freezes that ended only in 1979 – the year the IPCC says that shrinking began.

Professor Curry said the ice’s behaviour over the next five years would be crucial, both for understanding the climate and for future policy. ‘Arctic sea ice is the indicator to watch,’ she said.

Read more: http://www.dailymail.co.uk/news/article-2415191/Global-cooling-Arctic-ice-caps-grows-60-global-warming-predictions.html#ixzz2foMLmebr 

Drought in the semiarid region is set to worsen in the coming years – and other articles related to Conclima (Fapesp)

Researchers warn of the need for urgent adaptation and mitigation measures against the climate change impacts projected for the region (photo: Fred Jordão/Acervo ASACom)

12/09/2013

By Elton Alisson

Agência FAPESP – The prolonged drought currently afflicting Brazil’s semiarid region is expected to worsen further in the coming years because of global climate change. Urgent adaptation and mitigation measures are therefore needed, along with a rethink of the kinds of economic activity that can be developed in the region.

The assessment was made by researchers taking part in discussions on regional development and natural disasters held on September 10 during the 1st National Conference on Global Climate Change (Conclima).

Organized by FAPESP and promoted in partnership with the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), the event runs until Friday (September 13) at Espaço Apas in São Paulo.

According to data from the National Center for Risk and Disaster Management (Cenad), in the last two years alone 1,466 alerts were recorded for municipalities in the semiarid region that declared a state of emergency or public calamity because of drought and dry spells – the most recurrent natural disasters in Brazil, according to the agency.

The First National Assessment Report of the Brazilian Panel on Climate Change (PBMC) – whose executive summary was released on the opening day of Conclima – estimates that such extreme events will increase mainly in the Amazon, Cerrado and Caatinga biomes, and that the changes should intensify from mid-century through the end of the 21st century. The semiarid region will therefore suffer even more in the future from the water scarcity it already faces today, the researchers warned.

“If the situation is already serious today, the models of future climate change scenarios for Brazil indicate that the problem will be even worse. So all the adaptation and mitigation measures planned for the coming years actually have to be carried out now,” said Marcos Airton de Sousa Freitas, a water resources specialist at the National Water Agency (ANA).

According to the researcher, the semiarid region – which spans Bahia, Sergipe, Alagoas, Pernambuco, Rio Grande do Norte, Paraíba, Ceará, Piauí and the north of Minas Gerais – is now in the second year of a drought that began in 2011 and could last indefinitely.

A study by the agency, based on flow data from the region’s river basins, found that droughts in the semiarid region last 4.5 years on average. States such as Ceará, however, have already faced droughts lasting nearly nine years, followed by long periods of below-average rainfall.

According to Freitas, the region’s main reservoirs – those holding more than 10 million cubic meters of water, enough to supply the main municipalities for up to three years – are currently at around 40% of capacity on average, and the trend through the end of the year is for them to keep emptying.

“If there is no substantial inflow of water into these large reservoirs in 2013, the drought seen today in the semiarid region, which is mostly rural, could turn into an ‘urban’ drought – one affecting the population of cities supplied by pipelines from these reservoir systems,” Freitas warned.

Adaptation measures

One adaptation measure implemented in the semiarid region in recent years – and which, according to the researchers, has appreciably reduced the vulnerability of water access, especially for the dispersed rural population – is the One Million Cisterns Program (P1MC).

Launched in 2003 by the Brazilian Semiarid Articulation (ASA) – a network of more than a thousand non-governmental organizations (NGOs) working on management and policies for living with the semiarid region – the program installs a system in rural communities in which rainwater is captured by gutters on house roofs and stored in covered, semi-buried cisterns. The cisterns are built from precast cement slabs made by the community itself and can hold up to 16,000 liters of water.

The program has made it possible to harvest rainwater in places that receive up to 600 millimeters of rain per year – comparable to rainfall volumes in Europe – but where the water evaporates and is quickly lost without a mechanism to retain it, the researchers noted.

“Even with the extreme drought in the region over the last two years, we have seen that water for the dispersed rural population has been guaranteed by the program, which has already installed some 500,000 cisterns and is a policy of adaptation to extreme climate events. Together with social programs such as Bolsa Família, the One Million Cisterns program has helped soften the negative impacts of prolonged droughts in the region,” said Saulo Rodrigues Filho, a professor at the University of Brasília (UnB).

Since water is set to become an ever scarcer natural resource in the semiarid region in the coming years, Rodrigues argued for rethinking which economic activities are best suited to it.

“Agriculture may not be the most sustainable activity for the semiarid region, and there is evidence that productive activities there need to be diversified, rather than depending solely on family farming, which already faces labor shortages as rising education levels draw young people from the countryside to the cities,” Rodrigues said.

“Through more sustainable energy policies, such as solar and wind power, and by fostering activities such as crafts and tourism, it is possible to increase these populations’ resilience to droughts and severe dry spells,” he said.

Other necessary measures, Freitas noted, include reallocating water among the economic sectors that use it and selecting crops more resistant to the water scarcity the region faces.

“There are crops in the semiarid region, such as grass for cattle feed, that depend on sprinkler irrigation. It makes no sense to keep such water-hungry crops in a region that will be hit hard by climate change,” Freitas said.

Transposition of the São Francisco River

The researcher also argued that the São Francisco River transposition project has become far more necessary now – given that water scarcity is set to be an ever greater problem in the semiarid region in the coming decades – and is essential to complement the measures under way in the region to reduce the risk of water shortages.

A target of criticism and scheduled for completion in 2015, the project will carry water from the São Francisco River to the basins of the Jaguaribe River, which supplies Ceará, and the Piranhas-Açu River, which supplies Rio Grande do Norte and Paraíba.

According to a study by ANA, funded by the World Bank and involving researchers from the Federal University of Ceará among other institutions, the water availability of these two basins is expected to fall appreciably in the coming years, further aggravating the semiarid region’s water deficit.

“The São Francisco transposition has become much more necessary and should be accelerated, because it would help minimize the semiarid region’s water deficit now – a deficit expected to worsen with the projected drop in water availability in the Jaguaribe and Piranhas-Açu basins,” Freitas told Agência FAPESP.

The PBMC’s First National Assessment Report, however, indicates that the flow of the São Francisco River could fall by up to 30% by the end of the century, which would put the transposition project under threat.

Freitas countered that 70% of the São Francisco’s water volume comes from basins in the Southeast, for which climate models project increased flow over the coming decades. Moreover, he said, the total volume to be transferred to the Jaguaribe and Piranhas-Açu basins corresponds to just 2% of the São Francisco basin’s average flow.

“It is a completely different situation from the Cantareira System, for example, in which practically 90% of the water of the Piracicaba, Jundiaí and Capivari rivers is transferred to supply the São Paulo metropolitan region,” he compared.

“One can argue about the costs of the São Francisco transposition. But in terms of water needs, the project will reinforce the operation of the existing reservoir systems in the semiarid region,” he said.

According to the researcher, water is unevenly distributed across Brazil. While 48% of the total rainfall over the Amazon drains through the Amazon Basin, Freitas said, in the semiarid region only about 7% of the water that falls during the three- to four-month rainy season reaches the Jaguaribe and Piranhas-Açu basins, and much of that volume is lost to evaporation. “That is why we need to store the remaining water for the months when none will be available,” he explained.

The presentations given by researchers at the conference, which ends on the 13th, will be available at: www.fapesp.br/conclima

 

Changes in Brazil’s climate through 2100

10/09/2013

By Elton Alisson*

Mais calor, menos chuva no Norte e Nordeste do país e mais chuva no Sul e Sudeste são algumas das projeções do Relatório de Avaliação Nacional do Painel Brasileiro de Mudanças Climáticas (foto: Eduardo Cesar/FAPESP)

Agência FAPESP – O clima no Brasil nas próximas décadas deverá ser mais quente – com aumento gradativo e variável da temperatura média em todas as regiões do país entre 1 ºC e 6 ºC até 2100, em comparação à registrada no fim do século 20.

No mesmo período, também deverá diminuir significativamente a ocorrência de chuvas em grande parte das regiões central, Norte e Nordeste do país. Nas regiões Sul e Sudeste, por outro lado, haverá um aumento do número de precipitações.

As conclusões são do primeiro Relatório de Avaliação Nacional (RAN1) do Painel Brasileiro de Mudanças Climáticas (PBMC), cujo sumário executivo foi divulgado nesta segunda-feira (09/08), durante a 1ª Conferência Nacional de Mudanças Climáticas Globais (Conclima). Organizado pela FAPESP e promovido com a Rede Brasileira de Pesquisa e Mudanças Climáticas Globais (Rede Clima) e o Instituto Nacional de Ciência e Tecnologia para Mudanças Climáticas (INCT-MC), oevento ocorre até a próxima sexta-feira (13/09), no Espaço Apas, em São Paulo.

According to the report, since climate change and its impacts on populations and economic sectors in the coming years will not be uniform across the country, Brazil needs to take regional differences into account when developing adaptation and mitigation measures and agricultural, energy-generation and water-supply policies for these different regions.

Divided into three parts, Report 1 – now in the final stage of preparation – presents regionalized projections of the climate changes expected in Brazil's six biomes through 2100, and indicates their estimated impacts and possible ways to mitigate them.

The projections were based on reviews of studies conducted between 2007 and early 2013 by 345 researchers from various fields who are members of the PBMC, and on scientific results from global and regional climate modeling.

"The report is being prepared along the same lines as those published by the Intergovernmental Panel on Climate Change [IPCC], which does not conduct research but assesses studies already published," said José Marengo, a researcher at the National Institute for Space Research (Inpe) and coordinator of the meeting.

"After much work and interaction, we arrived at the main results of the three working groups [Scientific basis of climate change; Impacts, vulnerabilities and adaptation; and Mitigation of climate change]," he noted.

Main conclusions

One of the report's conclusions is that extreme droughts and prolonged dry spells, especially in the Amazon, Cerrado and Caatinga biomes, are expected to become more frequent, and that these changes should intensify from mid-century onward.

Temperature in the Amazon is projected to rise progressively by 1 °C to 1.5 °C by 2040 – with a 25% to 30% decrease in rainfall –, by 3 °C to 3.5 °C in the period from 2041 to 2070 – with a 40% to 45% reduction in rainfall –, and by 5 °C to 6 °C between 2071 and 2100.

While the climate changes associated with global change may compromise the biome in the long term, current deforestation driven by intense land-use activity poses a more immediate threat to the Amazon, the report's authors note.

The researchers point out that observational and numerical-modeling studies suggest that, if deforestation were to reach 40% of the region in the future, there would be a drastic change in the hydrological cycle, with a 40% reduction in rainfall from July through November – which would lengthen the dry season and warm the biome's surface by up to 4 °C.

The regional changes caused by deforestation would thus add to those stemming from global change, creating conditions conducive to the savannization of the Amazon – a problem likely to be most critical in the eastern part of the region, the researchers stress.

"The projections will allow a better analysis of this problem of Amazon savannization, which, we now see, may occur at particular points in the forest rather than across the biome as a whole, as some studies had predicted," said Tércio Ambrizzi, one of the coordinating lead authors of the executive summary of the working group on the scientific basis of climate change.

The Caatinga's temperature is also expected to rise by 0.5 °C to 1 °C, with rainfall in the biome decreasing by 10% to 20%, by 2040. Between 2041 and 2070, the region's climate should become 1.5 °C to 2.5 °C warmer, with rainfall falling by 25% to 35%. By the end of the century, the biome's temperature is projected to rise progressively by 3.5 °C to 4.5 °C, with rainfall down by 40% to 50%. Such changes could trigger the desertification of the biome.

In the Cerrado, in turn, temperature is expected to rise by 5 °C to 5.5 °C and rainfall to fall by 35% to 45% by 2100. In the Pantanal, warming of 3.5 °C to 4.5 °C is projected by the end of the century, with a sharp decline in rainfall – a drop of 35% to 45%.

In the case of the Atlantic Forest, because the biome spans areas from the country's South through the Southeast and up to the Northeast, the projections point to two distinct regimes of climate change.

In the Northeastern portion, a relatively small temperature increase – between 0.5 °C and 1 °C – and a decrease in precipitation of around 10% are expected by 2040. Between 2041 and 2070, the region's climate should warm by 2 °C to 3 °C, with rainfall decreasing by 20% to 25%. For the end of the century – 2071 to 2100 – conditions of intense warming are projected, with a 3 °C to 4 °C temperature increase and a 30% to 35% decrease in rainfall.

In the South and Southeast portions, the projections indicate a relatively small temperature increase of 0.5 °C to 1 °C by 2040, with a 5% to 10% increase in rainfall. Between 2041 and 2070, the trends should continue, with a gradual rise of 1.5 °C to 2 °C in temperature and of 15% to 20% in rainfall.

These trends are expected to intensify further by the end of the century, when the climate should be 2.5 °C to 3 °C warmer and 25% to 30% rainier.

Finally, for the Pampa, the projections indicate that by 2040 the region's climate will be 5% to 10% rainier and up to 1 °C warmer. Between 2041 and 2070, the biome's temperature should rise by 1 °C to 1.5 °C, with rainfall intensifying by 15% to 20%. The projections for 2071 to 2100 are more severe, with a temperature increase of 2.5 °C to 3 °C and rainfall 35% to 40% above normal.

"What we observe, broadly speaking, is that in Brazil's North and Northeast regions the trend is toward rising temperature and decreasing rainfall over the course of the century," Ambrizzi summarized.

"In the regions farther south, the trend inverts: there is a tendency toward increases both in temperature – though less intense – and in precipitation," he compared.

Impacts and adaptation

The changes in precipitation patterns across the country's regions caused by climate change are expected to have direct impacts on agriculture, on power generation and distribution, and on water resources, since water should become scarcer in the North and Northeast and more abundant in the South and Southeast, the researchers warn.

It will therefore be necessary to develop region-specific adaptation and mitigation measures and to review investment decisions, such as the construction of hydroelectric plants in the eastern Amazon, where river flows could fall by as much as 20%, the researchers cautioned.

"These variations in impacts show that any planned strategy for power generation in the eastern Amazon is under threat, because there is a series of vulnerabilities," said Eduardo Assad, a researcher at the Brazilian Agricultural Research Corporation (Embrapa).

"There will be water to count on. But until when, and where to find it in these regions, remain unknowns," said the researcher, one of the coordinators of the report's Working Group 2, on impacts, vulnerabilities and adaptation.

According to Assad, implementing climate-change adaptation measures in Brazil is very costly because of the country's vulnerabilities, both natural – with great variation in landscapes – and socioeconomic.

"Most of the Brazilian population – especially those living in the country's coastal regions – is vulnerable to the impacts of climate change. Solving that will not be easy," he estimated.

Among the country's economic sectors, according to Assad, agriculture is one of the few that have been moving ahead to adapt to the impacts of climate change.

"We have been working on adaptation for more than eight years. It is possible to develop cultivars tolerant of high temperatures or of water deficits [in soils]," said Assad.

The researcher also stressed that population groups with the worst income, education and housing conditions will suffer the impacts of climate change most intensely. "We will have to make quick decisions to keep tragedies from happening."

Mitigation

Mercedes Bustamante, a professor at the University of Brasília (UnB) and one of the coordinators of Working Group 3, on mitigation of climate change, presented a synthesis of studies and research on the topic, identifying knowledge gaps and future directions in a global-warming scenario.

Bustamante noted that the reduction in deforestation rates between 2005 and 2010 – from 2.03 billion tonnes of CO2 equivalent to 1.25 billion tonnes – has already had positive effects in reducing greenhouse gas (GHG) emissions from land use.

"Emissions from energy generation and agriculture, however, increased in both absolute and relative terms, indicating changes in the profile of Brazilian emissions," she said.

If current policies are maintained, emissions from the energy and transport sectors are projected to rise 97% by 2030. Reversing that picture will require greater energy efficiency, more technological innovation and policies encouraging the use of renewable energy.

In transport, the recommendations range from transforming the modal mix – heavily based on road transport – to the adoption of new fuel technologies. "We need to shift from individual to collective transport, investing, for example, in waterway systems and in electric and hybrid vehicles," Bustamante stressed.

The new GHG emissions profile reveals a growing share of methane – of animal origin – and of nitrous oxide – linked to fertilizer use. "Despite these results, agriculture has advanced in developing mitigation and adaptation strategies," she noted.

For industry, responsible for 4% of GHG emissions, the list of mitigation recommendations includes recycling, the use of renewable biomass and energy cogeneration, among other measures.

Climate-change mitigation strategies also require a review of urban planning to make buildings sustainable as well – for example, by controlling timber consumption and ensuring greater energy efficiency in civil construction.

Information for society

The researchers involved in writing the report highlighted that among the document's virtues is that it gathers previously scattered data from scientific studies conducted in Brazil over recent years and makes credible technical-scientific information available to society and to decision-makers, helping them develop adaptation and mitigation strategies for the possible impacts of climate change.

"We scientists face the challenge of conveying the seriousness and gravity of the moment and the opportunities that global climate change holds for society. We know that inaction is the least intelligent action society can take," said Paulo Nobre, coordinator of Rede Clima.

Celso Lafer, president of FAPESP, in turn stressed at the opening of the event that the Foundation has a special interest in climate change research, expressed in the FAPESP Research Program on Global Climate Change (PFPMCG), which the institution maintains.

"One of FAPESP's basic concerns is to research and ascertain the impact of global climate change on what specifically affects Brazil and the State of São Paulo," he said.

Also taking part in the opening were Bruno Covas, São Paulo State Secretary of the Environment; Carlos Nobre, Secretary of Research and Development Policies and Programs (Seped) at the Ministry of Science, Technology and Innovation (MCTI); and Paulo Artaxo, a member of the PFPMCG coordination.

Carlos Nobre stressed that the report will be the main source of information guiding the National Climate Change Plan, which is currently under revision.

"It is very important that the results of this study guide the work in Brasília and in various parts of Brazil, at a critical moment of reorienting national policy, which must move toward making the economy, society and the environment more resilient to the inevitable climate changes to come," he said.

According to him, Brazil has already signaled its commitment to mitigation, embodied in the National Climate Change Policy, which calls for emissions reductions of 10% and 15% in 2010 and 2020, respectively, relative to 2005.

"In 2009, São Paulo launched an ambitious program to cut emissions by 20%, since land-use change is not a major issue in the state; what matters there is technological progress in energy generation and in production processes. Brazil is the only developing country with voluntary emissions-reduction targets."

He noted, however, that "adaptation has been left unattended." "It is not just about mitigating; we also need to adapt to climate change. The three research networks – Rede Clima, the INCT and FAPESP – are advancing on adaptation, which is the guide for sustainable development."

* With contributions from Claudia Izique and Noêmia Lopes

 

A model for understanding climate variations in Brazil

10/09/2013

By Noêmia Lopes

The Brazilian Earth System Model is considered fundamental for climate research (photo: Nasa)

Agência FAPESP – Amazon deforestation, forest fires, and the interaction processes between the Atlantic Ocean and the atmosphere are some of the climate questions particular to Brazil that the Brazilian Earth System Model (BESM) takes into account, and that it aims to handle in a way that even the world's best models cannot.

The model was presented in detail on Monday (09/09) at the opening of the 1st National Conference on Global Climate Change, held in São Paulo.

Determining how much the climate has already changed, how much its characteristics will still change, and where this will occur is the basis for planning public policies for adapting to the impacts of global climate change, said Paulo Nobre, general coordinator of the BESM and of the Brazilian Research Network on Global Climate Change (Rede Clima), at the opening of the event.

"The BESM is a structuring axis of climate change research in Brazil and provides inputs for the FAPESP Research Program on Global Climate Change (PFPMCG), for Rede Clima and for the National Institute of Science and Technology for Climate Change (INCT-MC)," Nobre said.

To help improve predictions of the consequences of global warming and the resulting increase in the frequency of extreme events, the BESM runs on Tupã, the Rede Clima/PFPMCG supercomputer, combining atmospheric, ocean, land-surface and, soon, chemical fluxes. Acquired by the Ministry of Science, Technology and Innovation (MCTI) with support from FAPESP, Tupã is installed at the National Institute for Space Research (Inpe) unit in Cachoeira Paulista.

Although it is not yet in its final, complete version, the model can already, for example, reconstruct the occurrence of recent El Niño events and estimate the phenomenon's return, explain rainfall in the South Atlantic Convergence Zone, and even estimate the variation of ice across the globe satisfactorily.

"The most recent scenarios, from 2013, are available on the supercomputer. We are taking steps to open access to the general public by the end of the year," Nobre said.

Thirty researchers are directly involved in developing the BESM and another 40 are involved indirectly. According to Nobre, this number should double in five years and double again in 10. "For that, we need ever more young PhDs to join this challenge."

Nobre also underscored the serious consequences that the increased frequency of extreme events – such as droughts, floods and tornadoes – has on people's lives: "Our cities and our countryside were not designed to live with this new climate, which makes Brazil's entry into climate change modeling essential," he said.

The four components

Iracema Cavalcanti, a researcher at Inpe, presented on the first day of the conference the results already obtained with the BESM's atmospheric component.

Compared with observed data, the model achieves good simulations of seasonal variability (climatological changes across the seasons), moisture fluxes, interannual variability (differences between pressure anomalies), precipitation variability and more. "There are still shortcomings, as in most global models. But the general features are captured," Cavalcanti said.

As for the oceans, they cannot be separated from what happens in the atmospheric model, which is why coupled modeling is strategic for the BESM to work efficiently. That is the assessment of Leo Siqueira, also of Inpe, who presented the challenges of this component.

"We already have good representations, but we want to improve the simulations of ocean temperature, especially in the tropics, and of sea ice. We also want to open a healthy discussion about which land-ice model to adopt within the BESM," Siqueira said.

The modeling of the third component, the land surface, was presented by Marcos Heil Costa of the Federal University of Viçosa (UFV). "The main function of a model that integrates surface processes is to provide fluxes of energy and water vapor between vegetation and the atmosphere. That is the basics. But improvements to the system over recent years have allowed other processes to be incorporated," he said.

Some of these processes are: fluxes of radiation, energy and mass; terrestrial carbon and nitrogen cycles; recovery of abandoned areas; fires in natural vegetation; agricultural crops; seasonal streamflow and flooded areas; human land use (deforestation); ecosystem-specific representation; and soil fertility, among others.

Chemistry is the fourth and most recent component. "Without a chemical model, the others need constant adjustment," said Sérgio Correa, a researcher at Rio de Janeiro State University (UERJ). Correa leads studies on atmospheric chemistry that will refine the BESM's data once this component is coupled to the model – one of the next planned phases.

Collaborations with researchers at foreign institutions, along with training programs, are other strategies under way to improve the representations of Brazil and South America, with global reach as well.

More information about the 1st National Conference on Global Climate Change: www.fapesp.br/conclima

Anthropology and the Anthropocene (Anthropology News)

Anthropology and Environment Society

September 2013

Amelia Moore

“The Anthropocene” is a label that is gaining popularity in the natural sciences.  It refers to the pervasive influence of human activities on planetary systems and biogeochemical processes.   Devised by Earth scientists, the term is poised to formally end the Holocene Epoch as the geological categorization for Earth’s recent past, present, and indefinite future.  The term is also poised to become the informal slogan of a revitalized environmental movement that has been plagued by popular indifference in recent years.

Climate change is the most well known manifestation of anthropogenic global change, but it is only one example of an Anthropocene event.  Other examples listed by the Earth sciences include biodiversity loss, changes in planetary nutrient cycling, deforestation, the hole in the ozone layer, fisheries decline, and the spread of invasive species.  This change is said to stem from the growth of the human population and the spread of resource intensive economies since the Industrial Revolution (though the initial boundary marker is in dispute with some scientists arguing for the Post-WWII era and others for the advent of agriculture as the critical tipping point).  Whatever the boundary, the Anthropocene signifies multiple anthropological opportunities.

What stance should we, as anthropologists, take towards the Anthropocene? I argue that there are two (and likely more) equally valid approaches to the Anthropocene: anthropology in the Anthropocene and anthropology of the Anthropocene. Anthropology in the Anthropocene already exists in the form of climate ethnography and work that documents the lived experience of global environmental change. Arguably, ethnographies of protected areas and transnational conservation strategies exemplify this field as well. Anthropology in the Anthropocene is characterized by an active concern for the detrimental effects of anthropogenesis on populations and communities that have been marginalized to bear the brunt of global change impacts or who have been haphazardly caught up in global change solution strategies. This work is engaged with environmental justice and oriented towards political action.

Anthropology of the Anthropocene is much smaller and less well known than anthropology in the Anthropocene, but it will be no less crucial.  Existing work in this vein includes those who take a critical stance towards climate science and politics as social processes with social consequences.  Beyond deconstruction, these critical scholars investigate what forms scientific and political assemblages create and how they participate in remaking the world anew.  Other existing research in this mode interrogates the idea of biodiversity and the historical and cultural context for the notion of anthropogenesis itself.  In the near future, we will see more work that can enquire into both the sociocultural and socioecological implications and manifestations of Anthropocene discourse, practice and logic.

I have only created cursory sketches of anthropology in the Anthropocene and anthropology of the Anthropocene here.  However, these modes are not at all mutually exclusive, and they should inspire many possibilities for future work.  The centrality of anthropos, the idea of the human, within the logics of the Anthropocene is an invitation for anthropology to renew its engagements with the natural sciences in research collaborations and as the object of research, especially the ecological and Earth sciences.

For starters, we should consider the implications of the Anthropocene idea for our understandings of history and collectivity.  If the natural world is finally gaining recognition within the authoritative sciences as intimately interconnected with human life such that these two worlds cease to be separate arenas of thought and action or take on different salience, then both the Humanities and the natural sciences need to devise more appropriate modes of analysis that can speak to emergent socioecologies.  This has begun in anthropology with some recent works of environmental health studies, political ecology, and multispecies ethnography, but is still in its infancy.

In terms of opportunities for legal and political engagement, the Anthropocene signifies possibilities for reconceptualizing environmentalism, conservation and development.  Anthropologists should be cognizant of new design paradigms and models for organizing socioecological collectives from the urban to the small island to the riparian.  We should also be on the lookout for new political collaborations and publics creating conversations utilizing multiple avenues for communication in the academic realm and beyond.  Emergent asymmetries in local and transnational markets and the formation of new multi-sited assemblages of governance should be of special importance.

In terms of science, the Anthropocene signals new horizons for studying and participating in global change science.  The rise of interdisciplinary socioecology, the biosciences of coupled natural and human complexity, geoengineering and the biotech interest in de-extinction are just a sampling of important transformations in research practices, research objects, and the shifting boundaries between the lab and the field.  Ongoing scientific reorientation will continue to yield new arguments about emergent forms of life that will participate in the creation of future assemblages, publics, and movements.

I would also like to caution against potentially unhelpful uses of the Anthropocene idea.  The term should not become a brand signifying a specific style of anthropological research.  It should not gloss over rigid solidifications of time, space, the human, or life.  We should not celebrate creativity in the Anthropocene while ignoring instances of stark social differentiation and capital accumulation, just as we should not focus on Anthropocene assemblages as only hegemonic in the oppressive sense.   Further, we should be cautious with our utilization of the crisis rhetoric surrounding events in the Anthropocene, recognizing that crisis for some can be turned into multiple forms of opportunity for others.  Finally, we must admit the possibility that the Anthropocene may not succeed in gaining lasting traction through formal designation or popularization, and we should not overstate its significance by assuming its universal acceptance.

In the next year, the Section News Column of the Anthropology and Environment Society will explore news, events, projects, and arguments from colleagues and students experimenting with various framings of the Anthropocene in addition to its regular content.  If you would like to contribute to this column, please contact Amelia Moore at a.moore4@miami.edu.

See more at: http://www.anthropology-news.org/index.php/2013/09/09/anthropology-and-the-anthropocene/

The Myth of ‘Environmental Catastrophism’ (Monthly Review)

Between October 2010 and April 2012, over 250,000 people, including 133,000 children under five, died of hunger caused by drought in Somalia. Millions more survived only because they received food aid. Scientists at the UK Met Centre have shown that human-induced climate change made this catastrophe much worse than it would otherwise have been.1

This is only the beginning: the United Nations’ 2013 Human Development Report says that without coordinated global action to avert environmental disasters, especially global warming, the number of people living in extreme poverty could increase by up to 3 billion by 2050.2 Untold numbers of children will die, killed by climate change.

If a runaway train is bearing down on children, simple human solidarity dictates that anyone who sees it should shout a warning, that anyone who can should try to stop it. It is difficult to imagine how anyone could disagree with that elementary moral imperative.

And yet some do. Increasingly, activists who warn that the world faces unprecedented environmental danger are accused of catastrophism—of raising alarms that do more harm than good. That accusation, a standard feature of right-wing attacks on the environmental movement, has recently been advanced by some left-wing critics as well. While they are undoubtedly sincere, their critique of so-called environmental catastrophism does not stand up to scrutiny.

From the Right…

The word “catastrophism” originated in nineteenth-century geology, in the debate between those who believed all geological change had been gradual and those who believed there had been episodes of rapid change. Today, the word is most often used by right-wing climate change deniers for whom it is a synonym for “alarmism.”

  • The Heartland Institute: “Climate Catastrophism Picking Up Again in the U.S. and Across the World.”3
  • A right-wing German blog: “The Climate Catastrophism Cult.”4
  • The Australian journal Quadrant: “The Chilling Costs of Climate Catastrophism.”5

Examples could be multiplied. As environmental historian Franz Mauelshagen writes, “In climate denialist circles, the word ‘climate catastrophe’ has become synonymous with ‘climate lie,’ taking the anthropogenic green house effect for a scam.”6

Those who hold such views like to call themselves “climate change skeptics,” but a more accurate term is “climate science deniers.” While there are uncertainties about the speed of change and its exact effects, there is no question that global warming is driven by greenhouse-gas emissions caused by human activity, and that if business as usual continues, temperatures will reach levels higher than any seen since before human beings evolved. Those who disagree are not skeptical, they are denying the best scientific evidence and analysis available.

The right labels the scientific consensus “catastrophism” to belittle environmentalism, and to stifle consideration of measures to delay or prevent the crisis. The real problem, they imply, is not the onrushing train, but the people who are yelling “get off the track!” Leaving the track would disrupt business as usual, and that is to be avoided at all costs.

…And From the Left

Until very recently, “catastrophism” as a political expression was pretty much the exclusive property of conservatives. When it did occur in left-wing writing, it referred to economic debates, not ecology. But in 2007 two quite different left-wing voices almost simultaneously adopted “catastrophism” as a pejorative term for radical ideas about climate change they disagreed with.

The most prominent was the late Alexander Cockburn, who in 2007 was writing regularly for The Nation and coediting the newsletter CounterPunch. To the shock of many of his admirers, he declared that "There is still zero empirical evidence that anthropogenic production of CO2 is making any measurable contribution to the world's present warming trend," and that "the human carbon footprint is of zero consequence."7 Concern about climate change was, he wrote, the result of a conspiracy "between the Greenhouser fearmongers and the nuclear industry, now largely owned by oil companies."8

Like critics on the right, Cockburn charged that the left was using climate change to sneak through reforms it could not otherwise win: “The left has bought into environmental catastrophism because it thinks that if it can persuade the world that there is indeed a catastrophe, then somehow the emergency response will lead to positive developments in terms of social and environmental justice.”9

While Cockburn’s assault on “environmental catastrophism” was shocking, his arguments did not add anything new to the climate debate. They were the same criticisms we had long heard from right-wing deniers, albeit with leftish vocabulary.

That was not the case with Leo Panitch and Colin Leys. These distinguished Marxist scholars are by no means deniers. They began their preface to the 2007 Socialist Register by noting that “environmental problems might be so severe as to potentially threaten the continuation of anything that might be considered tolerable human life,” and insisting that “the speed of development of globalized capitalism, epitomized by the dramatic acceleration of climate change, makes it imperative for socialists to deal seriously with these issues now.”

But then they wrote: “Nonetheless, it is important to try to avoid an anxiety-driven ecological catastrophism, parallel to the kind of crisis-driven economic catastrophism that announces the inevitable demise of capitalism.”10 They went on to argue that capitalism’s “dynamism and innovativeness” might enable it to use “green commerce” to escape environmental traps.

The problem with the Panitch–Leys formulation is that the threat of ecological catastrophe is not “parallel” to the view that capitalism will destroy itself. The desire to avoid the kind of mechanical determinism that has often characterized Marxist politics, where every crisis was proclaimed to be the final battle, led these thoughtful writers to confuse two very different kinds of catastrophe.

The idea that capitalism will inevitably face an insurmountable economic crisis and collapse is based on a misunderstanding of Marxist economic theory. While economic crises are endemic to capitalism, the system can always recover from them on its own terms—only class struggle, only a social revolution, can overthrow capitalism and end the crisis cycle.

Large-scale environmental damage is caused by our destructive economic system, but its effect is the potentially irreversible disruption of essential natural systems. The most dramatic example is global warming: recent research shows that the earth is now warmer than at any time in the past 6,000 years, and temperatures are rising much faster than at any time since the last Ice Age. Arctic ice and the Greenland ice sheet are disappearing faster than predicted, raising the specter of flooding in coastal areas where more than a billion people live. Extreme weather events, such as giant storms, heat waves, and droughts are becoming ever more frequent. So many species are going extinct that many scientists call it a mass extinction event, comparable to the time 66 million years ago when 75 percent of all species, including the dinosaurs, were wiped out.

As the editors of Monthly Review wrote in reply to Socialist Register, if these trends continue, “we will be faced with a different world—one in which life on the planet will be massively degraded on a scale not seen for tens of millions of years.”11 To call this “anxiety-driven ecological catastrophism, parallel to…economic catastrophism” is to equate an abstract error in economic theory with some of the strongest conclusions of modern science.

A New ‘Catastrophism’ Critique

Now a new essay, provocatively titled “The Politics of Failure Have Failed,” offers a different and more sweeping left-wing critique of “environmental catastrophism.” Author Eddie Yuen is associated with the Pacifica radio program Against the Grain, and is on the editorial board of the journal Capitalism Nature Socialism.

His paper is part of a broader effort to define and critique a body of political thought called Catastrophism, in a book by that title.12 In the book’s introduction, Sasha Lilley offers this definition:

Catastrophism presumes that society is headed for a collapse, whether economic, ecological, social, or spiritual. This collapse is frequently, but not always, regarded as a great cleansing, out of which a new society will be born. Catastrophists tend to believe that an ever-intensified rhetoric of disaster will awaken the masses from their long slumber—if the mechanical failure of the system does not make such struggles superfluous. On the left, catastrophism veers between the expectation that the worse things become, the better they will be for radical fortunes, and the prediction that capitalism will collapse under its own weight. For parts of the right, worsening conditions are welcomed, with the hope they will trigger divine intervention or allow the settling of scores for any modicum of social advance over the last century.

A political category that includes both the right and the left—and that encompasses people whose concerns might be economic, ecological, social, or spiritual—is, to say the least, unreasonably broad. It is difficult to see any analytical value in a definition that lumps together anarchists, fascists, Christian fundamentalists, right-wing conspiracy nuts, pre–1914 socialists, peak-oil theorists, obscure Trotskyist groups, and even Mao Zedong.

The definition of catastrophism becomes even more problematic in Yuen’s essay.

One Of These Things Is Not Like The Others…

Years ago, the children’s television program Sesame Street would display four items—three circles and a square, three horses and a chair, and so on—while someone sang, “One of these things is not like the others, One of these things doesn’t belong.”

I thought of that when I read Yuen’s essay.

While the book’s scope is broad, most of it focuses, as Yuen writes, on “instrumental, spurious, and sometimes maniacal versions of catastrophism—including rightwing racial paranoia, religious millenarianism, liberal panics over fascism, leftist fetishization of capitalist collapse, capitalist invocation of the ‘shock doctrine’ and pop culture cliché.”

But as Yuen admits in his first paragraph, environmentalism is a very different matter, because we are in “what is unquestionably a genuine catastrophic moment in human and planetary history…. Of all of the forms of catastrophic discourse on offer, the collapse of ecological systems is unique in that it is definitively verified by a consensus within the scientific community…. It is absolutely urgent to address this by effectively and rapidly changing the direction of human society.”

If the science is clear, if widespread ecological collapse unquestionably faces us unless action is taken, why is this topic included in a book devoted to criticizing false ideas? Does it make sense to use the same term for people who believe in an imaginary train crash and for people who are trying to stop a real crash from happening?

The answer, although he does not say so, is that Yuen is using a different definition than the one Lilley gave in her introduction. Her version used the word for the belief that some form of catastrophe will have positive results—that capitalism will collapse from internal contradictions, that God will punish all sinners, that peak oil or industrial collapse will save the planet. Yuen uses the same word for the idea that environmentalists should alert people to the threat of catastrophic environmental change and try to mobilize them to prevent or minimize it.

Thus, when he refers to “a shrill note of catastrophism” in the work of James Hansen, perhaps the world’s leading climate scientist, he is not challenging the accuracy of Hansen’s analysis, but only the “narrative strategy” of clearly stating the probable results of continuing business as usual.

Yuen insists that “the veracity of apocalyptic claims about ecological collapse are separate from their effects on social, political, and economic life.” Although “the best evidence points to cascading environmental disaster,” in his view it is self-defeating to tell people that. He makes two arguments, which we can label “practical” and “principled.”

His practical argument is that by talking about “apocalyptic scenarios” environmentalists have made people more apathetic, less likely to fight for progressive change. His principled argument is that exposing and campaigning to stop tendencies towards environmental collapse has “damaging and rightward-leaning effects”—it undermines the left, promotes reactionary policies and strengthens the ruling class.

In my opinion, he is wrong on both counts.

The Truth Shall Make You Apathetic?

In Yuen’s view, the most important question facing people who are concerned about environmental destruction is: “what narrative strategies are most likely to generate effective and radical social movements?”

He is vague about what “narrative strategies” might work, but he is very firm about what does not. He argues that environmentalists have focused on explaining the environmental crisis and warning of its consequences in the belief that this will lead people to rise up and demand change, but this is a fallacy. In reality, “once convinced of apocalyptic scenarios, many Americans become more apathetic.”

Given such a sweeping assertion, it is surprising to find that the only evidence Yuen offers is a news release describing one academic paper, based on a U.S. telephone survey conducted in 2008, that purported to show that “more informed respondents both feel less personally responsible for global warming, and also show less concern for global warming.”13

Note first that being “more informed” is not the same as being “convinced of apocalyptic scenarios” or being bombarded with “increasingly urgent appeals about fixed ecological tipping points.” On the face of it, this study does not appear to contribute to our understanding of the effects of “catastrophism.”

What’s more, reading the original paper reveals that the people described as “more informed” were self-reporting. If they said they were informed, that was accepted, and no one asked if they were listening to climate scientists or to conservative talk radio. That makes the paper’s conclusion meaningless.

Later in his essay, Yuen correctly criticizes some environmentalists and scientists who “speak of ‘everyone’ as a unified subject.” But here he accepts as credible a study that purports to show how all Americans respond to information about climate change, regardless of class, gender, race, or political leanings.

The problem with such undifferentiated claims is shown in a 2011 study that examined the impact of Americans’ political opinions on their feelings about climate change. It found that liberals and Democrats who report being well-informed are more worried about climate change, while conservatives and Republicans who report being well-informed are less worried.14 Obviously the two groups mean very different things by “well-informed.”

Even if we ignore that, the study Yuen cites is a one-time snapshot—it does not tell us what radicals really need to know, which is how things are changing. For that, a more useful survey is one that scientists at Yale University and George Mason University have conducted seven times since 2008 to show shifts in U.S. public opinion.15 Based on answers to questions about their opinions, respondents are categorized according to their attitude towards global warming. The surveys show:

  • The number of people identified as “Disengaged” or “Cautious”—those we might call apathetic or uncertain—has varied very little, accounting for between 31 percent and 35 percent of the respondents every time.
  • The categories “Dismissive” or “Doubtful”—those who lean towards denial—increased between 2008 and 2010. Since then, those groups have shrunk back almost to the 2008 level.
  • In parallel, the combined “Concerned” and “Alarmed” groups shrank between 2008 and 2010, but have since largely recovered. In September 2012—before Hurricane Sandy!—there were more than twice as many Americans in these two categories as in Dismissive/Doubtful.

Another study, published in the journal Climatic Change, used seventy-four independent surveys conducted between 2002 and 2011 to create a Climate Change Threat Index (CCTI)—a measure of public concern about climate change—and showed how it changed in response to public events. It found that public concern about climate change reached an all-time high in 2006–2007, when the Al Gore documentary An Inconvenient Truth was seen in theaters by millions of people and won an Academy Award.

The authors conclude: “Our results…show that advocacy efforts produce substantial changes in public perceptions related to climate change. Specifically, the film An Inconvenient Truth and the publicity surrounding its release produced a significant positive jump in the CCTI.”16

This directly contradicts Yuen’s view that more information about climate change causes Americans to become more apathetic. There is no evidence of a long-term increase in apathy or decrease in concern—and when scientific information about climate change reached millions of people, the result was not apathy but a substantial increase in support for action to reduce greenhouse gas emissions.

‘The Two Greatest Myths’

Yuen says environmentalists have deluged Americans with catastrophic warnings, and this strategy has produced apathy, not action. Writing of establishment politicians who make exactly the same claim, noted climate change analyst Joseph Romm says, “The two greatest myths about global warming communications are 1) constant repetition of doomsday messages has been a major, ongoing strategy and 2) that strategy doesn’t work and indeed is actually counterproductive!” Contrary to liberal mythology, the North American public has not been exposed to anything even resembling the first claim. Romm writes,

The broad American public is exposed to virtually no doomsday messages, let alone constant ones, on climate change in popular culture (TV and the movies and even online)…. The major energy companies bombard the airwaves with millions and millions of dollars of repetitious pro-fossil-fuel ads. The environmentalists spend far, far less money…. Environmentalists when they do appear in popular culture, especially TV, are routinely mocked…. It is total BS that somehow the American public has been scared and overwhelmed by repeated doomsday messaging into some sort of climate fatigue.17

The website Daily Climate, which tracks U.S. news stories about climate change, says coverage peaked in 2009, during the Copenhagen talks—but then it “fell off the map,” dropping 30 percent in 2010 and another 20 percent in 2011. In 2012, despite widespread droughts and Hurricane Sandy, news coverage fell another 2 percent. The decline in editorial interest was even more dramatic—in 2012 newspapers published fewer than half as many editorials about climate change as they did in 2009.18

It should be noted that these shifts occurred in the framework of very limited news coverage of climate issues. As a leading media analyst notes, “relative to other issues like health, medicine, business, crime and government, media attention to climate change remains a mere blip.”19 Similarly, a British study describes coverage of climate change in newspapers there as “lamentably thin”—a problem exacerbated by the fact that much of the coverage consists of “worryingly persistent climate denial stories.” The author concludes drily: “The limited coverage is unlikely to have convinced readers that climate change is a serious problem warranting immediate, decisive and potentially costly action.”20

Given Yuen’s concern that Americans do not recognize the seriousness of environmental crises, it is surprising how little he says about the massive fossil-fuel-funded disinformation campaigns that have confused and distorted media reporting. I can find just four sentences on the subject in his 9,000-word text, and not one that suggests denialist campaigns might have helped undermine efforts to build a climate change movement.

On the contrary, he downplays the influence of “the well-funded climate denial lobby,” by claiming that “far more corporate and elite energy has gone toward generating anxiety about global warming,” and that “mainstream climate science is much better funded.” He provides no evidence for either statement.

Of course, the fossil-fuel lobby is not the only force working to undermine public concern about climate change. It is also important to recognize the impact of Obama’s predictable unwillingness to confront the dominant forces in U.S. capitalism, and of the craven failure of mainstream environmentalist groups and NGOs to expose and challenge the Democrats’ anti-environmental policies.

With fossil-fuel denialists on one side, and Obama’s pale-green cheerleaders on the other, activists who want to get out the truth have barely been heard. In that context, it makes little sense to blame environmentalists for sabotaging environmentalism.

The Truth Will Help the Right?

Halfway through his essay, Yuen abruptly changes direction, leaving the practical argument behind and raising his principled concern. He now argues that what he calls catastrophism leads people to support reactionary policies and promotes “the most authoritarian solutions at the state level.” Focusing attention on what he agrees is a “cascading environmental disaster” is dangerous because it “disables the left but benefits the right and capital.” He says, “Increased awareness of environmental crisis will not likely translate into a more ecological lifestyle, let alone an activist orientation against the root causes of environmental degradation. In fact, right-wing and nationalist environmental politics have much more to gain from an embrace of catastrophism.”

Yuen says that many environmentalists, including scientists, “reflexively overlook class divisions,” and so do not realize that “some business and political elites feel that they can avoid the worst consequences of the environmental crisis, and may even be able to benefit from it.” Yuen apparently thinks those elites are right—while the insurance industry is understandably worried about big claims, he says, “the opportunities for other sectors of capitalism are colossal in scope.”

He devotes much of the rest of his essay to describing the efforts of pro-capitalist forces, conservative and liberal, to use concern about potential environmental disasters to promote their own interests, ranging from emissions trading schemes to military expansion to Malthusian attacks on the world’s poorest people. “The solution offered by global elites to the catastrophe is a further program of austerity, belt-tightening, and sacrifice, the brunt of which will be borne by the world’s poor.”

Some of this is overstated. His claim that “Malthusianism is at the core of most environmental discourse,” reflects either a very limited view of environmentalism or an excessively broad definition of Malthusianism. And he seems to endorse David Noble’s bizarre theory that public concern about global warming has been engineered by a corporate conspiracy to promote carbon trading schemes.21 Nevertheless he is correct that the ruling class will do its best to profit from concern about climate change, while simultaneously offloading the costs onto the world’s poorest people.

The question is, who is he arguing with? This book says it aims to “spur debate among radicals,” but none of this is new or controversial for radicals. The insight that the interests of the ruling class are usually opposed to the interests of the rest of us has been central to left-wing thought since before Marx was born. Capitalists always try to turn crises to their advantage no matter who gets hurt, and they always try to offload the costs of their crises onto the poor and oppressed.

What needs to be proved is not that pro-capitalist forces are trying to steer the environmental movement into profitable channels, and not that many sincere environmentalists have backward ideas about the social and economic causes of ecological crises. Radicals who are active in green movements know those things perfectly well. What needs to be proved is Yuen’s view that warning about environmental disasters and campaigning to prevent them has “damaging and rightward-leaning effects” that are so severe that radicals cannot overcome them.

But no proof is offered.

What is particularly disturbing about his argument is that he devotes pages to describing the efforts of reactionaries to misdirect concern about climate change—and none to the efforts of radical environmentalists to counter those forces. Earlier in his essay, he mentioned that “environmental and climate justice perspectives are steadily gaining traction in internal environmental debates,” but those thirteen words are all he has to say on the subject.

He says nothing about the historic 2010 Cochabamba Conference, where 30,000 environmental activists from 140 countries warned that if greenhouse gas emissions are not stopped, “the damages caused to our Mother Earth will be completely irreversible”—a statement Yuen would doubtless label “catastrophist.” Far from succumbing to apathy or reactionary policies, the participants explicitly rejected market solutions, identified capitalism as the cause of the crisis, and outlined a radical program to transform the global economy.

He is equally silent about the campaign against the fraudulent “green economy” plan adopted at last year’s Rio+20 conference. One of the principal organizers of that opposition is La Via Campesina, the world’s largest organization of peasants and farmers, which warns that the world’s governments are “propagating the same capitalist model that caused climate chaos and other deep social and environmental crises.”

His essay contains not a word about Idle No More, or Occupy, or the Indigenous-led fight against Canada’s tar sands, or the anti-fracking and anti-coal movements. By omitting them, Yuen leaves the false impression that the climate movement is helpless to resist reactionary forces.

Contrary to Yuen’s title, the effort to build a movement to save the planet has not failed. Indeed, Catastrophism was published just four months before the largest U.S. climate change demonstration ever!

The question before radicals is not what “narrative strategy” to adopt, but rather, how will we relate to the growing environmental movement? How will we support its goals while strengthening the forces that see the need for more radical solutions?

What Must Be Done?

Yuen opposes attempts to build a movement around rallies, marches, and other mass protests to get out the truth and to demand action against environmental destruction. He says that strategy worked in the 1960s, when Americans were well-off and naïve, but cannot be replicated in today’s “culture of atomized cynicism.”

Like many who know that decade only from history books or as distant memories, Yuen foreshortens the experience: he knows about the mass protests and dissent late in the decade, but ignores the many years of educational work and slow movement building in a deeply reactionary and racist time. It is not predetermined that the campaign against climate change will take as long as those struggles, or take similar forms, but the real experience of the 1960s should at least be a warning against premature declarations of failure.

Yuen is much less explicit about what he thinks would be an effective strategy, but he cites as positive examples the efforts of some to promote “a bottom-up and egalitarian transition” by:

ever-increasing numbers of people who are voluntarily engaging in intentional communities, sustainability projects, permaculture and urban farming, communing and militant resistance to consumerism…we must consider the alternative posed by the highly imaginative Italian left of the twentieth century. The explosively popular Slow Food movement was originally built on the premise that a good life can be had not through compulsive excess but through greater conviviality and a shared commonwealth.

Compare that to this list of essential tasks, prepared recently by Pablo Solón, a leading figure in the global climate justice movement:

To reduce greenhouse gas emissions to a level that avoids catastrophe, we need to:

* Leave more than two-thirds of the fossil fuel reserves under the soil;

* Stop the exploitation of tar sands, shale gas and coal;

* Support small, local, peasant and indigenous community farming while we dismantle big agribusiness that deforests and heats the planet;

* Promote local production and consumption of products, reducing the free trade of goods that send millions of tons of CO2 while they travel around the world;

* Stop extractive industries from further destroying nature and contaminating our atmosphere and our land;

* Increase significantly public transport to reduce the unsustainable “car way of life”;

* Reduce the emissions of warfare by promoting genuine peace and dismantling the military and war industry and infrastructure.22

The projects that Yuen describes are worthwhile, but unless the participants are also committed to building mass environmental campaigns, they will not be helping to achieve the vital objectives that Solón identifies. Posing local communes and slow food as alternatives to building a movement against global climate change is effectively a proposal to abandon the fight against capitalist ecocide in favor of creating greenish enclaves, while the world burns.

Bright-siding versus Movement Building

Whatever its merits in other contexts, it is not helpful or appropriate to use the word catastrophism as a synonym for telling the truth about the environmental dangers we face. Using the same language as right-wing climate science deniers gives the impression that the dangers are non-existent or exaggerated. Putting accurate environmental warnings in the same category as apocalyptic Christian fundamentalism and century-old misreadings of Marxist economic theory leads to underestimation of the threats we face and directs efforts away from mobilizing an effective counterforce.

Yuen’s argument against publicizing the scientific consensus on climate change echoes the myth that liberal politicians and journalists use to justify their failure to challenge the crimes of the fossil-fuel industry. People are tired of all that doom and gloom, they say. It is time for positive messages! Or, to use Yuen’s vocabulary, environmentalists need to end “apocalyptic rhetoric” and find better “narrative strategies.”

This is fundamentally an elitist position: the people cannot handle the truth, so a knowledgeable minority must sugarcoat it, to make the necessary changes palatable.

David Spratt of the Australian organization Climate Code Red calls that approach “bright-siding,” a reference to the bitterly satirical Monty Python song, “Always Look on the Bright Side of Life.”

The problem is, Spratt writes: “If you avoid including an honest assessment of climate science and impacts in your narrative, it’s pretty difficult to give people a grasp about where the climate system is heading and what needs to be done to create the conditions for living in climate safety, rather than increasing and eventually catastrophic harm.”23 Joe Romm makes the same point: “You’d think it would be pretty obvious that the public is not going to be concerned about an issue unless one explains why they should be concerned.”24

Of course, this does not mean that we only need to explain the science. We need to propose concrete goals, as Pablo Solón has done. We need to show how the scientific consensus about climate change relates to local and national concerns such as pipelines, tar sands, fracking, and extreme weather. We need to work with everyone who is willing to confront any aspect of the crisis, from people who still have illusions about capitalism to convinced revolutionaries. Activists in the wealthy countries must be unstinting in their political and practical solidarity with the primary victims of climate change, indigenous peoples, and impoverished masses everywhere.

We need to do all of that and more.

But the first step is to tell the truth—about the danger we face, about its causes, and about the measures that must be taken to turn back the threat. In a time of universal deceit, telling the truth is a revolutionary act.

Notes

  1.  Fraser C. Lott, Nikolaos Christidis, and Peter A. Stott, “Can the 2011 East African Drought Be Attributed to Human-Induced Climate Change?,” Geophysical Research Letters 40, no. 6 ( March 2013): 1177–81.
  2.  UNDP, “’Rise of South’ Transforming Global Power Balance, Says 2013 Human Development Report,” March 14, 2013, http://undp.org.
  3.  Tom Harris, “Climate Catastrophism Picking Up Again in the U.S. and Across the World,” Somewhat Reasonable, October 10, 2012, http://blog.heartland.org.
  4.  Pierre Gosselin, “The Climate Catastrophism Cult,” NoTricksZone, February 12, 2011, http://notrickszone.com.
  5.  Ray Evans, “The Chilling Costs of Climate Catastrophism,” Quadrant Online, June 2008, http://quadrant.org.au.
  6.  Franz Mauelshagen, “Climate Catastrophism: The History of the Future of Climate Change,” in Andrea Janku, Gerrit Schenk, and Franz Mauelshagen, Historical Disasters in Context: Science, Religion, and Politics (New York: Routledge, 2012), 276.
  7.  Alexander Cockburn, “Is Global Warming a Sin?,” CounterPunch, April 28–30, 2007, http://counterpunch.org.
  8.  Alexander Cockburn, “Who are the Merchants of Fear?,” CounterPunch, May 12–14, 2007, http://counterpunch.org.
  9.  Alexander Cockburn, “I Am An Intellectual Blasphemer,” Spiked Review of Books, January 9, 2008, http://spiked-online.com.
  10.  Leo Panitch and Colin Leys, “Preface,” Socialist Register 2007: Coming to Terms With Nature (London: Merlin Press/Monthly Review Press, 2006), ix–x.
  11.  “Notes from the Editors,” Monthly Review 58, no. 10 (March 2007), http://monthlyreview.org.
  12.  Sasha Lilley, David McNally, Eddie Yuen, and James Davis, Catastrophism: The Apocalyptic Politics of Collapse and Rebirth (Oakland: PM Press, 2012).
  13.  Yuen’s footnote cites an article which is identical to a news release issued the previous day by Texas A&M University; see “Increased Knowledge About Global Warming Leads to Apathy, Study Shows,” Science Daily, March 28, 2008, http://eurekalert.org. The original paper, which Yuen does not cite, is: P.M. Kellstedt, S. Zahran, and A. Vedlitz, “Personal Efficacy, the Information Environment, and Attitudes Towards Global Warming and Climate Change in the United States,” Risk Analysis 28, no. 1 (2008): 113–26.
  14.  Aaron M. McCright and Riley E. Dunlap, “The Politicization of Climate Change and Polarization in the American Public’s Views of Global Warming, 2001–2010,” The Sociological Quarterly 52 (2011): 155–94.
  15.  A. Leiserowitz et al., Global Warming’s Six Americas, September 2012 (New Haven, CT: Yale Project on Climate Change Communication, 2013), http://environment.yale.edu.
  16.  Robert J. Brulle, Jason Carmichael, and J. Craig Jenkins, “Shifting Public Opinion on Climate Change: An Empirical Assessment of Factors Influencing Concern Over Climate Change in the U.S., 2002–2010,” Climatic Change 114, no. 2 (September 2012): 169–88.
  17.  Joe Romm, “Apocalypse Not: The Oscars, The Media and the Myth of ‘Constant Repetition of Doomsday Messages’ on Climate,” Climate Progress, February 24, 2013, http://thinkprogress.org.
  18.  Douglas Fischer, “2010 in Review: The Year Climate Coverage ‘Fell off the Map,’” Daily Climate, January 3, 2011, http://dailyclimate.org; “Climate Coverage Down Again in 2011,” Daily Climate, January 3, 2012, http://dailyclimate.org; “Climate Coverage, Dominated by Weird Weather, Falls Further in 2012,” Daily Climate, January 2, 2013, http://dailyclimate.org.
  19.  Maxwell T. Boykoff, Who Speaks for the Climate?: Making Sense of Media Reporting on Climate Change (Cambridge: Cambridge University Press, 2011), 24.
  20.  Neil T. Gavin, “Addressing Climate Change: A Media Perspective,” Environmental Politics 18, no. 5 (September 2009): 765–80.
  21.  Two responses to David Noble are: Derrick O’Keefe, “Denying Time and Place in the Global Warming Debate,” Climate & Capitalism, June 7, 2007, http://climateandcapitalism.com; Justin Podur, “Global Warming Suspicions and Confusions,” ZNet, May 11, 2007, http://zcommunications.org.
  22.  Pablo Solón, “A Contribution to the Climate Space 2013: How to Overcome the Climate Crisis?,” Climate Space, March 14, 2013, http://climatespace2013.wordpress.com.
  23.  David Spratt, Always Look on the Bright Side of Life: Bright-siding Climate Advocacy and Its Consequences, April 2012, http://climatecodered.org.
  24.  Joe Romm, “Apocalypse Not.”

Ian Angus is editor of the online journal Climate & Capitalism. He is co-author of Too Many People? Population, Immigration, and the Environmental Crisis (Haymarket, 2011), and editor of The Global Fight for Climate Justice (Fernwood, 2010).
He would like to thank Simon Butler, Martin Empson, John Bellamy Foster, John Riddell, Javier Sethness, and Chris Williams for comments and suggestions.

Rising Seas (Nat Geo)

Picture of Seaside Heights, New Jersey, after Hurricane Sandy

As the planet warms, the sea rises. Coastlines flood. What will we protect? What will we abandon? How will we face the danger of rising seas?

By Tim Folger

Photographs by George Steinmetz

September 2013

By the time Hurricane Sandy veered toward the Northeast coast of the United States last October 29, it had mauled several countries in the Caribbean and left dozens dead. Faced with the largest storm ever spawned over the Atlantic, New York and other cities ordered mandatory evacuations of low-lying areas. Not everyone complied. Those who chose to ride out Sandy got a preview of the future, in which a warmer world will lead to inexorably rising seas.

Brandon d’Leo, a 43-year-old sculptor and surfer, lives on the Rockaway Peninsula, a narrow, densely populated, 11-mile-long sandy strip that juts from the western end of Long Island. Like many of his neighbors, d’Leo had remained at home through Hurricane Irene the year before. “When they told us the tidal surge from this storm would be worse, I wasn’t afraid,” he says. That would soon change.

D’Leo rents a second-floor apartment in a three-story house across the street from the beach on the peninsula’s southern shore. At about 3:30 in the afternoon he went outside. Waves were crashing against the five-and-a-half-mile-long boardwalk. “Water had already begun to breach the boardwalk,” he says. “I thought, Wow, we still have four and a half hours until high tide. In ten minutes the water probably came ten feet closer to the street.”

Back in his apartment, d’Leo and a neighbor, Davina Grincevicius, watched the sea as wind-driven rain pelted the sliding glass door of his living room. His landlord, fearing the house might flood, had shut off the electricity. As darkness fell, Grincevicius saw something alarming. “I think the boardwalk just moved,” she said. Within minutes another surge of water lifted the boardwalk again. It began to snap apart.

Three large sections of the boardwalk smashed against two pine trees in front of d’Leo’s apartment. The street had become a four-foot-deep river, as wave after wave poured water onto the peninsula. Cars began to float in the churning water, their wailing alarms adding to the cacophony of wind, rushing water, and cracking wood. A bobbing red Mini Cooper, its headlights flashing, became wedged against one of the pine trees in the front yard. To the west the sky lit up with what looked like fireworks—electrical transformers were exploding in Breezy Point, a neighborhood near the tip of the peninsula. More than one hundred homes there burned to the ground that night.

The trees in the front yard saved d’Leo’s house, and maybe the lives of everyone inside—d’Leo, Grincevicius, and two elderly women who lived in an apartment downstairs. “There was no option to get out,” d’Leo says. “I have six surfboards in my apartment, and I was thinking, if anything comes through the wall, I’ll try to get everyone on those boards and try to get up the block. But if we’d had to get in that water, it wouldn’t have been good.”

After a fitful night’s sleep d’Leo went outside shortly before sunrise. The water had receded, but thigh-deep pools still filled parts of some streets. “Everything was covered with sand,” he says. “It looked like another planet.”

A profoundly altered planet is what our fossil-fuel-driven civilization is creating, a planet where Sandy-scale flooding will become more common and more destructive for the world’s coastal cities. By releasing carbon dioxide and other heat-trapping gases into the atmosphere, we have warmed the Earth by more than a full degree Fahrenheit over the past century and raised sea level by about eight inches. Even if we stopped burning all fossil fuels tomorrow, the existing greenhouse gases would continue to warm the Earth for centuries. We have irreversibly committed future generations to a hotter world and rising seas.

In May the concentration of carbon dioxide in the atmosphere reached 400 parts per million, the highest in at least three million years. Sea levels then may have been as much as 65 feet above today’s; the Northern Hemisphere was largely ice free year-round. It would take centuries for the oceans to reach such catastrophic heights again, and much depends on whether we manage to limit future greenhouse gas emissions. In the short term scientists are still uncertain about how fast and how high seas will rise. Estimates have repeatedly been too conservative.

Global warming affects sea level in two ways. About a third of its current rise comes from thermal expansion—from the fact that water grows in volume as it warms. The rest comes from the melting of ice on land. So far it’s been mostly mountain glaciers, but the big concern for the future is the giant ice sheets in Greenland and Antarctica. Six years ago the Intergovernmental Panel on Climate Change (IPCC) issued a report predicting a maximum of 23 inches of sea-level rise by the end of this century. But that report intentionally omitted the possibility that the ice sheets might flow more rapidly into the sea, on the grounds that the physics of that process was poorly understood.

As the IPCC prepares to issue a new report this fall, in which the sea-level forecast is expected to be slightly higher, gaps in ice-sheet science remain. But climate scientists now estimate that Greenland and Antarctica combined have lost on average about 50 cubic miles of ice each year since 1992—roughly 200 billion metric tons of ice annually. Many think sea level will be at least three feet higher than today by 2100. Even that figure might be too low.
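The volume-to-mass figure quoted here can be checked with a quick back-of-the-envelope conversion (a sketch only; the ice density used is an assumed typical value for glacial ice, not a number from the article):

```python
# Check: ~50 cubic miles of ice lost per year vs. ~200 billion metric tons.
MILE_M = 1609.344       # meters per mile
ICE_DENSITY = 917.0     # kg/m^3, typical glacial ice (assumed)

volume_m3 = 50 * MILE_M**3                     # ~2.08e11 cubic meters
mass_tonnes = volume_m3 * ICE_DENSITY / 1000   # kg -> metric tons

print(f"{mass_tonnes:.2e} metric tons")        # ~1.9e11, i.e. roughly 200 billion
```

The result lands at about 190 billion metric tons, consistent with the article’s rounded figure.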

“In the last several years we’ve observed accelerated melting of the ice sheets in Greenland and West Antarctica,” says Radley Horton, a research scientist at Columbia University’s Earth Institute in New York City. “The concern is that if the acceleration continues, by the time we get to the end of the 21st century, we could see sea-level rise of as much as six feet globally instead of two to three feet.” Last year an expert panel convened by the National Oceanic and Atmospheric Administration adopted 6.6 feet (two meters) as its highest of four scenarios for 2100. The U.S. Army Corps of Engineers recommends that planners consider a high scenario of five feet.

One of the biggest wild cards in all sea-level-rise scenarios is the massive Thwaites Glacier in West Antarctica. Four years ago NASA sponsored a series of flights over the region that used ice-penetrating radar to map the seafloor topography. The flights revealed that a 2,000-foot-high undersea ridge holds the Thwaites Glacier in place, slowing its slide into the sea. A rising sea could allow more water to seep between ridge and glacier and eventually unmoor it. But no one knows when or if that will happen. “That’s one place I’m really nervous about,” says Richard Alley, a glaciologist at Penn State University and an author of the last IPCC report. “It involves the physics of ice fracture that we really don’t understand.” If the Thwaites Glacier breaks free from its rocky berth, that would liberate enough ice to raise sea level by three meters—nearly ten feet. “The odds are in our favor that it won’t put three meters in the ocean in the next century,” says Alley. “But we can’t absolutely guarantee that. There’s at least some chance that something very nasty will happen.”

Even in the absence of something very nasty, coastal cities face a twofold threat: Inexorably rising oceans will gradually inundate low-lying areas, and higher seas will extend the ruinous reach of storm surges. The threat will never go away; it will only worsen. By the end of the century a hundred-year storm surge like Sandy’s might occur every decade or less. Using a conservative prediction of a half meter (20 inches) of sea-level rise, the Organisation for Economic Co-operation and Development estimates that by 2070, 150 million people in the world’s large port cities will be at risk from coastal flooding, along with $35 trillion worth of property—an amount that will equal 9 percent of the global GDP. How will they cope?

“During the last ice age there was a mile or two of ice above us right here,” says Malcolm Bowman, as we pull into his driveway in Stony Brook, New York, on Long Island’s north shore. “When the ice retreated, it left a heap of sand, which is Long Island. All these rounded stones you see—look there,” he says, pointing to some large boulders scattered among the trees near his home. “They’re glacial boulders.”

Bowman, a physical oceanographer at the State University of New York at Stony Brook, has been trying for years to persuade anyone who will listen that New York City needs a harbor-spanning storm-surge barrier. Compared with some other leading ports, New York is essentially defenseless in the face of hurricanes and floods. London, Rotterdam, St. Petersburg, New Orleans, and Shanghai have all built levees and storm barriers in the past few decades. New York paid a high price for its vulnerability last October. Sandy left 43 dead in the city, of whom 35 drowned; it cost the city some $19 billion. And it was all unnecessary, says Bowman.

“If a system of properly designed storm-surge barriers had been built—and strengthened with sand dunes at both ends along the low-lying coastal areas—there would have been no flooding damage from Sandy,” he says.

Bowman envisions two barriers: one at Throgs Neck, to keep surges from Long Island Sound out of the East River, and a second one spanning the harbor south of the city. Gates would accommodate ships and tides, closing only during storms, much like existing structures in the Netherlands and elsewhere. The southern barrier alone, stretching five miles between Sandy Hook, New Jersey, and the Rockaway Peninsula, might cost $10 billion to $15 billion, Bowman estimates. He pictures a six-lane toll highway on top that would provide a bypass route around the city and a light-rail line connecting Newark and John F. Kennedy Airports.

“It could be an asset to the region,” says Bowman. “Eventually the city will have to face up to this, because the problem is going to get worse. It might take five years of study and another ten years to get the political will to do it. By then there might have been another disaster. We need to start planning immediately. Otherwise we’re mortgaging the future and leaving the next generation to cope as best it can.”

Another way to safeguard New York might be to revive a bit of its past. In the 16th-floor loft of her landscape architectural firm in lower Manhattan, Kate Orff pulls out a map of New York Harbor in the 19th century. The present-day harbor shimmers outside her window, calm and unthreatening on an unseasonably mild morning three months to the day after Sandy hit.

“Here’s an archipelago that protected Red Hook,” Orff says, pointing on the map to a small cluster of islands off the Brooklyn shore. “There was another chain of shoals that connected Sandy Hook to Coney Island.”

The islands and shallows vanished long ago, demolished by harbor-dredging and landfill projects that added new real estate to a burgeoning city. Orff would re-create some of them, particularly the Sandy Hook–Coney Island chain, and connect them with sluice gates that would close during a storm, forming an eco-engineered barrier that would cross the same waters as Bowman’s more conventional one. Behind it, throughout the harbor, would be dozens of artificial reefs built from stone, rope, and wood pilings and seeded with oysters and other shellfish. The reefs would continue to grow as sea levels rose, helping to buffer storm waves—and the shellfish, being filter feeders, would also help clean the harbor. “Twenty-five percent of New York Harbor used to be oyster beds,” Orff says.

Orff estimates her “oystertecture” vision could be brought to life at relatively low cost. “It would be chump change compared with a conventional barrier. And it wouldn’t be money wasted: Even if another Sandy never happens, you’d have a cleaner, restored harbor in a more ecologically vibrant context and a healthier New York.”

In June, Mayor Michael Bloomberg outlined a $19.5 billion plan to defend New York City against rising seas. “Sandy was a temporary setback that can ultimately propel us forward,” he said. The mayor’s proposal calls for the construction of levees, local storm-surge barriers, sand dunes, oyster reefs, and more than 200 other measures. It goes far beyond anything planned by any other American city. But the mayor dismissed the idea of a harbor barrier. “A giant barrier across our harbor is neither practical nor affordable,” Bloomberg said. The plan notes that since a barrier would remain open most of the time, it would not protect the city from the inch-by-inch creep of sea-level rise.

Meanwhile, development in the city’s flood zones continues. Klaus Jacob, a geophysicist at Columbia University, says the entire New York metropolitan region urgently needs a master plan to ensure that future construction will at least not exacerbate the hazards from rising seas.

“The problem is we’re still building the city of the past,” says Jacob. “The people of the 1880s couldn’t build a city for the year 2000—of course not. And we cannot build a year-2100 city now. But we should not build a city now that we know will not function in 2100. There are opportunities to renew our infrastructure. It’s not all bad news. We just have to grasp those opportunities.”

Will New York grasp them after Bloomberg leaves office at the end of this year? And can a single storm change not just a city’s but a nation’s policy? It has happened before. The Netherlands had its own stormy reckoning 60 years ago, and it transformed the country.

The storm roared in from the North Sea on the night of January 31, 1953. Ria Geluk was six years old at the time and living where she lives today, on the island of Schouwen Duiveland in the southern province of Zeeland. She remembers a neighbor knocking on the door of her parents’ farmhouse in the middle of the night to tell them that the dike had failed. Later that day the whole family, along with several neighbors who had spent the night, climbed to the roof, where they huddled in blankets and heavy coats in the wind and rain. Geluk’s grandparents lived just across the road, but water swept into the village with such force that they were trapped in their home. They died when it collapsed.

“Our house kept standing,” says Geluk. “The next afternoon the tide came again. My father could see around us what was happening; he could see houses disappearing. You knew when a house disappeared, the people were killed. In the afternoon a fishing boat came to rescue us.”

In 1997 Geluk helped found the Watersnoodmuseum—the “flood museum”—on Schouwen Duiveland. The museum is housed in four concrete caissons that engineers used to plug dikes in 1953. The disaster killed 1,836 in all, nearly half in Zeeland, including a baby born on the night of the storm.

Afterward the Dutch launched an ambitious program of dike and barrier construction called the Delta Works, which lasted more than four decades and cost more than six billion dollars. One crucial project was the five-mile-long Oosterscheldekering, or Eastern Scheldt barrier, completed 27 years ago to defend Zeeland from the sea. Geluk points to it as we stand on a bank of the Scheldt estuary near the museum, its enormous pylons just visible on the horizon. The final component of the Delta Works, a movable barrier protecting Rotterdam Harbor and some 1.5 million people, was finished in 1997.

Like other primary sea barriers in the Netherlands, it’s built to withstand a 1-in-10,000-year storm—the strictest standard in the world. (The United States uses a 1-in-100-year standard.) The Dutch government is now considering whether to upgrade the protection levels to bring them in line with sea-level-rise projections.

Such measures are a matter of national security for a country where 26 percent of the land lies below sea level. With more than 10,000 miles of dikes, the Netherlands is fortified to such an extent that hardly anyone thinks about the threat from the sea, largely because much of the protection is so well integrated into the landscape that it’s nearly invisible.

On a bitingly cold February afternoon I spend a couple of hours walking around Rotterdam with Arnoud Molenaar, the manager of the city’s Climate Proof program, which aims to make Rotterdam resistant to the sea levels expected by 2025. About 20 minutes into our walk we climb a sloping street next to a museum designed by the architect Rem Koolhaas. The presence of a hill in this flat city should have alerted me, but I’m surprised when Molenaar tells me that we’re walking up the side of a dike. He gestures to some nearby pedestrians. “Most of the people around us don’t realize this is a dike either,” he says. The Westzeedijk shields the inner city from the Meuse River a few blocks to the south, but the broad, busy boulevard on top of it looks like any other Dutch thoroughfare, with flocks of cyclists wheeling along in dedicated lanes.

As we walk, Molenaar points out assorted subtle flood-control structures: an underground parking garage designed to hold 10,000 cubic meters—more than 2.5 million gallons—of rainwater; a street flanked by two levels of sidewalks, with the lower one designed to store water, leaving the upper walkway dry. Late in the afternoon we arrive at Rotterdam’s Floating Pavilion, a group of three connected, transparent domes on a platform in a harbor off the Meuse. The domes, about three stories tall, are made of a plastic that’s a hundred times as light as glass.

Inside we have sweeping views of Rotterdam’s skyline; hail clatters overhead as low clouds scud in from the North Sea. Though the domes are used for meetings and exhibitions, their main purpose is to demonstrate the wide potential of floating urban architecture. By 2040 the city anticipates that as many as 1,200 homes will float in the harbor. “We think these structures will be important not just for Rotterdam but for many cities around the world,” says Bart Roeffen, the architect who designed the pavilion. The homes of 2040 will not necessarily be domes; Roeffen chose that shape for its structural integrity and its futuristic appeal. “To build on water is not new, but to develop floating communities on a large scale and in a harbor with tides—that is new,” says Molenaar. “Instead of fighting against water, we want to live with it.”

While visiting the Netherlands, I heard one joke repeatedly: “God may have built the world, but the Dutch built Holland.” The country has been reclaiming land from the sea for nearly a thousand years—much of Zeeland was built that way. Sea-level rise does not yet panic the Dutch.

“We cannot retreat! Where could we go? Germany?” Jan Mulder has to shout over the wind—we’re walking along a beach called Kijkduin as volleys of sleet exfoliate our faces. Mulder is a coastal morphologist with Deltares, a private coastal management firm. This morning he and Douwe Sikkema, a project manager with the province of South Holland, have brought me to see the latest in adaptive beach protection. It’s called the zandmotor—the sand engine.

The seafloor offshore, they explain, is thick with hundreds of feet of sand deposited by rivers and retreating glaciers. North Sea waves and currents once distributed that sand along the coast. But as sea level has risen since the Ice Age, the waves no longer reach deep enough to stir up sand, and the currents have less sand to spread around. Instead the sea erodes the coast here.

The typical solution would be to dredge sand offshore and dump it directly on the eroding beaches—and then repeat the process year after year as the sand washes away. Mulder and his colleagues recommended that the provincial government try a different strategy: a single gargantuan dredging operation to create the sandy peninsula we’re walking on—a hook-shaped stretch of beach the size of 250 football fields. If the scheme works, over the next 20 years the wind, waves, and tides will spread its sand 15 miles up and down the coast. The combination of wind, waves, tides, and sand is the zandmotor.

The project started only two years ago, but it seems to be working. Mulder shows me small dunes that have started to grow on a beach where there was once open water. “It’s very flexible,” he says. “If we see that sea-level rise increases, we can increase the amount of sand.” Sikkema adds, “And it’s much easier to adjust the amount of sand than to rebuild an entire system of dikes.”

Later Mulder tells me about a memorial inscription affixed to the Eastern Scheldt barrier in Zeeland: “It says, ‘Hier gaan over het tij, de maan, de wind, en wij—Here the tide is ruled by the moon, the wind, and us.’ ” It reflects the confidence of a generation that took for granted, as we no longer can, a reasonably stable world. “We have to understand that we are not ruling the world,” says Mulder. “We need to adapt.”

With the threats of climate change and sea-level rise looming over us all, cities around the world, from New York to Ho Chi Minh City, have turned to the Netherlands for guidance. One Dutch firm, Arcadis, has prepared a conceptual design for a storm-surge barrier in the Verrazano Narrows to protect New York City. The same company helped design a $1.1 billion, two-mile-long barrier that protected New Orleans from a 13.6-foot storm surge last summer, when Hurricane Isaac hit. The Lower Ninth Ward, which suffered so greatly during Hurricane Katrina, was unscathed.

“Isaac was a tremendous victory for New Orleans,” Piet Dircke, an Arcadis executive, tells me one night over dinner in Rotterdam. “All the barriers were closed; all the levees held; all the pumps worked. You didn’t hear about it? No, because nothing happened.”

New Orleans may be safe for a few decades, but the long-term prospects for it and other low-lying cities look dire. Among the most vulnerable is Miami. “I cannot envision southeastern Florida having many people at the end of this century,” says Hal Wanless, chairman of the department of geological sciences at the University of Miami. We’re sitting in his basement office, looking at maps of Florida on his computer. At each click of the mouse, the years pass, the ocean rises, and the peninsula shrinks. Freshwater wetlands and mangrove swamps collapse—a death spiral that has already started on the southern tip of the peninsula. With seas four feet higher than they are today—a distinct possibility by 2100—about two-thirds of southeastern Florida is inundated. The Florida Keys have almost vanished. Miami is an island.

When I ask Wanless if barriers might save Miami, at least in the short term, he leaves his office for a moment. When he returns, he’s holding a foot-long cylindrical limestone core. It looks like a tube of gray, petrified Swiss cheese. “Try to plug this up,” he says. Miami and most of Florida sit atop a foundation of highly porous limestone. The limestone consists of the remains of countless marine creatures deposited more than 65 million years ago, when a warm, shallow sea covered what is now Florida—a past that may resemble the future here.

A barrier would be pointless, Wanless says, because water would just flow through the limestone beneath it. “No doubt there will be some dramatic engineering feats attempted,” he says. “But the limestone is so porous that even massive pumping systems won’t be able to keep the water out.”

Sea-level rise has already begun to threaten Florida’s freshwater supply. About a quarter of the state’s 19 million residents depend on wells sunk into the enormous Biscayne aquifer. Salt water is now seeping into it from dozens of canals that were built to drain the Everglades. For decades the state has tried to control the saltwater influx by building dams and pumping stations on the drainage canals. These “salinity-control structures” maintain a wall of fresh water behind them to block the underground intrusion of salt water. To offset the greater density of salt water, the freshwater level in the control structures is generally kept about two feet higher than the encroaching sea.
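The roughly two-foot freshwater head follows from a simple hydrostatic balance: because seawater is about 2.5 percent denser than fresh water, a freshwater column must stand about 2.5 percent taller to exert the same pressure at depth. A minimal sketch, assuming typical density values and an illustrative interface depth (neither figure comes from the article):

```python
# Hydrostatic sketch of the "two feet higher" freshwater head.
# Densities are assumed typical values; 80 ft is an illustrative depth.
RHO_FRESH = 1000.0   # kg/m^3, fresh water
RHO_SALT = 1025.0    # kg/m^3, seawater

def fresh_head_needed(salt_depth_ft):
    """Extra height of fresh water needed to balance a salt-water
    column of the given depth (equal pressure at the bottom)."""
    return salt_depth_ft * (RHO_SALT / RHO_FRESH - 1)

# For an interface a few dozen feet down, the required head is about 2 ft:
print(round(fresh_head_needed(80), 1))  # 2.0
```

The same ratio explains why even modest sea-level rise erodes the margin: as the sea climbs, the freshwater side must climb with it.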

But the control structures also serve a second function: During the state’s frequent rainstorms their gates must open to discharge the flood of fresh water to the sea. “We have about 30 salinity-control structures in South Florida,” says Jayantha Obeysekera, the chief hydrological modeler at the South Florida Water Management District. “At times now the water level in the sea is higher than the freshwater level in the canal.” That both accelerates saltwater intrusion and prevents the discharge of flood waters. “The concern is that this will get worse with time as the sea-level rise accelerates,” Obeysekera says.

Using fresh water to block the salt water will eventually become impractical, because the amount of fresh water needed would submerge ever larger areas behind the control structures, in effect flooding the state from the inside. “With 50 centimeters [about 20 inches] of sea-level rise, 80 percent of the salinity-control structures in Florida will no longer be functional,” says Wanless. “We’ll either have to drown communities to keep the freshwater head above sea level or have saltwater intrusion.” When sea level rises two feet, he says, Florida’s aquifers may be poisoned beyond recovery. Even now, during unusually high tides, seawater spouts from sewers in Miami Beach, Fort Lauderdale, and other cities, flooding streets.

In a state exposed to hurricanes as well as rising seas, people like John Van Leer, an oceanographer at the University of Miami, worry that one day they will no longer be able to insure—or sell—their houses. “If buyers can’t insure it, they can’t get a mortgage on it. And if they can’t get a mortgage, you can only sell to cash buyers,” Van Leer says. “What I’m looking for is a climate-change denier with a lot of money.”

Unless we change course dramatically in the coming years, our carbon emissions will create a world utterly different in its very geography from the one in which our species evolved. “With business as usual, the concentration of carbon dioxide in the atmosphere will reach around a thousand parts per million by the end of the century,” says Gavin Foster, a geochemist at the University of Southampton in England. Such concentrations, he says, haven’t been seen on Earth since the early Eocene epoch, 50 million years ago, when the planet was completely ice free. According to the U.S. Geological Survey, sea level on an iceless Earth would be as much as 216 feet higher than it is today. It might take thousands of years and more than a thousand parts per million to create such a world—but if we burn all the fossil fuels, we will get there.

No matter how much we reduce our greenhouse gas emissions, Foster says, we’re already locked in to at least several feet of sea-level rise, and perhaps several dozens of feet, as the planet slowly adjusts to the amount of carbon that’s in the atmosphere already. A recent Dutch study predicted that the Netherlands could engineer solutions at a manageable cost to a rise of as much as five meters, or 16 feet. Poorer countries will struggle to adapt to much less. At different times in different places, engineering solutions will no longer suffice. Then the retreat from the coast will begin. In some places there will be no higher ground to retreat to.

By the next century, if not sooner, large numbers of people will have to abandon coastal areas in Florida and other parts of the world. Some researchers fear a flood tide of climate-change refugees. “From the Bahamas to Bangladesh and a major amount of Florida, we’ll all have to move, and we may have to move at the same time,” says Wanless. “We’re going to see civil unrest, war. You just wonder how—or if—civilization will function. How thin are the threads that hold it all together? We can’t comprehend this. We think Miami has always been here and will always be here. How do you get people to realize that Miami—or London—will not always be there?”

What will New York look like in 200 years? Klaus Jacob, the Columbia geophysicist, sees downtown Manhattan as a kind of Venice, subject to periodic flooding, perhaps with canals and yellow water cabs. Much of the city’s population, he says, will gather on high ground in the other boroughs. “High ground will become expensive, waterfront will become cheap,” he says. But among New Yorkers, as among the rest of us, the idea that the sea is going to rise—a lot—hasn’t really sunk in yet. Of the thousands of people in New York State whose homes were badly damaged or destroyed by Sandy’s surge, only 10 to 15 percent are expected to accept the state’s offer to buy them out at their homes’ pre-storm value. The rest plan to rebuild.

The Science of Why We Don’t Believe Science (Mother Jones)

How our brains fool us on climate, creationism, and the end of the world.

By  | Mon Apr. 18, 2011 3:00 AM PDT


“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger, in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?


At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience [7] (PDF): Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia [8] of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber [9] of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt [10]: We may think we’re being scientists, but we’re actually being lawyers [11] (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.

Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment [12] (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes [13], and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan [14] and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research [15] (PDF), individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study [16] (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family [16]) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan [17].

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles [18] (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad [19] and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study [20] (PDF) in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org [21], a team at Ohio State presented subjects [22] (PDF) with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick [23]. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to test the fallacy [6] (PDF) that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study [24] (PDF) showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast [25]” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey [26], for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz [27], director of the Yale Project on Climate Change Communication [28]. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr. [29]) and numerous Hollywood celebrities (most notably Jenny McCarthy [30] and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin [31], author of the new book The Panic Virus [32], notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined [33] by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper [34]—which originated the current vaccine scare—was retracted and he subsequently lost his license [35] (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study [36], he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.


Links:
[1] https://motherjones.com/files/lfestinger.pdf
[2] http://www.powells.com/biblio/61-9781617202803-1
[3] http://motherjones.com/environment/2011/04/history-of-climategate
[4] http://motherjones.com/environment/2011/04/field-guide-climate-change-skeptics
[5] http://www.ncbi.nlm.nih.gov/pubmed/2270237
[6] http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf
[7] https://motherjones.com/files/descartes.pdf
[8] http://www-personal.umich.edu/~lupia/
[9] http://www.stonybrook.edu/polsci/ctaber/
[10] http://people.virginia.edu/~jdh6n/
[11] https://motherjones.com/files/emotional_dog_and_rational_tail.pdf
[12] http://synapse.princeton.edu/~sam/lord_ross_lepper79_JPSP_biased-assimilation-and-attitude-polarization.pdf
[13] http://psp.sagepub.com/content/23/6/636.abstract
[14] http://www.law.yale.edu/faculty/DKahan.htm
[15] https://motherjones.com/files/kahan_paper_cultural_cognition_of_scientific_consesus.pdf
[16] http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers
[17] http://seagrant.oregonstate.edu/blogs/communicatingclimate/transcripts/Episode_10b_Dan_Kahan.html
[18] http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf
[19] http://www.sociology.northwestern.edu/faculty/prasad/home.html
[20] http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf
[21] http://www.factcheck.org/
[22] http://www.comm.ohio-state.edu/kgarrett/FactcheckMosqueRumors.pdf
[23] http://communication.stanford.edu/faculty/krosnick/
[24] http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf
[25] http://en.wikipedia.org/wiki/Narrowcasting
[26] http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming
[27] http://environment.yale.edu/profile/leiserowitz/
[28] http://environment.yale.edu/climate/
[29] http://www.huffingtonpost.com/robert-f-kennedy-jr-and-david-kirby/vaccine-court-autism-deba_b_169673.html
[30] http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html
[31] http://sethmnookin.com/
[32] http://www.powells.com/biblio/1-9781439158647-0
[33] http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on/article_print
[34] http://www.thelancet.com/journals/lancet/article/PIIS0140673697110960/fulltext
[35] http://www.gmc-uk.org/Wakefield_SPM_and_SANCTION.pdf_32595267.pdf
[36] http://www.scribd.com/doc/3446682/The-Second-National-Risk-and-Culture-Study-Making-Sense-of-and-Making-Progress-In-The-American-Culture-War-of-Fact

Climate change poses grave threat to security, says UK envoy (The Guardian)

Rear Admiral Neil Morisetti, special representative to foreign secretary, says governments can’t afford to wait for 100% certainty

The Guardian, Sunday 30 June 2013 18.19 BST

Flooding in Thailand in 2011

Flooding in Thailand in 2011. Photograph: Narong Sangnak/EPA

Climate change poses as grave a threat to the UK’s security and economic resilience as terrorism and cyber-attacks, according to a senior military commander who was appointed as William Hague’s climate envoy this year.

In his first interview since taking up the post, Rear Admiral Neil Morisetti said climate change was “one of the greatest risks we face in the 21st century”, particularly because it presented a global threat. “By virtue of our interdependencies around the world, it will affect all of us,” he said.

He argued that climate change was a potent threat multiplier at choke points in the global trade network, such as the Straits of Hormuz, through which much of the world’s traded oil and gas is shipped.

Morisetti left a 37-year naval career to become the foreign secretary’s special representative for climate change, and represents the growing influence of hard-headed military thinking in the global warming debate.

The link between climate change and global security risks is on the agenda of the UK’s presidency of the G8, including a meeting to be chaired by Morisetti in July that will include assessment of hotspots where climate stress is driving migration.

Morisetti’s central message was simple and stark: “The areas of greatest global stress and greatest impacts of climate change are broadly coincidental.”

He said governments could not afford to wait until they had all the information they might like. “If you wait for 100% certainty on the battlefield, you’ll be in a pretty sticky state,” he said.

The increased threat posed by climate change arises because droughts, storms and floods are exacerbating water, food, population and security tensions in conflict-prone regions.

“Just because it is happening 2,000 miles away does not mean it is not going to affect the UK in a globalised world, whether it is because food prices go up, or because increased instability in an area – perhaps around the Middle East or elsewhere – causes instability in fuel prices,” Morisetti said.

“In fact it is already doing so,” he added, noting that Toyota’s UK car plants had been forced to switch to a three-day week after extreme floods in Thailand cut the supply chain. Computer firms in California and Poland were left short of microchips by the same floods.

Morisetti is far from the only military figure emphasising the climate threat to security. America’s top officer tackling the threat from North Korea and China has said the biggest long-term security issue in the region is climate change.

In a recent interview, Admiral Samuel J Locklear III, who led the US naval action in Libya that helped topple Muammar Gaddafi, said a significant event related to the warming planet was “the most likely thing that is going to happen that will cripple the security environment, probably more likely than the other scenarios we all often talk about”.

There is a reason why the military are so clear-headed about the climate threat, according to Professor John Schellnhuber, a scientist who briefed the UN security council on the issue in February and formerly advised the German chancellor, Angela Merkel.

“The military do not deal with ideology. They cannot afford to: they are responsible for the lives of people and billions of pounds of investment in equipment,” he said. “When the climate change deniers took their stance after the Copenhagen summit in 2009, it is very interesting that the military people were never shaken from the idea that we are about to enter a very difficult period.”

He added: “This danger of the creation of violent conflicts is the strongest argument why we should keep climate change under control, because the international system is not stable, and the slightest thing, like the food riots in the Middle East, could make the whole system explode.”

The military has been quietly making known its concern about the climate threat to security for some time. General Wesley Clark, who commanded the Nato bombing of Yugoslavia during the Kosovo war, said in 2005: “Stopping global warming is not just about saving the environment, it’s about securing America for our children and our children’s children, as well.”

In the same year Chuck Hagel, now Obama’s defence secretary, said: “I don’t think you can separate environmental policy from economic policy or energy policy.”

Morisetti said there was also a direct link between climate change and the military because of the latter’s huge reliance on fossil fuels. “In Afghanistan, where we have had to import all our energy into the country along a single route that has been disrupted, the US military have calculated that for every 24 convoys there has been a casualty. There is a cost associated in bringing in that energy in both blood and treasure.

“So to drive up efficiency and to use alternative fuels, wind, solar, makes eminent sense to the military,” he said, noting that the use of solar blankets in Afghanistan meant fewer fuel resupply missions. “The principles of delivering your outputs more effectively, reducing your risks and reducing your costs reads across far more widely than just the military: most businesses would be looking for that too.”

Morisetti’s former employer, the Ministry of Defence, agrees that the climate threat is a serious one. The last edition of the Global Strategic Trends analysis published by the MoD’s Development, Concepts and Doctrine Centre concludes: “Climate change will amplify existing social, political and resource stresses, shifting the tipping point at which conflict ignites … Out to 2040, there are few convincing reasons to suggest that the world will become more peaceful.”

Schellnhuber was also clear about the consequences of failing to curb global warming. “The last 11,000 years – the Holocene – was characterised by the extreme stability of global climate. It is the only period when human civilisation could have developed at all,” he said. “But I don’t think a global, interconnected world can be managed in peace if climate change means we are leaving the Holocene. Let’s pray we will have a Lincoln or a Gorbachev to lead us.”

Centro brasileiro aumenta em quatro vezes a precisão da previsão do tempo (JC/O Globo)

JC e-mail 4746, de 13 de Junho de 2013.

Novo modelo do CPTEC, que usa o supercomputador Tupã, consegue mapear com resolução de cinco quilômetros quadrados. Reportagem de O Globo

Os olhos da previsão do tempo no Brasil passaram a enxergar melhor. Com quatro vezes mais precisão, mais precisamente. O Centro de Previsão do Tempo e Estudos Climáticos (CPTEC/INPE) lançou uma atualização do modelo Brams de previsão, turbinado agora pela alta capacidade de processamento do supercomputador Tupã, instalado em Cachoeira Paulista. Antes, o Brams fazia previsões de até uma semana com nitidez de 20 quilômetros quadrados. Agora, a resolução é de 5 quilômetros quadrados para os mesmos sete dias.

With the new version, the level of detail of the forecast, previously limited to a city or region, can now distinguish one neighborhood from another. The new meteorological model can be consulted free of charge on the CPTEC website.

To cover all of South America, Brams divides the territory like a giant game of Battleship, into a grid of 1,360 by 1,480 cells. Since the model is three-dimensional, each of these cells also has 55 vertical levels. In total, 110 million points are processed simultaneously on Tupã's 9,600 processors.
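The quoted grid figures are easy to sanity-check: 1,360 × 1,480 columns with 55 vertical levels each do come to roughly 110 million points. A quick back-of-the-envelope check (the variable names here are mine, not CPTEC's):

```python
# BRAMS 5.0 grid dimensions as quoted in the article.
NX, NY, NZ = 1360, 1480, 55  # horizontal cells (x, y) and vertical levels

horizontal_cells = NX * NY
total_points = horizontal_cells * NZ

print(f"horizontal cells: {horizontal_cells:,}")  # 2,012,800
print(f"total grid points: {total_points:,}")     # 110,704,000, i.e. ~110 million

# Going from 20 km to 5 km spacing quadruples the resolution in each
# horizontal direction, i.e. ~16x more columns over the same domain.
print(f"column count factor: {(20 // 5) ** 2}x")  # 16x
```

Note that "four times more precise" refers to the linear grid spacing; the number of grid columns over the same domain grows by the square of that, a factor of sixteen, which is why the upgrade needed the new supercomputer.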

According to CPTEC, version 5.0 of Brams puts Brazil in a competitive position alongside the world's main operational centers. The forecasting branch of the National Centers for Environmental Prediction (NCEP), for example, generates forecasts from a similar model – the National Mesoscale Model – with a 4-kilometer resolution, 70 vertical levels, and a 1,371 x 1,100-cell grid covering the entire continental United States.

To develop this new version of the BRAMS model, which is also used for air-pollution forecasting and monitoring, data from weather stations across the country, satellites, ocean buoys, and aircraft imagery are used.

http://oglobo.globo.com/ciencia/centro-brasileiro-aumenta-em-quatro-vezes-precisao-da-previsao-do-tempo-8667823

* * *

Inpe launches very-high-resolution weather forecasting model

The new model covers all of South America

A new version of the Brams regional weather forecast model, which covers all of South America, has been released by the Center for Weather Forecasting and Climate Studies (CPTEC) of the National Institute for Space Research (Inpe/MCTI). Brams version 5.0 is already operational for forecasts of up to seven days.

The model generates forecasts with a spatial resolution of 5 kilometers, whereas the previous version provided forecasts at a 20-kilometer resolution. The advance was only possible thanks to the high processing capacity of Inpe's new Cray supercomputer, Tupã, installed at CPTEC in Cachoeira Paulista (SP).

Making the new version of Brams operational took about a year of development. Covering the full extent of South America required 1,360 x 1,480 horizontal cells and 55 vertical levels. The grid cells, roughly 110 million in total, are processed simultaneously, in parallel, on the Cray's 9,600 processors.

This effort, coordinated by the Atmospheric Modeling and Interfaces Group (Gmai), has placed CPTEC in a competitive position relative to the world's main operational centers. The forecasting branch of the National Centers for Environmental Prediction (NCEP), for example, generates forecasts from a similar model – the National Mesoscale Model – with a 4-kilometer resolution, 70 vertical levels, and a 1,371 x 1,100-cell grid covering the entire continental United States.

To develop this new version of the Brams model, which is also used for air-pollution forecasting and monitoring, a non-hydrostatic formulation was adopted, which represents smaller-scale physical processes – such as the development and dissipation of clouds and rainfall – with greater precision. Several advances in parameterization (the mathematical representation of physical processes) were made for clouds, solar radiation, and surface processes and dynamics.

(Inpe Communications Office)

Climate Researchers Discover New Rhythm for El Niño (Science Daily)

May 27, 2013 — El Niño wreaks havoc across the globe, shifting weather patterns that spawn droughts in some regions and floods in others. The impacts of this tropical Pacific climate phenomenon are well known and documented.

This is a schematic figure for the suggested generation mechanism of the combination tone: The annual cycle (Tone 1), together with the El Niño sea surface temperature anomalies (Tone 2) produce the combination tone. (Credit: Malte Stuecker)

A mystery, however, has remained despite decades of research: Why does El Niño always peak around Christmas and end quickly by February to April?

Now there is an answer: An unusual wind pattern that straddles the equatorial Pacific during strong El Niño events and swings back and forth with a period of 15 months explains El Niño’s close ties to the annual cycle. This finding is reported in the May 26, 2013, online issue of Nature Geoscience by scientists from the University of Hawai’i at Manoa Meteorology Department and International Pacific Research Center.

“This atmospheric pattern peaks in February and triggers some of the well-known El Niño impacts, such as droughts in the Philippines and across Micronesia and heavy rainfall over French Polynesia,” says lead author Malte Stuecker.

When anomalous trade winds shift south they can terminate an El Niño by generating eastward propagating equatorial Kelvin waves that eventually resume upwelling of cold water in the eastern equatorial Pacific. This wind shift is part of the larger, unusual atmospheric pattern accompanying El Niño events, in which a high-pressure system hovers over the Philippines and the major rain band of the South Pacific rapidly shifts equatorward.
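A rough timescale makes this termination mechanism concrete. The numbers below are ballpark textbook values I am assuming (a first-baroclinic Kelvin wave speed of roughly 2.5 m/s and an equatorial Pacific width of roughly 15,000 km), not figures from the article:

```python
# Order-of-magnitude crossing time for an eastward equatorial Kelvin wave.
# Both values below are assumptions for illustration, not from the article.
wave_speed_m_s = 2.5        # typical first-baroclinic Kelvin wave speed (~2-3 m/s)
basin_width_m = 15_000e3    # approximate width of the equatorial Pacific

crossing_seconds = basin_width_m / wave_speed_m_s
crossing_days = crossing_seconds / 86_400  # seconds per day

print(f"basin crossing time: {crossing_days:.0f} days")  # ~69 days
```

On these assumptions a wind shift in the western Pacific takes on the order of two months to restore upwelling in the east, which fits the observed rapid demise of El Niño events between February and April.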

With the help of numerical atmospheric models, the scientists discovered that this unusual pattern originates from an interaction between El Niño and the seasonal evolution of temperatures in the western tropical Pacific warm pool.

“Not all El Niño events are accompanied by this unusual wind pattern,” notes Malte Stuecker, “but once El Niño conditions reach a certain threshold amplitude during the right time of the year, it is like a jack-in-the-box whose lid pops open.”

A study of the evolution of the anomalous wind pattern in the model reveals a rhythm of about 15 months accompanying strong El Niño events, which is considerably faster than the three- to five-year timetable for El Niño events, but slower than the annual cycle.

“This type of variability is known in physics as a combination tone,” says Fei-Fei Jin, professor of Meteorology and co-author of the study. Combination tones have been known for more than three centuries. They were discovered by the violinist Tartini, who realized that our ear can create a third tone even though only two tones are played on a violin.
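The combination-tone arithmetic is easy to verify. A minimal sketch, assuming the roughly 15-month rhythm is the difference tone between the annual cycle and an El Niño frequency at the slow end of the quoted three-to-five-year range:

```python
# Difference combination tone of the annual cycle and ENSO.
# The 5-year El Niño period is an assumed value within the quoted range.
f_annual = 1.0        # annual cycle, in cycles per year
f_enso = 1.0 / 5.0    # El Niño frequency for an assumed 5-year period

f_combination = f_annual - f_enso        # difference tone, cycles per year
period_months = 12.0 / f_combination

print(f"combination-tone period: {period_months:.0f} months")  # 15 months
```

The sum tone (f_annual + f_enso) would give a faster, sub-annual rhythm, so the 15-month pattern described above corresponds to the difference of the two frequencies.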

“The unusual wind pattern straddling the equator during an El Niño is such a combination tone between El Niño events and the seasonal march of the sun across the equator,” says co-author Axel Timmermann, climate scientist at the International Pacific Research Center and professor at the Department of Oceanography, University of Hawai’i. He adds, “It turns out that many climate models have difficulties creating the correct combination tone, which is likely to impact their ability to simulate and predict El Niño events and their global impacts.”

The scientists are convinced that a better representation of the 15-month tropical Pacific wind pattern in climate models will improve El Niño forecasts. Moreover, they say the latest climate model projections suggest that El Niño events will be accompanied more often by this combination tone wind pattern, which will also change the characteristics of future El Niño rainfall patterns.

Journal Reference:

  1. Malte F. Stuecker, Axel Timmermann, Fei-Fei Jin, Shayne McGregor, Hong-Li Ren. A combination mode of the annual cycle and the El Niño/Southern Oscillation. Nature Geoscience, 2013; DOI: 10.1038/ngeo1826

Global Warming Caused by CFCs, Not Carbon Dioxide, Researcher Claims in Controversial Study (Science Daily)

May 30, 2013 — Chlorofluorocarbons (CFCs) are to blame for global warming since the 1970s and not carbon dioxide, according to a researcher from the University of Waterloo in a controversial new study published in the International Journal of Modern Physics B this week.

Annual global temperature over land and ocean. (Credit: Image by Q.-B. Lu)

CFCs are already known to deplete ozone, but in-depth statistical analysis now suggests that CFCs are also the key driver in global climate change, rather than carbon dioxide (CO2) emissions, the researcher argues.

“Conventional thinking says that the emission of human-made non-CFC gases such as carbon dioxide has mainly contributed to global warming. But we have observed data going back to the Industrial Revolution that convincingly shows that conventional understanding is wrong,” said Qing-Bin Lu, a professor of physics and astronomy, biology and chemistry in Waterloo’s Faculty of Science. “In fact, the data shows that CFCs conspiring with cosmic rays caused both the polar ozone hole and global warming.”

“Most conventional theories expect that global temperatures will continue to increase as CO2 levels continue to rise, as they have done since 1850. What’s striking is that since 2002, global temperatures have actually declined — matching a decline in CFCs in the atmosphere,” Professor Lu said. “My calculations of CFC greenhouse effect show that there was global warming by about 0.6 °C from 1950 to 2002, but the earth has actually cooled since 2002. The cooling trend is set to continue for the next 50-70 years as the amount of CFCs in the atmosphere continues to decline.”

The findings are based on in-depth statistical analyses of observed data from 1850 up to the present time, Professor Lu’s cosmic-ray-driven electron-reaction (CRE) theory of ozone depletion and his previous research into Antarctic ozone depletion and global surface temperatures.

“It was generally accepted for more than two decades that the Earth’s ozone layer was depleted by the sun’s ultraviolet light-induced destruction of CFCs in the atmosphere,” he said. “But in contrast, CRE theory says cosmic rays — energy particles originating in space — play the dominant role in breaking down ozone-depleting molecules and then ozone.”

Lu’s theory has been confirmed by ongoing observations of cosmic ray, CFC, ozone and stratospheric temperature data over several 11-year solar cycles. “CRE is the only theory that provides us with an excellent reproduction of 11-year cyclic variations of both polar ozone loss and stratospheric cooling,” said Professor Lu. “After removing the natural cosmic-ray effect, my new paper shows a pronounced recovery by ~20% of the Antarctic ozone hole, consistent with the decline of CFCs in the polar stratosphere.”

By demonstrating the link between CFCs, ozone depletion and temperature changes in the Antarctic, Professor Lu was able to draw almost perfect correlation between rising global surface temperatures and CFCs in the atmosphere.

“The climate in the Antarctic stratosphere has been completely controlled by CFCs and cosmic rays, with no CO2 impact. The change in global surface temperature after the removal of the solar effect has shown zero correlation with CO2 but a nearly perfect linear correlation with CFCs — a correlation coefficient as high as 0.97.”
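For reference, the "correlation coefficient as high as 0.97" quoted here is a standard Pearson coefficient. A minimal, dependency-free implementation on toy data (the series below are illustrative, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Toy series with a strong (but not perfect) linear relationship:
x = [1, 2, 3, 4, 5, 6]
y = [1.1, 1.9, 3.2, 3.8, 5.1, 6.0]
print(round(pearson_r(x, y), 3))  # close to 1
```

As a statistical caveat, any two smoothly trending series will show a high Pearson r, so a strong correlation alone does not distinguish between candidate drivers of warming.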

Data recorded from 1850 to 1970, before any significant CFC emissions, show that CO2 levels increased significantly as a result of the Industrial Revolution, but that global temperature, excluding the solar effect, remained nearly constant. The conventional CO2 warming model suggests temperatures should have risen by 0.6°C over the same period, similar to the period of 1970-2002.

The analyses support Lu’s CRE theory and point to the success of the Montreal Protocol on Substances that Deplete the Ozone Layer.

“We’ve known for some time that CFCs have a really damaging effect on our atmosphere and we’ve taken measures to reduce their emissions,” Professor Lu said. “We now know that international efforts such as the Montreal Protocol have also had a profound effect on global warming but they must be placed on firmer scientific ground.”

“This study underlines the importance of understanding the basic science underlying ozone depletion and global climate change,” said Terry McMahon, dean of the faculty of science. “This research is of particular importance not only to the research community, but to policy makers and the public alike as we look to the future of our climate.”

Professor Lu’s paper, “Cosmic-Ray-Driven Reaction and Greenhouse Effect of Halogenated Molecules: Culprits for Atmospheric Ozone Depletion and Global Climate Change,” also predicts that the global sea level will continue to rise for some years as the ozone hole recovers, increasing ice melting in the polar regions.

“Only when the effect of the global temperature recovery dominates over that of the polar ozone hole recovery, will both temperature and polar ice melting drop concurrently,” says Lu.

The peer-reviewed paper published this week not only provides new fundamental understanding of the ozone hole and global climate change but has superior predictive capabilities, compared with the conventional sunlight-driven ozone-depleting and CO2-warming models, Lu argues.

Journal Reference:

  1. Q.-B. Lu. Cosmic-Ray-Driven Reaction and Greenhouse Effect of Halogenated Molecules: Culprits for Atmospheric Ozone Depletion and Global Climate Change. International Journal of Modern Physics B, 2013; 1350073. DOI: 10.1142/S0217979213500732