Tag archive: Previsão (Forecast)

Global Networks Must Be Redesigned, Experts Urge (Science Daily)

May 1, 2013 — Our global networks have generated many benefits and new opportunities. However, they have also established highways for failure propagation, which can ultimately result in human-made disasters. For example, today’s quick spreading of emerging epidemics is largely a result of global air traffic, with serious impacts on global health, social welfare, and economic systems.


Helbing’s publication illustrates how cascade effects and complex dynamics amplify the vulnerability of networked systems. For example, just a few long-distance connections can greatly reduce our ability to mitigate the threats posed by global pandemics. Initially beneficial trends, such as globalization, increasing network densities, higher complexity, and an acceleration of institutional decision processes, may ultimately push human-made or human-influenced systems towards systemic instability, Helbing finds. Systemic instability refers to a system that will sooner or later get out of control, even if everybody involved is well skilled, highly motivated, and behaving properly. Crowd disasters are shocking examples illustrating that many deaths may occur even when everybody tries hard not to hurt anyone.

Our Intuition of Systemic Risks Is Misleading

Networking system components that are well-behaved in isolation may create counter-intuitive emergent system behaviors that are not well-behaved at all. For example, cooperative behavior might unexpectedly break down as the connectivity of interaction partners grows. “Applying this to the global network of banks, this might actually have caused the financial meltdown in 2008,” believes Helbing.

Globally networked risks are difficult to identify, map and understand, since there are often no evident, unique cause-effect relationships. Failure rates may change depending on the random path taken by the system, with the consequence of increasing risks as cascade failures progress, thereby decreasing the capacity of the system to recover. “In certain cases, cascade effects might reach any size, and the damage might be practically unbounded,” says Helbing. “This is quite disturbing and hard to imagine.” All of these features make strongly coupled, complex systems difficult to predict and control, such that our attempts to manage them go astray.
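The role of long-distance connections in accelerating cascades can be illustrated with a toy simulation (an illustration of the general idea, not a model from Helbing's paper): a single failure spreading over a ring network, where a handful of hypothetical random shortcuts sharply shortens the time to total breakdown.

```python
import random

def cascade_steps(n, shortcuts, seed=0):
    """Simulate a failure cascade on a ring of n nodes, optionally
    augmented with a few random long-range links (a toy small-world
    network). Returns the number of steps until every node has failed."""
    rng = random.Random(seed)
    # Each node is linked to its two ring neighbours...
    neighbours = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    # ...plus a handful of random long-distance connections.
    for _ in range(shortcuts):
        a, b = rng.sample(range(n), 2)
        neighbours[a].add(b)
        neighbours[b].add(a)
    failed = {0}               # a single initial failure
    steps = 0
    while len(failed) < n:
        # every neighbour of a failed node fails in the next step
        failed |= {m for f in failed for m in neighbours[f]}
        steps += 1
    return steps

# On a pure ring the failure creeps outward node by node;
# a few shortcuts let it leap across the network.
print(cascade_steps(100, shortcuts=0))   # 50 steps on the bare ring
print(cascade_steps(100, shortcuts=10))  # typically far fewer
```

Adding edges can only speed the spread, never slow it, which is why a handful of shortcuts is enough to undermine containment strategies that assume purely local propagation.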

“Take the financial system,” says Helbing. “The financial crisis hit regulators by surprise.” But back in 2003, the legendary investor Warren Buffett warned of mega-catastrophic risks created by large-scale investments into financial derivatives. It took five years until the “investment time bomb” exploded, causing losses of trillions of dollars to our economy. “The financial architecture is not properly designed,” concludes Helbing. “The system lacks breaking points, as we have them in our electrical system.” This allows local problems to spread globally, thereby reaching catastrophic dimensions.

A Global Ticking Time Bomb?

Have we unintentionally created a global time bomb? If so, what kinds of global catastrophic scenarios might humans face in complex societies? A collapse of the world economy or of our information and communication systems? Global pandemics? Unsustainable growth or environmental change? A global food or energy crisis? A cultural clash or global-scale conflict? Or will we face a combination of these contagious phenomena — a scenario that the World Economic Forum calls the “perfect storm”?

“While analyzing such global risks,” says Helbing, “one must bear in mind that the propagation speed of destructive cascade effects might be slow, but nevertheless hard to stop. It is time to recognize that crowd disasters, conflicts, revolutions, wars, and financial crises are the undesired result of operating socio-economic systems in the wrong parameter range, where systems are unstable.” In the past, these social problems seemed to be puzzling, unrelated, and almost “God-given” phenomena one had to live with. Nowadays, thanks to new complexity science models and large-scale data sets (“Big Data”), one can analyze and understand the underlying mechanisms, which let complex systems get out of control.

Disasters should not be considered “bad luck.” They are a result of inappropriate interactions and institutional settings, caused by humans. Even worse, they are often the consequence of a flawed understanding of counter-intuitive system behaviors. “For example, it is surprising that we didn’t have sufficient precautions against a financial crisis and well-elaborated contingency plans,” states Helbing. “Perhaps, this is because there should not be any bubbles and crashes according to the predominant theoretical paradigm of efficient markets.” Conventional thinking can cause fateful decisions and the repetition of previous mistakes. “In other words: While we want to do the right thing, we often do wrong things,” concludes Helbing. This obviously calls for a paradigm shift in our thinking. “For example, we may try to promote innovation, but suffer economic decline, because innovation requires diversity more than homogenization.”

Global Networks Must Be Re-Designed

Helbing’s publication explores why today’s risk analysis falls short. “Predictability and controllability are design issues,” stresses Helbing. “And uncertainty, which means the impossibility to determine the likelihood and expected size of damage, is often man-made.” Many systems could be better managed with real-time data. These would allow one to avoid delayed response and to enhance the transparency, understanding, and adaptive control of systems. However, even all the data in the world cannot compensate for ill-designed systems such as the current financial system. Such systems will sooner or later get out of control, causing catastrophic human-made failure. Therefore, a re-design of such systems is urgently needed.

Helbing’s Nature paper on “Globally Networked Risks” also calls attention to strategies that make systems more resilient, i.e. able to recover from shocks. For example, setting up backup systems (e.g. a parallel financial system), limiting the system size and connectivity, building in breaking points to stop cascade effects, or reducing complexity may be used to improve resilience. In the case of financial systems, there is still much work to be done to fully incorporate these principles.

Contemporary information and communication technologies (ICT) are also far from being failure-proof. They are based on principles that are 30 or more years old and not designed for today’s use. The explosion of cyber risks is a logical consequence. This includes threats to individuals (such as privacy intrusion, identity theft, or manipulation through personalized information), to companies (such as cybercrime), and to societies (such as cyberwar or totalitarian control). To counter this, Helbing recommends an entirely new ICT architecture inspired by principles of decentralized self-organization as observed in immune systems, ecology, and social systems.

Coming Era of Social Innovation

A better understanding of the success principles of societies is urgently needed. “For example, when systems become too complex, they cannot be effectively managed top-down,” explains Helbing. “Guided self-organization is a promising alternative to manage complex dynamical systems bottom-up, in a decentralized way.” The underlying idea is to exploit, rather than fight, the inherent tendency of complex systems to self-organize and thereby create a robust, ordered state. For this, it is important to have the right kinds of interactions, adaptive feedback mechanisms, and institutional settings, i.e. to establish proper “rules of the game.” The paper offers the example of an intriguing “self-control” principle, where traffic lights are controlled bottom-up by the vehicle flows rather than top-down by a traffic center.
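The bottom-up traffic-light idea can be sketched in a toy model. All numbers and the switching rule below are hypothetical, chosen only to show the principle of a purely local decision with no central controller:

```python
def self_control_step(queues, green, arrivals, service=3, switch_threshold=2):
    """One update of a decentralized traffic signal (toy model).
    queues: dict mapping approach -> waiting vehicles; green: approach served.
    The light stays green unless another approach's queue exceeds the
    current one by switch_threshold: a simple local rule, no traffic center."""
    # vehicles arrive on every approach
    for road, k in arrivals.items():
        queues[road] += k
    # switch only if some other approach's pressure is clearly higher
    busiest = max(queues, key=queues.get)
    if queues[busiest] - queues[green] >= switch_threshold:
        green = busiest
    # the green approach discharges up to `service` vehicles
    queues[green] = max(0, queues[green] - service)
    return green

# A queue builds up on the north approach while east holds the green;
# the local rule hands the green phase to the busier road.
queues = {"north": 4, "east": 0}
green = self_control_step(queues, "east", {"north": 1, "east": 1})
print(green, queues)  # north {'north': 2, 'east': 1}
```

The threshold keeps the light from flip-flopping on small differences, a crude stand-in for the adaptive feedback mechanisms the paragraph describes.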

Creating and Protecting Social Capital

“One man’s disaster is another man’s opportunity. Therefore, many problems can only be successfully addressed with transparency, accountability, awareness, and collective responsibility,” underlines Helbing. Moreover, social capital such as cooperativeness or trust is important for economic value generation, social well-being and societal resilience, but it may be damaged or exploited. “Humans must learn how to quantify and protect social capital. A warning example is the loss of trillions of dollars in the stock markets during the financial crisis.” This crisis was largely caused by a loss of trust. “It is important to stress that risk insurances today do not consider damage to social capital,” Helbing continues. However, it is known that large-scale disasters have a disproportionate public impact, in part because they destroy social capital. As we neglect social capital in risk assessments, we are taking excessive risks.

Journal Reference:

  1. Dirk Helbing. Globally networked risks and how to respond. Nature, 2013; 497 (7447): 51. DOI: 10.1038/nature12047

European carbon market in trouble (Washington Post)

Published: May 5

LONDON — As the centerpiece of Europe’s pledge to lead the global battle against climate change, the region’s market for carbon emissions effectively turned pollution into a commodity that could be traded like gold or oil. But the once-thriving pollution trade here has turned into a carbon bust.

Under the system, 31 nations slapped emission limits on more than 11,000 companies and issued carbon credits that could be traded by firms to meet their new pollution caps. More efficient ones could sell excess carbon credits, while less efficient ones were compelled to buy more. By August 2008, the price for carbon emission credits had soared above $40 per ton — high enough to become an added incentive for some companies to increase their use of cleaner fuels, upgrade equipment and take other steps to reduce carbon footprints.

Europe’s carbon-trading market

That system, however, is in deep trouble. A drastic drop in industrial activity has sharply reduced the need for companies to buy emission rights, causing a gradual fall in the price of carbon allowances since the region slipped into a multi-year economic crisis in the latter half of 2008. In recent weeks, the price appears to have collapsed entirely — falling below $4 as bickering European nations failed to agree on measures to shore up the program.

The collapsing price of carbon in Europe is darkening the outlook for a greener future in a part of the world that was long the bright spot in the struggle against climate change. It is also presenting new challenges for those who once saw Europe’s program as the natural anchor for what would eventually be a linked network of cap-and-trade systems worldwide.

Carbon “started as the commodity of the future, but it has now deteriorated,” said Matthew Gray, a trader at Jefferies Bache in London and one of a diminishing breed of carbon dealers in Europe. “Its future is uncertain.”

The problems plaguing Europe’s cap-and-trade system underscore the uphill battle for international cooperation in the global-warming fight. After middling progress at various summits, officials from more than 190 countries have been charged with forging a global accord by 2015 aimed at cutting carbon emissions. But critics point to the inability of even the European Union — a largely progressive region bound by open borders and a shared bureaucracy — to come together on a fix for its cap-and-trade system as evidence of how difficult consensus building on climate change has become.

Negotiations to launch a similar system across the United States collapsed in 2010, replaced with a regional approach in which California, for instance, moved forward with its own program. Aided by a boom in cheaper and cleaner shale gas as well as the spread of more renewable energies, including wind and solar, the United States has — like Europe — nevertheless seen a continuing drop in its overall emission levels.

But there are also signs that years of increasing investment in clean energies are ebbing on both sides of the Atlantic. In 2012, overall clean-energy investment in the United States fell 37 percent, to $35.6 billion, compared with a year earlier, according to a new report by the Pew Charitable Trusts. European countries, including green leaders such as Germany, also saw declines, leading analysts to call the problems with the region’s cap-and-trade system that much more troubling.

“Obviously, what’s happening now is very disheartening for people who have been involved in trying to cut carbon emissions,” said Agustin Silvani, managing director of carbon finance at Conservation International in Arlington, Va. “The European system was at the center of the global fight, and the fact that it is collapsing is definitely a blow. Maybe a moral one more than anything else.”

Lost incentive

The cap-and-trade program is based on a system of carbon allowances for large emitters such as utilities and manufacturers, with some bought and others awarded for free. Companies are allowed to draw on global mitigation projects — such as planting trees in tropical rain forests — to offset a small portion of their emissions. But for the most part, they must meet targets through carbon credits issued by European authorities.

A number of other factors, including mandates and subsidies for renewable energy, have coaxed European companies to reduce their emissions in recent years. But in the early stages of the cap-and-trade program, “higher carbon prices were a big incentive for companies to take action,” said Marcus Ferdinand, senior market analyst for Thomson Reuters Point Carbon. “Now, they’ve lost that incentive.”

At the core of the problem is a massive oversupply of carbon allowances. Demand for carbon began to fade in the late 2000s as a recession set in and factories across Europe dramatically curbed production. But there were also built-in flaws. Unlike newer cap-and-trade programs such as the one in California, Europe’s system never established a price floor that could have prevented a market collapse. In addition, too many free allowances were given to too many companies. Some, in fact, never had to pay for allowances at all, allowing them to hoard the credits or even sell them at a profit.
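The oversupply arithmetic can be made concrete with a minimal sketch. The figures are invented for illustration, not actual EU data: caps set against pre-recession forecasts keep issuing allowances while actual emissions fall, so unused permits accumulate year after year.

```python
def allowance_surplus(issued, emissions):
    """Cumulative surplus of carbon allowances over several years.
    issued: allowances handed out or auctioned each year;
    emissions: what firms actually had to surrender each year."""
    surplus = 0
    for cap, emitted in zip(issued, emissions):
        surplus += cap - emitted   # unused permits are banked, not retired
    return surplus

# Illustrative numbers only: a flat cap set before the recession,
# emissions falling as factories cut production.
issued    = [100, 100, 100, 100]
emissions = [ 98,  90,  85,  80]
print(allowance_surplus(issued, emissions))  # 47 units banked
```

Because banked permits carry over, every year of weak demand adds to the overhang, which is why a one-off price dip turned into a persistent glut.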

On April 16, the European Parliament was on the verge of temporarily tightening the supply of allowances to boost the price of carbon and shore up the ailing market. But opposition by countries led by Poland — a nation strongly dependent on heavy-emitting coal power plants — defeated the measure. The rejection sent the price of carbon plummeting to a historic low of roughly $3.60.

Shoring up prices

A bright future for cap-and-trade systems may yet exist. Promising new programs, for instance, are being rolled out in California, Australia, Quebec and a few provinces in China, with officials in some areas setting a minimum price for carbon credits to prevent the kind of market collapse seen in Europe.

But if Europe is unable to shore up the price for carbon credits here, observers say, it could complicate hopes down the line of linking various programs together. The price per ton in California, for instance, is above $10 — about two and a half times the price in Europe.

Large emitters such as the steel industry, however, say the system is working just fine. With a price determined by supply and demand, industry groups say, it is only fitting for the price to be low now. Also, given the region’s weaker economic activity, they note that the European Union is still virtually assured of meeting its pledge to cut carbon emissions — a reduction of 20 percent by 2020 compared with 1990 levels — even with the cap-and-trade system faltering.

Yet critics argue that the low price of carbon has removed the incentive for European companies to reduce their carbon footprints. They point to a boom in the use of cheap imported American coal in European power plants. In addition, many fear that the lack of an incentive to make more green upgrades will create a boom in emissions if and when European economies recover.

As the regional plan falters, some countries are going it alone on domestic initiatives. This year, for instance, Britain introduced a carbon tax on emissions that British manufacturers say has put them at a competitive disadvantage with their counterparts on the continent. It suggests the potential pitfalls ahead as countries and even smaller jurisdictions such as states, provinces and cities introduce a disparate patchwork of climate-change measures.

Optimists point to hope that the European Parliament will once again vote on a measure to tighten the supply of carbon credits in the coming months, thus shoring up the price. They also note that the European Commission is studying more ambitious proposals for a bigger overhaul of the region’s cap-and-trade system.

But given the growing resistance in some European countries to anything that might drive energy costs up further, others wonder whether Europe’s leaders still have the political will to take aggressive action.

“We’re risking the credibility of European politicians by not fixing this system,” said Johannes Teyssen, chief executive of German energy giant E.ON. “How can they travel to world climate-change conferences claiming others should do more when our own system is on its deathbed and they do nothing?”

Eliza Mackintosh contributed to this report.

*   *   *

In Europe, Paid Permits for Pollution Are Fizzling (N.Y.Times)

Andrew Testa for The International Herald Tribune. The trading floor at CF Partners in West London. The market for carbon permits is more volatile than its founders envisioned.

By STANLEY REED and MARK SCOTT

Published: April 21, 2013

LONDON — On a showery afternoon last week in West London, a ripple of enthusiasm went through the trading floor of CF Partners, a privately owned financial company. The price of carbon allowances, shown in green lights on a board hanging from the ceiling, was creeping up toward three euros.

*The Emissions Trading System began with a test phase that ended in 2007. Note: Data are for the futures contract expiring in mid-December each year. Phase 2 price was initially for the December 2008 futures contract.

That is pretty small change — $3.90, or only about 10 percent of what the price was in 2008. But to the traders it came as a relief after the market had gone into free fall to record lows two days earlier, after the European Parliament spurned an effort to shore up prices by shrinking the number of allowances.

“The market still stands,” said Thomas Rassmuson, a native of Sweden who founded the company with Jonathan Navon, a Briton, in 2006.

Still, Europe’s carbon market, a pioneering effort to use markets to regulate greenhouse gases, is having a hard time staying upright. This year has been stomach-churning for the people who make their living in the arcane world of trading emissions permits. The most recent volatility comes on top of years of uncertainty during which prices have fluctuated from $40 to nearly zero for the right to emit one ton of carbon dioxide.

More important, though, than lost jobs and diminished payouts for traders and bankers, the penny-ante price of carbon credits means the market is not doing its job: pushing polluters to reduce carbon emissions, which most climate scientists believe contribute to global warming.

The market for these credits, officially called European Union Allowances, or E.U.A.’s, has been both unstable and under sharp downward pressure this year because of a huge oversupply and a stream of bad political and economic news. On April 16, for instance, after the European Parliament voted down the proposed reduction in the number of credits, prices dropped about 50 percent, to 2.63 euros from nearly 5, in 10 minutes.

“No one was going to buy” on the way down, said Fred Payne, a trader with CF Partners.

Europe’s troubled experience with carbon trading has also discouraged efforts to establish large-scale carbon trading systems in other countries, including the United States, although California and a group of Northeastern states have set up smaller regional markets.

Traders do not mind big price swings in any market — in fact, they can make a lot of money if they play them right.

But over time, the declining prices for the credits have sapped the European market of value, legitimacy and liquidity — the ease with which the allowances can be traded — making it less attractive for financial professionals.

A few years ago, analysts thought world carbon markets were heading for the $2 trillion mark by the end of this decade.

Today, the reality looks much more modest. Total trading last year was 62 billion euros, down from 96 billion in 2011, according to Thomson Reuters Point Carbon, a market research firm based in Oslo. Close to 90 percent of that activity was in Europe, while North American trading represented less than 1 percent of worldwide market value.

Financial institutions that had rushed to increase staff have shrunk their carbon desks. Companies have also laid off other professionals who helped set up greenhouse gas reduction projects in developing countries like China and India.

When the emissions trading system was started in 2005, the goal was to create a global model for raising the costs of emitting greenhouse gases and for prodding industrial polluters to switch from burning fossil fuels to using clean-energy alternatives like wind and solar.

When carbon prices hit their highs of more than 30 euros in 2008 and companies spent billions to invest in renewables, policy makers hailed the market as a success. But then prices began to fall. And at current levels, they are far too low to change companies’ behaviors, analysts say. Emitting a ton of carbon dioxide costs about the same as a hamburger.

“At the moment, the carbon price does not give any signal for investment,” said Hans Bünting, chief executive of RWE, one of the largest utilities in Germany and Europe.

The cap-and-trade system in Europe places a ceiling on emissions. At the end of each year, companies like electric utilities or steel manufacturers must hand over to the national authorities permits equivalent to the amount of gases they emitted.

Until the end of 2012, these credits were given to companies free of charge, according to their estimated output of greenhouse gases. Policy makers wanted to jump-start the trading market and avoid higher costs for consumers.

Beginning this year, energy companies must buy an increasing proportion of their credits in national auctions. Industrial companies like steel plants will follow later this decade.

Companies and other financial players like banks and hedge funds can also acquire and trade the allowances on exchanges like the IntercontinentalExchange, based in Atlanta. Over time the number of credits is meant to fall gradually, theoretically raising prices and cutting pollution.

The reality has been far different because of serious flaws in the design of the system. To win over companies and skeptical, coal-dependent countries like Poland, far too many credits have been handed out.

At the same time, Europe’s debilitating economic slowdown has sharply curtailed industrial activity and reduced the Continent’s overall carbon emissions.

Steel making in Europe, for instance, has fallen about 30 percent since 2007, while new car registrations were at their lowest level last year since 1995.

Big investments in renewable energy sources like wind and solar also reduced carbon emissions, which have fallen about 10 percent in Europe since 2007.

As a result, there is a vast surplus of permits — about 800 million tons’ worth, according to Point Carbon. That has caused prices to plunge.

The cost of carbon is far too low to force electric utilities in Europe to switch from burning coal, a major polluter, to much cleaner natural gas. Just the opposite: Britain increased coal burning for electricity more than 30 percent last year, while cutting back gas use a similar amount, and other West European nations increased their coal use as well.

“The European energy scene is not a good one,” said Andrew Brown, head of exploration and production at Royal Dutch Shell. “They haven’t got the right balance in terms of promoting gas.”

Fearing that prices might go to zero because of the huge oversupply, the European authorities proposed a short-term solution known as backloading, which would have delayed the scheduled auctioning of a large portion of the credits that were supposed to be sold over the next three years. But the European Parliament in Strasbourg voted the measure down on April 16.

Lawmakers were worried about tampering with the market as well as doing anything that might increase energy costs in the struggling economy.

“It was the worst possible moment to try to implement something like that,” said Francesco Starace, chief executive of Enel Green Power, one of the largest European green-energy companies, which is based in Rome.

The European authorities, led by Connie Hedegaard, the European commissioner for climate change, have not given up on fixing the system. But analysts like Stig Scholset, at Point Carbon, say that there is not much the authorities can do in the short term and that prices may slump for months, if not years.

That means more tough times for financial institutions. Particularly troubled is the business of investing in greenhouse gas abatement projects like wind farms or hydroelectric dams in developing countries like China. JPMorgan Chase paid more than $200 million for one of the largest investors in these projects, EcoSecurities, in 2009.

Financiers say these projects used to be gold mines, generating credits that industrial companies could use to offset their emissions elsewhere. But so many credits have been produced by these projects — on top of the existing oversupply of credits in Europe — that they are trading at about a third of a euro.

Market participants say they see many rivals pulling back from world carbon markets. Deutsche Bank, the largest bank in Germany, has cut back its carbon trading. Smaller outfits like Mabanaft, based in Rotterdam, have also left the business.

Anthony Hobley, a lawyer in London and president of the Climate Market and Investors Association, an industry group, estimates that among the traders, analysts and bankers who flocked to the carbon markets in the early days, half may now be gone.

But carbon trading is unlikely to fade completely.

For one thing, European utilities and other companies now must buy the credits to comply with the rules. And they can buy credits to save for later use, when their emissions increase and the price of credits rises.

Despite Europe’s sputters, carbon trading is beginning to gain traction in places like China, Australia and New Zealand.

In London, Mr. Rassmuson concedes that the business has turned out to be more up-and-down than he anticipated when he and his partner set up their firm in a tiny two-man office in 2006.

But he said his firm was benefiting from others’ dropping out. He is also branching out into trading electric power and natural gas.

Like many in the carbon markets, he says what he is doing is not just about money.

“Trying to make the world more sustainable is important to us,” he said. “It is a good business opportunity that makes us proud.”

A version of this article appeared in print on April 22, 2013, on page B1 of the New York edition with the headline: In Europe, Paid Permits For Pollution Are Fizzling.

Hydroelectric Plants May Affect the Pantanal's Hydrological System (Fapesp)

A plan to build 87 more small hydroelectric plants in the Upper Paraguay Basin could affect the connectivity between the plateau area and the floodplain of the Pantanal biome and hinder the migratory flow of fish and other aquatic species, researchers warn (Walfrido Tomas)

April 23, 2013

By Elton Alisson

Agência FAPESP – The plan to build 87 more Small Hydroelectric Plants (PCHs) in the Upper Paraguay Basin, currently under discussion, could affect the connectivity between the plateau – where the Paraguay River and its tributaries rise – and the flooded plain of the Pantanal – through which the waters of those rivers drain – hindering the migratory flow of fish and other aquatic and semi-aquatic species through the hydrological system.

The warning was issued by researchers during the third event of the 2013 Conference Cycle of BIOTA Educação, which focused on the Pantanal. The event was held by the BIOTA-FAPESP program on April 18 at FAPESP headquarters.

According to José Sabino, a professor at Universidade Anhanguera-Uniderp, the impact of the PCHs already operating in the Upper Paraguay Basin region is not very large because they are generally based on run-of-river technology, which does without large water reservoirs.

The combination of the roughly 30 existing PCHs with the 87 planned ones, however, could affect the hydrology and the connectivity between the waters of the plateau and the plain of the Upper Paraguay Basin and hinder the migratory processes of Pantanal fish species, the specialist warned.

“The creation of these PCHs could break the hydrological connectivity of populations and disrupt reproductive migrations, such as the piracema, of some fish species,” said Sabino.

During the piracema, the spawning period that precedes the summer rains, some fish species, such as the curimbatá (Prochilodus lineatus) and the dourado (Salminus brasiliensis), swim upriver to the headwaters to spawn.

If access to the river headwaters is blocked by an obstacle such as a PCH, the piracema may be hindered. “Building more PCHs in the Pantanal region could have a systemic influence on the channel because, besides changing the hydrological functioning, it should also alter the nutrient load carried by the waters from the river headwaters on the plateau into the Pantanal plain,” said Walfrido Moraes Tomas, a researcher at the Pantanal Agricultural Research Center (CPAP) of the Brazilian Agricultural Research Corporation (Embrapa) in Mato Grosso do Sul, a speaker at the FAPESP conference.

“This could also affect the habitats of aquatic and semi-aquatic species,” Tomas added. According to the researcher, the Pantanal is one of the most species-rich wetlands in the world, with species distributed abundantly, but not homogeneously, across the Pantanal plain.

Some of the most recent species surveys indicate that the biome harbors 269 species of fish, 44 of amphibians, 127 of reptiles, 582 of birds and 152 of mammals.

More species inventories are needed, however, to fill critical knowledge gaps about other groups, such as the invertebrates – for which there is still no survey of species numbers – as well as crustaceans, mollusks and lepidopterans (the insect order that includes butterflies), which remain poorly known.

“One initiative that will make a major contribution in this regard is the Biota Mato Grosso do Sul program, which began to be implemented three years ago,” said Tomas.

Inspired by BIOTA-FAPESP, the Biota Mato Grosso do Sul program aims to consolidate the infrastructure of collections and holdings in museums, herbaria, botanical gardens, zoos and germplasm banks in Mato Grosso do Sul in order to fill taxonomic and geographic knowledge gaps about the state's biological diversity.

To achieve this goal, researchers intend to digitize the scientific collections and holdings and establish a biodiversity information network among all the institutions involved in biodiversity research and conservation in Mato Grosso do Sul.

“We are now beginning the first species inventories in key regions of the state, and we are preparing a special issue of the journal Biota Neotropica on the biodiversity of Mato Grosso do Sul, which will be a fundamental step toward assessing the available information on the Pantanal's biota and directing our actions,” Tomas told Agência FAPESP.

“Unlike the State of São Paulo, which has enormous collections, Mato Grosso do Sul does not have large collections from which to map diversity. That is why we will need to go into the field to carry out the inventories,” he explained.

Espécies ameaçadas

Segundo Tomas, das espécies de aves ameaçadas, vulneráveis ou em perigo de extinção no Brasil, por exemplo, 188 podem ser encontradas no Pantanal. No entanto, diminuiu muito nos últimos anos a ocorrência de caça de espécies como onça-pintada, onça-parda, ariranha, arara-azul – ave símbolo do Pantanal – e jacaré.

E não há indícios de que a principal atividade econômica da região – a pecuária, que possibilitou a ocupação humana do bioma em um primeiro momento em razão de o ambiente ser uma savana inundada com pastagem renovada todo ano – tenha causado impactos na biota pantaneira.

“Pelo que sabemos até agora, nenhuma espécie da fauna do Pantanal foi levada a risco de extinção por causa da pecuária”, afirmou Tomas. Já a pesca – a segunda atividade econômica mais intensiva no Pantanal – pode ter impactos sobre algumas espécies de peixes.

Isso porque a atividade está focalizada em 20 das 270 espécies de peixes do bioma pantaneiro, em razão do tamanho, sabor da carne e pela própria cultura regional.

Entre elas, estão o dourado, o curimbatá, a piraputanga (Brycon hilarii), o pacu (Piaractus mesopotamicus) e a cachara (Pseudoplatystoma reticulatum) – um peixe arisco encontrado em rios como Prata e Olho D’água, que pode chegar a medir 1,20 metro e pesar 40 quilos.

“Há indícios de que, pelo fato de a pesca no Pantanal ser direcionada a algumas espécies, a atividade possa reduzir algumas populações de peixes”, disse Sabino.

Além de Sabino e Tomas, o professor Arnildo Pott, da Universidade Federal de Mato Grosso do Sul (UFMS), de Campo Grande, também proferiu palestra, sobre a origem, evolução e diversidade da vegetação do Bioma Pantanal.

Estratégias de conservação

Os pesquisadores também chamaram a atenção para o fato de que, atualmente, apenas cerca de 5% do Pantanal está protegido por unidades de conservação. E que muitas das espécies de animais da região, como a onça- pintada, a ariranha e a arara-azul, por exemplo, não são protegidas efetivamente, porque ficam fora dessas unidades de conservação.

“A conservação de espécies ameaçadas no Pantanal requer estratégias mais amplas do que apenas a implantação ou gestão das unidades de conservação”, destacou Tomas. “São necessárias políticas de gestão de bacias hidrográficas e de remuneração por serviços ecossistêmicos para assegurar a conservação de espécies ameaçadas.”

Organizado pelo Programa BIOTA-FAPESP, o Ciclo de Conferências 2013 tem o objetivo de contribuir para o aperfeiçoamento do ensino de ciência. A quarta etapa será no dia 16 de maio, quando o tema será “Bioma Cerrado”. Seguem-se conferências sobre os biomas Caatinga (20 de junho), Mata Atlântica (22 de agosto), Amazônia (19 de setembro), Ambientes Marinhos e Costeiros (24 de outubro) e Biodiversidade em Ambientes Antrópicos – Urbanos e Rurais (21 de novembro).

Earth’s Current Warmth Not Seen in the Last 1,400 Years or More, Says Study (Science Daily)

Apr. 21, 2013 — Fueled by industrial greenhouse gas emissions, Earth’s climate warmed more between 1971 and 2000 than during any other three-decade interval in the last 1,400 years, according to new regional temperature reconstructions covering all seven continents. This period of human-made global warming, which continues today, reversed a natural cooling trend that lasted several hundred years, according to results published in the journal Nature Geoscience by more than 80 scientists from 24 nations analyzing climate data from tree rings, pollen, cave formations, ice cores, lake and ocean sediments, and historical records from around the world.

During Europe’s 2003 heat wave, July temperatures in France were as much as 18 degrees F hotter than in 2001. (Credit: NASA)

“This paper tells us what we already knew, except in a better, more comprehensive fashion,” said study co-author Edward Cook, a tree-ring scientist at Lamont-Doherty Earth Observatory who led the Asia reconstruction.

The study also found that Europe’s 2003 heat wave and drought, which killed an estimated 70,000 people, happened during Europe’s hottest summer of the last 2,000 years. “Summer temperatures were intense that year and accompanied by a lack of rain and very dry soil conditions over much of Europe,” said study co-author Jason Smerdon, a climate scientist at Lamont-Doherty and one of the lead contributors to the Europe reconstruction. Though summer 2003 set a record for Europe, global warming was only one of the factors that contributed to the temperature conditions that summer, he said.

The study is the latest to show that the Medieval Warm Period, from about 950 to 1250, may not have been global, and may not have happened at the same time in places that did grow warmer. While parts of Europe and North America were fairly warm between 950 and 1250, South America stayed relatively cold, the study says. Some people have argued that the natural warming that occurred during the medieval ages is happening today, and that humans are not responsible for modern day global warming. Scientists are nearly unanimous in their disagreement. “If we went into another Medieval Warm Period again that extra warmth would be added on top of warming from greenhouse gases,” said Cook.

Temperatures varied less between continents in the same hemisphere than between hemispheres. “Distinctive periods, such as the Medieval Warm Period or the Little Ice Age stand out, but do not show a globally uniform pattern,” said co-author Heinz Wanner, a scientist at the University of Bern. By 1500, temperatures dropped below the long-term average everywhere, though colder temperatures emerged several decades earlier in the Arctic, Europe and Asia.

The most consistent trend across all regions in the last 2,000 years was a long-term cooling, likely caused by a rise in volcanic activity, decrease in solar irradiance, changes in land-surface vegetation, and slow variations in Earth’s orbit. With the exception of Antarctica, cooling tapered off at the end of the 19th century, with the onset of industrialization. Cooler 30-year periods between 830 and 1910 were particularly pronounced during weak solar activity and strong tropical volcanic eruptions. Both phenomena often occurred simultaneously and led to a drop in the average temperature during five distinct 30- to 90-year intervals between 1251 and 1820. Warming in the 20th century was on average twice as large in the northern continents as it was in the Southern Hemisphere. During the past 2000 years, some regions experienced warmer 30-year intervals than during the late 20th century. For example, in Europe the years between 21 and 80 AD were likely warmer than the period 1971-2000.

Mathematical Models Out-Perform Doctors in Predicting Cancer Patients’ Responses to Treatment (Science Daily)

Apr. 19, 2013 — Mathematical prediction models are better than doctors at predicting the outcomes and responses of lung cancer patients to treatment, according to new research presented today (Saturday) at the 2nd Forum of the European Society for Radiotherapy and Oncology (ESTRO).

These differences apply even after the doctor has seen the patient, which can provide extra information, and knows what the treatment plan and radiation dose will be.

“The number of treatment options available for lung cancer patients is increasing, as well as the amount of information available to the individual patient. It is evident that this will complicate the task of the doctor in the future,” said the presenter, Dr Cary Oberije, a postdoctoral researcher at the MAASTRO Clinic, Maastricht University Medical Center, Maastricht, The Netherlands. “If models based on patient, tumour and treatment characteristics already out-perform the doctors, then it is unethical to make treatment decisions based solely on the doctors’ opinions. We believe models should be implemented in clinical practice to guide decisions.”

Dr Oberije and her colleagues in The Netherlands used mathematical prediction models that had already been tested and published. The models use information from previous patients to create a statistical formula that can be used to predict the probability of outcome and responses to treatment using radiotherapy with or without chemotherapy for future patients.

Having obtained predictions from the mathematical models, the researchers asked experienced radiation oncologists to predict the likelihood of lung cancer patients surviving for two years, or suffering from shortness of breath (dyspnea) and difficulty swallowing (dysphagia) at two points in time:

1) after they had seen the patient for the first time, and

2) after the treatment plan was made.

At the first time point, the doctors predicted two-year survival for 121 patients, dyspnea for 139 and dysphagia for 146 patients. At the second time point, predictions were only available for 35, 39 and 41 patients respectively.

For all three predictions and at both time points, the mathematical models substantially outperformed the doctors’ predictions, with the doctors’ predictions being little better than those expected by chance.

The researchers plotted the results on a special graph [1] on which the area below the plotted line is used for measuring the accuracy of predictions; 1 represents a perfect prediction, while 0.5 represents predictions that were right in 50% of cases, i.e. the same as chance. They found that the model predictions at the first time point were 0.71 for two-year survival, 0.76 for dyspnea and 0.72 for dysphagia. In contrast, the doctors’ predictions were 0.56, 0.59 and 0.52 respectively.
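That accuracy measure is the area under the ROC curve mentioned in footnote [1]. As a rough sketch of what those numbers mean, with invented predicted probabilities and outcomes rather than the study’s data, the AUC can be computed as the probability that a randomly chosen positive case is scored above a randomly chosen negative one:

```python
# Toy illustration of the AUC statistic reported in the study.
# The scores and outcomes below are invented, NOT data from the paper.

def roc_auc(y_true, y_score):
    """AUC equals the probability that a randomly chosen positive case
    receives a higher score than a randomly chosen negative one
    (ties count as 0.5)."""
    pos = [s for s, y in zip(y_score, y_true) if y == 1]
    neg = [s for s, y in zip(y_score, y_true) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y_true = [1, 0, 1, 1, 0, 0, 1, 0]                       # 1 = event occurred
model = [0.9, 0.2, 0.7, 0.4, 0.45, 0.3, 0.8, 0.5]       # discriminates well
random_like = [0.5, 0.4, 0.6, 0.5, 0.6, 0.4, 0.5, 0.6]  # barely beats chance

print(roc_auc(y_true, model))        # 0.875
print(roc_auc(y_true, random_like))  # 0.5625
```

A score near 1 means the model almost always ranks patients who had the event above those who did not; a score near 0.5, like the doctors’ predictions, is essentially a coin flip.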

The models had a better positive predictive value (PPV) — a measure of the proportion of patients who were correctly assessed as being at risk of dying within two years or suffering from dyspnea and dysphagia — than the doctors. The negative predictive value (NPV) — a measure of the proportion of patients that would not die within two years or suffer from dyspnea and dysphagia — was comparable between the models and the doctors.
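For readers unfamiliar with these measures, a minimal sketch with made-up confusion-matrix counts (the study does not report these numbers):

```python
# Made-up counts for illustration only; not figures from the study.

def ppv(tp, fp):
    """Of patients flagged as high risk, the fraction who truly had the event."""
    return tp / (tp + fp)

def npv(tn, fn):
    """Of patients flagged as low risk, the fraction who truly did not."""
    return tn / (tn + fn)

tp, fp = 30, 10  # flagged high risk: 30 correct, 10 false alarms
tn, fn = 50, 10  # flagged low risk: 50 correct, 10 missed events

print(ppv(tp, fp))            # 0.75
print(round(npv(tn, fn), 2))  # 0.83
```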

“This indicates that the models were better at identifying high risk patients that have a very low chance of surviving or a very high chance of developing severe dyspnea or dysphagia,” said Dr Oberije.

The researchers say that it is important that further research is carried out into how prediction models can be integrated into standard clinical care. In addition, further improvement of the models by incorporating all the latest advances in areas such as genetics, imaging and other factors, is important. This will make it possible to tailor treatment to the individual patient’s biological make-up and tumour type.

“In our opinion, individualised treatment can only succeed if prediction models are used in clinical practice. We have shown that current models already outperform doctors. Therefore, this study can be used as a strong argument in favour of using prediction models and changing current clinical practice,” said Dr Oberije.

“Correct prediction of outcomes is important for several reasons,” she continued. “First, it offers the possibility to discuss treatment options with patients. If survival chances are very low, some patients might opt for a less aggressive treatment with fewer side-effects and better quality of life. Second, it could be used to assess which patients are eligible for a specific clinical trial. Third, correct predictions make it possible to improve and optimise the treatment. Currently, treatment guidelines are applied to the whole lung cancer population, but we know that some patients are cured while others are not and some patients suffer from severe side-effects while others don’t. We know that there are many factors that play a role in the prognosis of patients and prediction models can combine them all.”

At present, prediction models are not used as widely as they could be by doctors. Dr Oberije says there are a number of reasons: some models lack clinical credibility; others have not yet been tested; the models need to be available and easy to use by doctors; and many doctors still think that seeing a patient gives them information that cannot be captured in a model. “Our study shows that it is very unlikely that a doctor can outperform a model,” she concluded.

President of ESTRO, Professor Vincenzo Valentini, a radiation oncologist at the Policlinico Universitario A. Gemelli, Rome, Italy, commented: “The booming growth of biological, imaging and clinical information will challenge the decision capacity of every oncologist. The understanding of the knowledge management sciences is becoming a priority for radiation oncologists in order for them to tailor their choices to cure and care for individual patients.”

[1] For the mathematicians among you, the graph is known as an Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC).

[2] This work was partially funded by grants from the Dutch Cancer Society (KWF), the European Fund for Regional Development (INTERREG/EFRO), and the Center for Translational Molecular Medicine (CTMM).

Carbon bubble will plunge the world into another financial crisis – report (The Guardian)

Trillions of dollars at risk as stock markets inflate value of fossil fuels that may have to remain buried forever, experts warn

Damian Carrington – The Guardian, Friday 19 April 2013

Global stock markets are betting on countries failing to adhere to legally binding carbon emission targets. Photograph: Robert Nickelsberg/Getty Images

The world could be heading for a major economic crisis as stock markets inflate an investment bubble in fossil fuels to the tune of trillions of dollars, according to leading economists.

“The financial crisis has shown what happens when risks accumulate unnoticed,” said Lord (Nicholas) Stern, a professor at the London School of Economics. He said the risk was “very big indeed” and that almost all investors and regulators were failing to address it.

The so-called “carbon bubble” is the result of an over-valuation of oil, coal and gas reserves held by fossil fuel companies. According to a report published on Friday, at least two-thirds of these reserves will have to remain underground if the world is to meet existing internationally agreed targets to avoid the threshold for “dangerous” climate change. If the agreements hold, these reserves will be in effect unburnable and so worthless – leading to massive market losses. But the stock markets are betting on countries’ inaction on climate change.

The stark report is by Stern and the thinktank Carbon Tracker. Their warning is supported by organisations including HSBC, Citi, Standard and Poor’s and the International Energy Agency. The Bank of England has also recognised that a collapse in the value of oil, gas and coal assets as nations tackle global warming is a potential systemic risk to the economy, with London being particularly at risk owing to its huge listings of coal.

Stern said that far from reducing efforts to develop fossil fuels, the top 200 companies spent $674bn (£441bn) in 2012 to find and exploit even more new resources, a sum equivalent to 1% of global GDP, which could end up as “stranded” or valueless assets. Stern’s landmark 2006 report on the economic impact of climate change – commissioned by the then chancellor, Gordon Brown – concluded that spending 1% of GDP would pay for a transition to a clean and sustainable economy.

The world’s governments have agreed to restrict the global temperature rise to 2C, beyond which the impacts become severe and unpredictable. But Stern said the investors clearly did not believe action to curb climate change was going to be taken. “They can’t believe that and also believe that the markets are sensibly valued now.”

“They only believe environmental regulation when they see it,” said James Leaton, from Carbon Tracker and a former PwC consultant. He said short-termism in financial markets was the other major reason for the carbon bubble. “Analysts say you should ride the train until just before it goes off the cliff. Each thinks they are smart enough to get off in time, but not everyone can get out of the door at the same time. That is why you get bubbles and crashes.”

Paul Spedding, an oil and gas analyst at HSBC, said: “The scale of ‘listed’ unburnable carbon revealed in this report is astonishing. This report makes it clear that ‘business as usual’ is not a viable option for the fossil fuel industry in the long term. [The market] is assuming it will get early warning, but my worry is that things often happen suddenly in the oil and gas sector.”

HSBC warned that 40-60% of the market capitalisation of oil and gas companies was at risk from the carbon bubble, with the top 200 fossil fuel companies alone having a current value of $4tn, along with $1.5tn debt.

Lord McFall, who chaired the Commons Treasury select committee for a decade, said: “Despite its devastating scale, the banking crisis was at its heart an avoidable crisis: the threat of significant carbon writedown has the unmistakable characteristics of the same endemic problems.”

The report calculates that the world’s currently indicated fossil fuel reserves equate to 2,860bn tonnes of carbon dioxide, but that just 31% could be burned for an 80% chance of keeping below a 2C temperature rise. For a 50% chance of 2C or less, just 38% could be burned.
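The arithmetic behind those percentages is straightforward. As a back-of-envelope sketch using only the figures quoted above (the precise budgets come from the report, not from this arithmetic):

```python
# Restating the report's figures as quoted in the article.
reserves_gt_co2 = 2860  # indicated fossil fuel reserves, in Gt of CO2
share_80 = 0.31         # burnable share for an 80% chance of staying under 2C
share_50 = 0.38         # burnable share for a 50% chance of staying under 2C

print(round(reserves_gt_co2 * share_80))  # ~887 Gt CO2 burnable
print(round(reserves_gt_co2 * share_50))  # ~1087 Gt CO2 burnable
# Either way, roughly two-thirds of reserves would have to stay in the ground.
```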

Carbon capture and storage technology, which buries emissions underground, can play a role in the future, but even an optimistic scenario which sees 3,800 commercial projects worldwide would allow only an extra 4% of fossil fuel reserves to be burned. There are currently no commercial projects up and running. The normally conservative International Energy Agency has also concluded that a major part of fossil fuel reserves is unburnable.

Citi bank warned investors in Australia’s vast coal industry that little could be done to avoid the future loss of value in the face of action on climate change. “If the unburnable carbon scenario does occur, it is difficult to see how the value of fossil fuel reserves can be maintained, so we see few options for risk mitigation.”

Ratings agencies have expressed concerns, with Standard and Poor’s concluding that the risk could lead to the downgrading of the credit ratings of oil companies within a few years.

Steven Oman, senior vice-president at Moody’s, said: “It behoves us as investors and as a society to know the true cost of something so that intelligent and constructive policy and investment decisions can be made. Too often the true costs are treated as unquantifiable or even ignored.”

Jens Peers, who manages €4bn (£3bn) for Mirova, part of €300bn asset managers Natixis, said: “It is shocking to see the report’s numbers, as they are worse than people realise. The risk is massive, but a lot of asset managers think they have a lot of time. I think they are wrong.” He said a key moment will come in 2015, the date when the world’s governments have pledged to strike a global deal to limit carbon emissions. But he said that fund managers need to move now. If they wait till 2015, “it will be too late for them to take action.”

Pension funds are also concerned. “Every pension fund manager needs to ask themselves have we incorporated climate change and carbon risk into our investment strategy? If the answer is no, they need to start to now,” said Howard Pearce, head of pension fund management at the Environment Agency, which holds £2bn in assets.

Stern and Leaton both point to China as evidence that carbon cuts are likely to be delivered. China’s leaders have said its coal use will peak in the next five years, said Leaton, but this has not been priced in. “I don’t know why the market does not believe China,” he said. “When it says it is going to do something, it usually does.” He said the US and Australia were banking on selling coal to China but that this “doesn’t add up”.

Jeremy Grantham, a billionaire fund manager who oversees $106bn of assets, said his company was on the verge of pulling out of all coal and unconventional fossil fuels, such as oil from tar sands. “The probability of them running into trouble is too high for me to take that risk as an investor.” He said: “If we mean to burn all the coal and any appreciable percentage of the tar sands, or other unconventional oil and gas then we’re cooked. [There are] terrible consequences that we will lay at the door of our grandchildren.”

Politicians Found to Be More Risk-Tolerant Than the General Population (Science Daily)

Apr. 16, 2013 — According to a recent study, the popularly elected members of the German Bundestag are substantially more risk-tolerant than the broader population of Germany. Researchers in the Cluster of Excellence “Languages of Emotion” at Freie Universität Berlin and at DIW Berlin (German Institute for Economic Research) conducted a survey of Bundestag representatives and analyzed data on the general population from the German Socio-Economic Panel Study (SOEP). Results show that risk tolerance is even higher among Bundestag representatives than among self-employed people, who are themselves more risk-tolerant than salaried employees or civil servants. This was true for all areas of risk that were surveyed in the study: automobile driving, financial investments, sports and leisure activities, career, and health. The authors interpret this finding as positive.

The full results of the study were published in German in the SOEPpapers series of the German Institute for Economic Research (DIW Berlin).

The authors of the study, Moritz Hess (University of Mannheim), Prof. Dr. Christian von Scheve (Freie Universität Berlin and DIW Berlin), Prof. Dr. Jürgen Schupp (DIW Berlin and Freie Universität Berlin), and Prof. Dr. Gert G. Wagner (DIW Berlin and Technische Universität Berlin) view the above-average risk tolerance found among Bundestag representatives as positive. According to sociologist and lead author of the study Moritz Hess: “Otherwise, important societal decisions often wouldn’t be made due to the almost incalculable risks involved. This would lead to stagnation and social standstill.” The authors do not interpret the higher risk-tolerance found among politicians as a threat to democracy. “The results show a successful and sensible division of labor among citizens, voters, and politicians,” says economist Gert G. Wagner. Democratic structures and parliamentary processes, he argues, act as a brake on the individual risk propensity of elected representatives and politicians.

For their study, the research team distributed written questionnaires to all 620 members of the 17th German Bundestag in late 2011. Twenty-eight percent of Bundestag members responded. Comparisons with the statistical characteristics of all current Bundestag representatives showed that the respondents comprise a representative sample of Bundestag members. SOEP data were used to obtain a figure for the risk tolerance of the general population for comparison with the figures for Bundestag members.

The questions posed to Bundestag members were formulated analogously to the questions in the standard SOEP questionnaire. Politicians were asked to rate their own risk tolerance on a scale from zero (= not at all risk-tolerant) to ten (= very risk-tolerant). They rated both their general risk tolerance as well as their specific risk tolerance in the areas of driving, making financial investments, sports and leisure activities, career, health, and trust towards strangers. They also rated their risk tolerance in regard to political decisions. No questions on party affiliation were asked in order to exclude the possibility that results could be used for partisan political purposes.

References:

Hess, M., von Scheve, C., Schupp, J., Wagner. G. G. (2013): Members of German Federal Parliament More Risk-Loving Than General Population, in: DIW Economic Bulletin, Vol. 3, No. 4, 2013, pp. 20-24.

Hess, M., von Scheve, C., Schupp, J., Wagner. G. G. (2013): Sind Politiker risikofreudiger als das Volk? Eine empirische Studie zu Mitgliedern des Deutschen Bundestags, SOEPpaper No. 545, DIW Berlin.

Ocean’s Future Not So Bleak? Resilience Found in Shelled Plants Exposed to Ocean Acidification (Science Daily)

Apr. 12, 2013 — Marine scientists have long understood the detrimental effect of fossil fuel emissions on marine ecosystems. But a group led by a UC Santa Barbara professor has found a point of resilience in a microscopic shelled plant with a massive environmental impact, which suggests the future of ocean life may not be so bleak.

This shows cells of the coccolithophore species Emiliania huxleyi strain NZEH under present-day, left, and future high, right, carbon dioxide conditions. (Credit: UCSB)

As fossil fuel emissions increase, so does the amount of carbon dioxide oceans absorb and dissolve, lowering their pH levels. “As pH declines, there is this concern that marine species that have shells may start dissolving or may have more difficulty making calcium carbonate, the chalky substance that they use to build shells,” said Debora Iglesias-Rodriguez, a professor in UCSB’s Department of Ecology, Evolution and Marine Biology.

Iglesias-Rodriguez and postdoctoral researcher Bethan Jones, who is now at Rutgers University, led a large-scale study on the effects of ocean acidification on these tiny plants that can only be seen under the microscope. Their research, funded by the European Project on Ocean Acidification, is published in the journal PLoS ONE and breaks with traditional notions about the vitality of calcifiers, or creatures that make shells, in future ocean conditions.

“The story years ago was that ocean acidification was going to be bad, really bad for calcifiers,” said Iglesias-Rodriguez, whose team discovered that one species of the tiny single-celled marine coccolithophore, Emiliania huxleyi, actually had bigger shells in high carbon dioxide seawater conditions. While the team acknowledges that calcification tends to decline with acidification, “we now know that there are variable responses in sea corals, in sea urchins, in all shelled organisms that we find in the sea.”

These E. huxleyi are a large army of ocean-regulating shell producers that create oxygen as they process carbon by photosynthesis and fortify the ocean food chain. As one of Earth’s main vaults for environmentally harmful carbon emissions, their survival affects organisms inside and outside the marine system. However, as increasing levels of atmospheric carbon dioxide cause seawater to slide down the pH scale toward acidic levels, this environment could become less hospitable.

The UCSB study incorporated an approach known as shotgun proteomics to uncover how E. huxleyi's biochemistry could change in future high carbon dioxide conditions, which were set at four times the current levels for the study. This approach casts a wider investigative net that looks at all changes and influences in the environment as opposed to looking at individual processes like photosynthesis.

Shotgun proteomics examines the type, abundance, and alterations in proteins to understand how a cell’s machinery is conditioned by ocean acidification. “There is no perfect approach,” said Iglesias-Rodriguez. “They all have their caveats, but we think that this is a way of extracting a lot of information from this system.”

To mirror natural ocean conditions, the team used over half a ton of seawater to grow the E. huxleyi and bubbled in carbon dioxide to recreate both present day and high future carbon levels. It took more than six months for the team to grow enough plants to accumulate and analyze sufficient proteins.

The team found that E. huxleyi cells exposed to higher carbon dioxide conditions were larger and contained more shell than those grown in current conditions. However, they also found that these larger cells grow slower than those under current carbon dioxide conditions. Aside from slower growth, the higher carbon dioxide levels did not seem to affect the cells even at the biochemical level, as measured by the shotgun proteomic approach.

“The E. huxleyi increased the amount of calcite they had because they kept calcifying but slowed down division rates,” said Iglesias-Rodriguez. “You get fewer cells but they look as healthy as those under current ocean conditions, so the shells are not simply dissolving away.”

The team stresses that while representatives of this species seem to have biochemical mechanisms to tolerate even very high levels of carbon dioxide, slower growth could become problematic. If other species grow faster, E. huxleyi could be outnumbered in some areas.

“The cells in this experiment seemed to tolerate future ocean conditions,” said Jones. “However, what will happen to this species in the future is still an open question. Perhaps the grow-slow outcome may end up being their downfall as other species could simply outgrow and replace them.”

Journal Reference:

  1. Bethan M. Jones, M. Debora Iglesias-Rodriguez, Paul J. Skipp, Richard J. Edwards, Mervyn J. Greaves, Jeremy R. Young, Henry Elderfield, C. David O’Connor. Responses of the Emiliania huxleyi Proteome to Ocean Acidification. PLoS ONE, 2013; 8 (4): e61868. DOI: 10.1371/journal.pone.0061868

Carbon Dioxide Removal Can Lower Costs of Climate Protection (Science Daily)

Apr. 12, 2013 — Directly removing CO2 from the air has the potential to alter the costs of climate change mitigation. It could allow greenhouse-gas emissions to continue for longer in sectors like transport that are difficult, and thus expensive, to shift away from fossil fuels. And it may help to constrain the financial burden on future generations, a study now published by the Potsdam Institute for Climate Impact Research (PIK) shows. It focuses on the use of biomass for energy generation, combined with carbon capture and storage (CCS). According to the analysis, carbon dioxide removal could be used under certain requirements to alleviate the most costly components of mitigation, but it would not replace the bulk of actual emissions reductions.

The study focuses on the use of biomass for energy generation, combined with carbon capture and storage. (Credit: © Jürgen Fälchle / Fotolia)

“Carbon dioxide removal from the atmosphere makes it possible to separate emissions control from the time and location of the actual emissions. This flexibility can be important for climate protection,” says lead-author Elmar Kriegler. “You don’t have to prevent emissions in every factory or truck, but could for instance plant grasses that suck CO2 out of the air to grow — and later get processed in bioenergy plants where the CO2 gets stored underground.”

In economic terms, this flexibility lowers costs by compensating for the emissions that would be most costly to eliminate. “This means that a phase-out of global emissions by the end of the century (which we would need to hold the 2-degree line adopted by the international community) does not necessarily require eliminating each and every source of emissions,” says Kriegler. “Decisions about whether and how to protect future generations from the risks of climate change have to be made today, but the burden of achieving these targets will increase over time. The costs for future generations can be substantially reduced if carbon dioxide removal technologies become available in the long run.”

Balancing the financial burden across generations

The study now published is the first to quantify this. If bioenergy plus CCS is available, aggregate mitigation costs over the 21st century might be halved. In the absence of such a carbon dioxide removal strategy, costs for future generations rise significantly, up to a quadrupling of mitigation costs in the period of 2070 to 2090. The calculation was carried out using a computer simulation of the economic system, energy markets, and climate, covering a range of scenarios.

Options for carbon dioxide removal from the atmosphere include afforestation and chemical approaches like direct air capture of CO2 from the atmosphere or reactions of CO2 with minerals to form carbonates. But the use of biomass for energy generation combined with carbon capture and storage is less costly than chemical options, as long as sufficient biomass feedstock is available, the scientists point out.

Serious concerns about large-scale biomass use combined with CCS

“Of course, there are serious concerns about the sustainability of large-scale biomass use for energy,” says co-author Ottmar Edenhofer, chief economist of PIK. “We therefore considered the bioenergy-with-CCS option only as an example of the role that carbon dioxide removal could play in climate change mitigation.” The exploitation of bioenergy can conflict with land use for food production or ecosystem protection. To account for sustainability concerns, the study restricts bioenergy production to a medium level that could be realized mostly on abandoned agricultural land.

Still, global population growth and changing dietary habits (associated with an increased demand for land), as well as improvements in agricultural productivity (associated with a decreased demand for land), are important uncertainties here. Furthermore, CCS technology is not yet available for industrial-scale use and, due to environmental concerns, is controversial in countries like Germany. Yet the study assumes it will become available in the near future.

“CO2 removal from the atmosphere could enable humankind to keep the window of opportunity open for low-stabilization targets despite a likely delay in international cooperation, but only under certain requirements,” says Edenhofer. “The risks of scaling up bioenergy use need to be better understood, and safety concerns about CCS have to be thoroughly investigated. Still, carbon dioxide removal technologies are not science fiction and need to be further explored.” In no way should they be seen as a pretext to neglect emissions reductions now, notes Edenhofer. “By far the biggest share of climate change mitigation has to come from a large effort to reduce greenhouse-gas emissions globally.”

Journal Reference:

  1. Elmar Kriegler, Ottmar Edenhofer, Lena Reuster, Gunnar Luderer, David Klein. Is atmospheric carbon dioxide removal a game changer for climate change mitigation? Climatic Change, 2013; DOI: 10.1007/s10584-012-0681-4

Maya Long Count Calendar Calibrated to Modern European Calendar Using Carbon-14 Dating (Science Daily)

Apr. 11, 2013 — The Maya are famous for their complex, intertwined calendric systems, and now one calendar, the Maya Long Count, is empirically calibrated to the modern European calendar, according to an international team of researchers.

Elaborately carved wooden lintel or ceiling from a temple in the ancient Maya city of Tikal, Guatemala, that carries a carving and dedication date in the Maya calendar. (Credit: Courtesy of the Museum der Kulturen)

“The Long Count calendar fell into disuse before European contact in the Maya area,” said Douglas J. Kennett, professor of environmental archaeology, Penn State.

“Methods of tying the Long Count to the modern European calendar used known historical and astronomical events, but when looking at how climate affects the rise and fall of the Maya, I began to question how accurately the two calendars correlated using those methods.”

The researchers found that the new measurements mirrored the results of the most widely used method, the Goodman-Martinez-Thompson (GMT) correlation, initially put forth by Joseph Goodman in 1905 and subsequently modified by others. In the 1950s scientists tested this correlation using early radiocarbon dating, but the large error range left the validity of the GMT correlation open to question.

“With only a few dissenting voices, the GMT correlation is widely accepted and used, but it must remain provisional without some form of independent corroboration,” the researchers report in today’s (April 11) issue of Scientific Reports.

A combination of high-resolution accelerator mass spectrometry carbon-14 dates and a calibration using tree growth rates showed the GMT correlation is correct.

The Long Count counts days from a mythological starting point. A date is composed of five components, separated in standard notation by dots, each a multiplier of a fixed unit: the Bak’tun (144,000 days), K’atun (7,200 days), Tun (360 days), Winal (20 days), and K’in (1 day).
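The place-value arithmetic above is easy to make concrete. The sketch below is our own illustration (not from the study): it converts a dotted Long Count string into a total day count and, using the GMT correlation constant 584,283 (the Julian Day Number conventionally assigned to the Long Count epoch), into a proleptic Gregorian date.

```python
from datetime import date

# Place values of the five Long Count components:
# Bak'tun, K'atun, Tun, Winal, K'in
PLACE_VALUES = (144000, 7200, 360, 20, 1)
GMT_CORRELATION = 584283  # Julian Day Number of Long Count 0.0.0.0.0

def long_count_to_days(lc: str) -> int:
    """Total days since the mythological starting point, e.g. '9.15.10.17.14'."""
    parts = [int(p) for p in lc.split(".")]
    return sum(value * count for value, count in zip(PLACE_VALUES, parts))

def long_count_to_gregorian(lc: str) -> date:
    """Proleptic Gregorian date under the GMT correlation."""
    jdn = long_count_to_days(lc) + GMT_CORRELATION
    # Python's date ordinal 1 is 0001-01-01, whose Julian Day Number is 1721426,
    # so ordinal = JDN - 1721425.
    return date.fromordinal(jdn - 1721425)
```

Under this constant, the well-known Bak’tun-13 completion date 13.0.0.0.0 falls on December 21, 2012.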

Archaeologists want to place the Long Count dates into the European calendar so there is an understanding of when things happened in the Maya world relative to historic events elsewhere. Correlation also allows the rich historical record of the Maya to be compared with other sources of environmental, climate and archaeological data calibrated using the European calendar.

The samples came from an elaborately carved wooden lintel or ceiling from a temple in the ancient Maya city of Tikal, Guatemala, that carries a carving and dedication date in the Maya calendar. This same lintel was one of three analyzed in the previous carbon-14 study.

Researchers measured tree growth by tracking annual changes in calcium uptake by the trees, which is greater during the rainy season.

The amount of carbon-14 in the atmosphere is incorporated into a tree’s incremental growth. Atmospheric carbon-14 changes through time, and during the Classic Maya period oscillated up and down.

The researchers took four samples from the lintel and used annually fluctuating calcium concentrations evident in the incremental growth of the tree to determine the true time distance between each by counting the number of elapsed rainy seasons. The researchers used this information to fit the four radiocarbon dates to the wiggles in the calibration curve. Wiggle-matching the carbon-14 dates provided a more accurate age for linking the Maya and Long Count dates to the European calendars.
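The wiggle-matching procedure described above can be sketched in a few lines: the spacings between samples are fixed by the counted rainy seasons, so the whole sequence is slid along the calibration curve and the placement with the smallest squared misfit wins. The curve and numbers below are invented for illustration; real work uses measured calibration data such as the IntCal curves.

```python
# Toy wiggle-match: inter-sample offsets are known exactly, so the only
# free parameter is where the sequence sits on the calendar axis.
def wiggle_match(cal_years, curve, offsets, measured):
    """Calendar year of the first sample that minimizes the summed
    squared residuals against the calibration curve."""
    best_year, best_ss = None, float("inf")
    for i in range(len(curve) - max(offsets)):
        ss = sum((curve[i + d] - m) ** 2 for d, m in zip(offsets, measured))
        if ss < best_ss:
            best_year, best_ss = cal_years[i], ss
    return best_year

# Invented calibration curve: calendar years AD 600-799 -> 14C age (BP),
# a declining trend with a superimposed wiggle.
cal_years = list(range(600, 800))
curve = [1300 - 0.8 * (y - 600) + 15 * ((y % 16) - 8) / 8 for y in cal_years]

offsets = [0, 25, 50, 75]                      # ring-counted year spacings
measured = [curve[100 + d] for d in offsets]   # pretend the sequence starts at AD 700
```

With noise-free synthetic measurements the fit recovers the true placement (AD 700); with real measurement errors one would also propagate uncertainties rather than report a single best year.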

These calculations were further complicated by known differences in atmospheric radiocarbon content between the Northern and Southern Hemispheres.

“The complication is that radiocarbon concentrations differ between the southern and northern hemisphere,” said Kennett. “The Maya area lies on the boundary, and the atmosphere is a mixture of the southern and northern hemispheres that changes seasonally. We had to factor that into the analysis.”

The researchers’ results mirror the GMT European date correlations, indicating that the GMT was on the right track for linking the Long Count and European calendars.

Events recorded in various Maya locations “can now be harmonized with greater assurance to other environmental, climatic and archaeological datasets from this and adjacent regions and suggest that climate change played an important role in the development and demise of this complex civilization,” the researchers wrote.

Journal Reference:

  1. Douglas J. Kennett, Irka Hajdas, Brendan J. Culleton, Soumaya Belmecheri, Simon Martin, Hector Neff, Jaime Awe, Heather V. Graham, Katherine H. Freeman, Lee Newsom, David L. Lentz, Flavio S. Anselmetti, Mark Robinson, Norbert Marwan, John Southon, David A. Hodell, Gerald H. Haug. Correlating the Ancient Maya and Modern European Calendars with High-Precision AMS 14C Dating. Scientific Reports, 2013; 3 DOI: 10.1038/srep01597

Segue o Seco [“The Drought Goes On”] (Rolling Stone)

Issue 77 – February 2013

While Bahia suffers through “the worst drought of the last 50 years,” the inhabitants of the sertão bend over backwards to overcome the hardships. Hope persists, but it is as meager as the rainwater

Photo: Flavio Forner

By MAÍRA KUBÍK MANO

“Stop the car! Stop the car! Look over there, on top of the rocks! See it?” No, I saw nothing. The landscape looked exactly as it had for the last half hour: all earth-colored, with the occasional catingueira tree on the horizon and the mandacaru cacti, always more numerous, following the line of the dirt road. “Remember the scene where Fabiano tries to catch a preá? Look there!” my companion insists, pointing. Window down, eyes at the ready. Two small animals, brownish and almond-shaped, with pointed snouts, stir and make themselves seen. There they are: the preás, wild cavies. Júlio César Santos is satisfied. After all, he ended up in the sertão precisely because he had read Vidas Secas.

“I’m from the Zona da Mata, but when I read Graciliano Ramos I wanted to come here,” says Santos, an agronomist who fell in love with the caatinga while still a student at the Universidade Federal do Recôncavo Baiano (UFRB). Today he heads the office of the EBDA (Empresa Baiana de Desenvolvimento Agrícola) in Ipirá, one of 258 municipalities in Bahia in a state of emergency because of the drought. Together with 17 other agencies and secretariats of the government of Jaques Wagner (PT), the EBDA is part of the State Committee for Actions for Coexisting with the Drought.

We are on our way to the neighboring town of Pintadas, where the dry spell is even more critical. Along the way we cross four rivers. Three of them are dry. The overcast sky in the distance seems to herald a change. A drizzle had fallen in the early hours, something that had not happened in a long time. Its marks were still on the ground, in a few shallow furrows that had probably carried threads of running water. Santos seems relieved. “Now it needs to rain more,” he says.

Around a curve to the left appears the house of Messias and Ginalva Jesus Pereira. Their plantation of palma (spineless cactus) immediately stands out from the monochrome: it is dark green, without a hint of brown. During the drought, the plant has been an indispensable source of food for keeping the animals alive, now that they have no more pasture. “People come, visit, admire it. Others get envious,” says Ginalva, eyebrows raised; she has lived on that plot for some 20 years.

As was to be expected, the conversation turns to the weather and the drops that fell overnight. “It rained in Ipirá, did it? Ah, here it was just a mist,” retorts young Matheus, Ginalva’s middle son. “It really hasn’t rained here in three years. We lost two calves and two umbu trees to the drought. Dad is asking God for this last patch of palma to take,” he says, referring to a recently planted area farther from the house, where the green is already beginning to fade.

Matheus’s reckoning is no exaggeration. It usually rains in the caatinga between January and May, which is precisely the planting season. In 2012, however, the rain never came, and one dry spell ran straight into the next, making this the worst drought of the last 50 years, according to Bahia’s Civil Defense Coordination (Cordec). It is forecast to last another year or two. “Now, with the rain, it will be something else. Everything will change,” reckons a seasoned Ginalva. Like Fabiano, the protagonist of Graciliano Ramos’s novel, she knows the caatinga rises from the dead.

At her house, strategically placed pipes await the next rainfall to collect water into cisterns. Until that happens, Ginalva keeps production going with artificial irrigation; it also includes cowpeas, scallions, cilantro, papaya, sweet potatoes, and okra, along with sheep, goats, and cattle. The recently built well was financed through the emergency line of Pronaf (Programa Nacional de Fortalecimento da Agricultura Familiar).

Like Ginalva, another 6,000 farmers in the region have submitted projects to access the program. According to the Banco do Nordeste do Brasil (BNB), R$ 10 million in emergency Pronaf funds had been released by January 2013 for the 17 municipalities around Feira de Santana, among them Pintadas and Ipirá. “What you see here are small farmers requesting financing to plant palma or build watering holes to restore the pasture,” says José Wilson Junqueira Queiroz, a business manager at BNB. Across Brazil, between May and December 2012, the federal government authorized R$ 656.2 million in emergency credit lines to assist those hit by the drought.

“It is these public policies that are keeping families in the countryside,” says Jeane de Almeida Santiago, an agronomist at an NGO called Fundação Apaeba, who provides technical assistance to producers in Pintadas, Ipirá, Riachão do Jacuípe, Pé de Serra, Baixa Grande, and Nova Fátima, all in Bahia. “Before, many more people would leave for São Paulo and other states as migrants.”

Hers is the account of someone who knows the situation up close. Jeane was born in Pintadas. She studied at the agricultural school, left to take a technical course in Juazeiro and a degree in the Recôncavo Baiano, and returned when she graduated, wanting to pass on what she had learned. With bright, attentive eyes, she changes her tone and revises her claim: “Well, this year many young people are leaving. With the drought, the farms’ profitability is at zero. And people won’t stay here without any money. Unfortunately they are forced to leave, heartbroken, for São Paulo in search of work, to see if they can send money back to the family that stayed behind to keep the herd alive.”

Indeed, the bus stop in Pintadas was crowded that morning. The town still has no bus terminal, and the asphalt connecting it to the rest of the world was inaugurated just a year ago, as the state government signs at the entrance announce. Everyone was waiting on the sidewalk for the next bus to the São Paulo capital, suitcases and relatives standing, sun directly overhead. About three weeks earlier, in that same spot, Ginalva had said goodbye to her eldest son, 18, who decided to try his luck elsewhere. “He called me yesterday saying he has already found a job in a factory. It’s temporary, but it’s a job,” she says. It is the famous Pintadas-São Paulo shuttle.

“The worst of it is that the outlook for this year is not good,” laments Jeane. She says that even the palma and the mandacaru, also used to feed the herds, have begun to disappear, and that most of the land in the region belongs to small subsistence farmers or cattle ranchers. “For over a year now the municipality has been handing out feed because there is no pasture left. But now the feed has run out. You look for it and can’t find it. When you do, the price won’t fit any budget.”

Jeane worries: “Some producers are paying off three or four projects. The time will come when no one can take out any more [credit], they owe so much. And then I don’t know what will happen, because the farms aren’t generating the income to pay the loans they already owe. Without credit, I believe the rural areas become impossible.”

“The cause of this drought is the destruction of the environment,” she declares, citing a recent study finding that 90% of the region’s native vegetation has disappeared. “Nature is responding. The land is laid bare, and from that come the fires. Many soils have been lost or are depleted. People don’t have the habit of fertilizing; they just exploit and exploit. The rivers we had have died. The springs have been stripped of forest.”

In Ipirá, right next door, the reality is similar. In place of the caatinga there are cattle. The most common sight is cattle or horses crowded under the few remaining trees to escape the scorching sun: heads in the shade, backs in the open. “Ipirá used to be a municipality full of smallholdings,” explains Orlando Cintra, manager of Agriculture and Cooperativism at the city government. “The big cattlemen began arriving in the 1960s. They bought the land cheap and pushed the men who grew potatoes, cassava, and castor beans out to the edges of town, or to São Paulo, Mato Grosso, and Paraná.” Many others went off to cut sugarcane. “There were no cattle here, and the small producers didn’t clear the land,” he continues. “What we raised most was goats. It was with the arrival of the big ranchers that the climate in Ipirá began to change more quickly. They cleared the forest to plant grass.”

“The caatinga is not land for cattle ranching. It is for raising goats and sheep, medium-sized animals. They brought in the rancher culture of the South, and everyone wanted a cattle farm here,” adds Meire Oliveira, adviser to Ipirá’s Secretariat of Agriculture and the Environment.

Meire spent her childhood in the rural part of the municipality and still remembers the smell of that vegetation. She says that as a child she made toy donkeys out of umbu fruit, sticking four little twigs into the fruit for the four legs. “A pity that often, when I say not to clear the land, not even my father listens to me,” she laments. She seems to know every plant in the caatinga. When she comes across a coroa-de-frade cactus, she shows that its tiny red fruit is edible. Walking the properties of the region, she slips through the barbed-wire fences with ease. She picks a handful of still-green maxixe (bur gherkin) and explains how to cook it. “Just like okra, you know?” In the sertão, everything can be put to use. “The caatinga has an incredible power of regeneration,” she explains. “The solution would be to let it rest. Some areas around the Rio do Peixe are already in the process of desertification.”

One example of environmental preservation is the D. Mathias settlement, now seven years old. There the caatinga is slowly reborn among goats and sheep. The trees are pruned only enough to keep them from hurting the animals, which roam freely among the aroeiras, xique-xiques, and umbu trees. The settlement, organized by the Movimento Luta Camponesa (MLC), has as its symbol a family of drought refugees drawn in black and red. The line is led by a woman with a scythe in her hands. Behind her comes a man with a hoe over his shoulder. Two children, a boy and a girl, follow hand in hand. Last comes a dog that, who knows, might be named Baleia.

Júlio César Santos, the EBDA manager, assists the settlers and explains that the peasants pay close attention to the public policies and credit lines offered by the state and federal governments. With these, they have already built houses, bought a milk-cooling tank, and expanded the sheep flock. Among the latest initiatives on the site is a high-density palma plantation, more profitable than the traditional spacing. At first the farmers did not trust the technique and kept planting the cacti far apart, as they always had. To get around the resistance, Santos used the “Paulo Freire method”: he planted two plots, on one side the dense planting, on the other the traditional one. Now both are growing, and he hopes soon to prove his point. “Let’s just hope the lack of rain doesn’t burn them,” he says.

The settlement’s success prompted, 11 months ago, an encampment on the large neighboring estate. Leidinaura Souza Santana, or simply Leila, is one of the residents of the Elenaldo Teixeira camp. “The biggest problem here is water for drinking and cooking. We went almost 15 days without water. The tanker truck only arrived yesterday,” she complains. “Embasa [Empresa Baiana de Águas e Saneamento] suspended the tanker because of the river, which was already very low, and also because the pump broke down,” explains Meire, who has come along on the visit. “We had to drink water that isn’t fit for drinking,” murmurs Leila.

Leila was born in Coração de Maria, north of Feira de Santana. Her husband was working as a cowhand in Malhador, a village in the municipality of Ipirá, when they heard rumors of the occupation. They came at once to take part. “We are waiting for the moment to enter the farm and end the suffering. The area has already been certified as unproductive. The settlement next door is a marvel. It heartened me to see that those people were once campers like us. I’m not giving up,” she says. Meire seizes the moment to inject some encouragement: “I followed the other camp from the very beginning, and it was just like this. I think it was even hotter than this one; this one is cooler. And look at them now.”

The conversation takes place at the camp school, where young people and adults learn to read and write. The small straw-and-wood building stands at the head of what has been christened “Avenida Brasil,” a neatly aligned row of about 15 tarpaulin shacks. Leila has just moved up to the fourth grade of primary school and spells her name out for me. “L-E-I-D-I-N-A-U-R-A.” “Isn’t it with an ‘l’?” asks Meire. “No, it really is with a ‘u’,” Leila replies.

In Tamanduá, a village outside Ipirá, motorcycles and donkeys go by with people and buckets on their backs. Everything speaks of the dry spell. Egecivaldo Oliveira Nunes is at the roadside, at the wheel of a tanker truck parked in front of a blue-and-white house. “Private work only; I don’t work for the Army or the city. We draw water from the dams because the reservoirs were dry,” he says, adding that on the worst days of the drought he “can’t find the time” for all the deliveries requested. Payment is by distance, with the price changing for every kilometer driven: 5 kilometers corresponds to 9,000 liters and costs R$ 80. Those who cannot pay (like the camp dwellers) can wait for the state Civil Defense, which says it has invested R$ 4 million in tanker trucks, or for the Army, which delivers water to 137 municipalities every month.

“Each year the drought comes on stronger, and the tendency is always for it to last longer,” laments Orlando Cintra, Ipirá’s manager of Agriculture and Cooperativism. “The outlook is that in five or six years no one will be producing anything here in agriculture. The climate keeps changing. Every year it gets worse.”

“We’ve had so many forecasts, and nothing,” says Jeane Santiago. “The rain forecast comes on the news and people say: ‘I have no faith left; I’ll only believe it when I see it.’ Country people have their omens, like ‘if the mandacaru blossoms, it’s a sign rain is coming.’ But so far they have all failed. Faith is running out.” The mandacarus have already flowered. Their deep red catches the eye. Now all that is left is to wait.

In Big Data, We Hope and Distrust (Huffington Post)

By Robert Hall

Posted: 04/03/2013 6:57 pm

“In God we trust. All others must bring data.” — W. Edwards Deming, statistician, quality guru

Big data helped reelect a president, helped find Osama bin Laden, and contributed to the meltdown of our financial system. We are in the midst of a data revolution where social media introduces new terms like Arab Spring, Facebook depression, and Twitter anxiety that reflect a new reality: big data is changing the social and relationship fabric of our culture.

We spend hours installing and learning how to use the latest versions of our ever-expanding technology while enduring a never-ending battle to protect our information. Then we labor to develop practices that free us from technology: rules for turning devices off during meetings or movies, legislation to outlaw texting while driving, restrictions in classrooms to prevent cheating, and meals or family time scheduled with devices turned off. Information and technology: we love it, hate it, can’t live with it, can’t live without it, use it voraciously, and distrust it immensely. I am schizophrenic and so am I.

Big data is not only big but growing rapidly. According to IBM, we create 2.5 quintillion bytes a day and that “ninety percent of the data in the world has been created in the last two years.” Vast new computing capacity can analyze Web-browsing trails that track our every click, sensor signals from every conceivable device, GPS tracking and social network traffic. It is now possible to measure and monitor people and machines to an astonishing degree. How exciting, how promising. And how scary.

This is not our first data rodeo. The early stages of the customer relationship management movement were filled with hope and with hype. Large data warehouses were going to provide the kind of information that would make companies masters of customer relationships. There were just two problems. First, getting the data out of the warehouse wasn’t nearly as hard as getting it into the person or device interacting with the customers in a way that added value, trust and expanded relationships. We seem to always underestimate the speed of technology and overestimate the speed at which we can absorb it and socialize around it.

Second, unfortunately the customers didn’t get the memo and mostly decided in their own rich wisdom they did not need or want “masters.” In fact as providers became masters of knowing all the details about our lives, consumers became more concerned. So while many organizations were trying to learn more about customer histories, behaviors and future needs — customers and even their governments were busy trying to protect privacy, security, and access. Anyone attempting to help an adult friend or family member with mental health issues has probably run into well-intentioned HIPAA rules (regulations that ensure privacy of medical records) that unfortunately also restrict the ways you can assist them. Big data gives and the fear of big data takes away.

Big data does not big relationships make. Over the last 20 years, as our data keeps getting stronger, our customer relationships keep getting weaker. Eighty-six percent of consumers trust corporations less than they did five years ago. Customer retention across industries has fallen about 30 percent in recent years. Is it actually possible that we have unwittingly contributed to the undermining of our customer relationships? How could that be? For one thing, as companies keep getting better at targeting messages to specific groups, those groups keep getting better at blocking those messages. As usual, the power to resist trumps the power to exert.

No matter how powerful big data becomes, if it is to realize its potential, it must build trust on three levels. First, customers must trust our intentions. Data that can be used for us can also be used against us. There is growing fear that institutions will become part of a “surveillance state.” While organizations have gone to great lengths to promote protection of our data, the numbers reflect a fair amount of doubt. For example, according to MainStreet, “87 percent of Americans do not feel large banks are transparent and 68 percent do not feel their bank is on their side.”

Second, customers must trust our actions. Even if they trust our intentions, they might still fear that our actions put them at risk. Our private information can be hacked, then misused and disclosed in damaging and embarrassing ways. After the Sandy Hook tragedy, a New York newspaper published the names and addresses of over 33,000 licensed gun owners along with an interactive map that showed exactly where they lived. In response, names and addresses of the newspaper’s editor and writers were published online, along with information about their children. No one, including retired judges, law enforcement officers, and FBI agents, expected their private information to be published in the midst of a very high-decibel controversy.

Third, customers must trust the outcome — that sharing data will benefit them. Even with positive intentions and constructive actions, the results may range from disappointing to damaging. Most of us have provided email addresses or other contact data — around a customer service issue, say — and then started receiving email, phone, or online solicitations. I know a retired executive who helps hard-to-hire people. She spent one evening surfing the Internet to research expunging criminal records for released felons. Years later, Amazon still greets her with books targeted to the felon it believes she is. Even with opt-out options, we feel used. Or we provide specific information, only to have to repeat it in the next transaction or interaction, never getting the hoped-for benefit of saving our time.

It will be challenging to grow trust at anywhere near the rate we grow the data. Information develops rapidly; competence and trust develop slowly. Investing heavily in big data while scrimping on trust will have the opposite of the desired effect. To quote Dolly Parton, who knows a thing or two about big: “It costs a lot of money to look this cheap.”

How Big Could a Man-Made Earthquake Get? (Popular Mechanics)

Scientists have found evidence that wastewater injection induced a record-setting quake in Oklahoma two years ago. How big can a man-made earthquake get, and will we see more of them in the future?

By Sarah Fecht – April 2, 2013 5:00 PM

Hydraulic fracking drilling illustration. (Credit: Brandon Laufenberg/Getty Images)

In November 2011, a magnitude-5.7 earthquake rattled Prague, Okla., and was felt in 16 other states. It flattened 14 homes and many other buildings, injured two people, and set the record as the state’s largest recorded earthquake. And according to a new study in the journal Geology, the event can also claim the title of largest earthquake ever induced by fluid injection.

In the paper, a team of geologists pinpoints the quake’s starting point at less than 200 meters (about 650 feet) from an injection well where wastewater from oil drilling was being pumped into the ground at high pressures. At 5.7 magnitude, the Prague earthquake was about 10 times stronger than the previous record holder: a magnitude-4.8 Rocky Mountain Arsenal earthquake in Colorado in 1967, caused by the U.S. Army injecting a deep well with 148,000 gallons per day of fluid wastes from chemical-weapons testing. So how big can these man-made earthquakes get?
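For a sense of what “about 10 times stronger” means: by convention, each whole magnitude unit corresponds to a tenfold increase in recorded ground-motion amplitude and roughly a 32-fold increase in radiated energy. The arithmetic for the 5.7-versus-4.8 comparison (our own illustration, not from the study) looks like this:

```python
# Magnitude difference between the Prague (5.7) and
# Rocky Mountain Arsenal (4.8) earthquakes.
dM = 5.7 - 4.8

# Ground-motion amplitude scales as 10^dM; radiated energy as 10^(1.5*dM).
amplitude_ratio = 10 ** dM          # ~7.9x larger ground motion
energy_ratio = 10 ** (1.5 * dM)     # ~22x more radiated energy
```

So the “10 times” figure sits between the amplitude ratio (about 8) and the energy ratio (about 22), depending on which measure of “stronger” is meant.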

The short answer is that scientists don’t really know yet, but it’s possible that fluid injection could cause some big ones on very rare occasions. “We don’t see any reason that there should be any upper limit for an earthquake that is induced,” says Bill Ellsworth, a geophysicist with the U.S. Geological Survey, who wasn’t involved in the new study.

As with natural earthquakes, most man-made earthquakes have been small to moderate in size, and most are detected only by seismometers. Larger quakes are orders of magnitude rarer than small quakes. For example, for every 1000 magnitude-1.0 earthquakes that occur, expect to see 100 magnitude-2.0s, 10 magnitude-3.0s, just 1 magnitude-4.0, and so on. And just as with natural earthquakes, the strength of the induced earthquake depends on the size of the nearby fault and the amount of stress acting on it. Some faults just don’t have the capacity to cause big earthquakes, whether natural or induced.
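The tenfold drop-off per magnitude unit described here is the Gutenberg-Richter relation, log10(N) = a − b·M with b ≈ 1. A minimal sketch (the function name is ours):

```python
def expected_count(n_at_m1: float, magnitude: float, b: float = 1.0) -> float:
    """Expected number of events of at least `magnitude`, given n_at_m1
    events of magnitude >= 1.0 (Gutenberg-Richter frequency-magnitude law)."""
    return n_at_m1 * 10 ** (-b * (magnitude - 1.0))
```

With n_at_m1 = 1000 this reproduces the article’s 100 / 10 / 1 progression for magnitudes 2, 3, and 4; regions with a different b-value fall off faster or slower.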

How Do Humans Trigger Earthquakes?

Faults experience two major kinds of stress: shear stress, which makes the two sides slide past each other along the fault line, and normal stress, which pushes the two sides together. Usually the normal stress keeps the fault from moving sideways. But when a fluid is injected into the ground, as in Prague, it can reduce the normal stress and make it easier for the fault to slip sideways. It’s as if you have a tall stack of books on a table, Ellsworth says: if you take half the books away, it’s easier to slide the stack across the table.

“Water increases the fluid pressure in pores of rocks, which acts against the pressure across the fault,” says Geoffrey Abers, a Columbia University geologist and one of the new study’s authors. “By increasing the fluid pressure, you’re decreasing the strength of the fault.”
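The mechanism Abers describes is the effective-stress (Coulomb) criterion: a fault slips when shear stress exceeds the frictional resistance, which is proportional to the normal stress minus the pore fluid pressure. A minimal sketch, with an assumed friction coefficient of 0.6 and illustrative stress values of our own choosing:

```python
def fault_slips(shear, normal, pore_pressure, mu=0.6, cohesion=0.0):
    """Coulomb failure check: raising pore pressure lowers the effective
    normal stress and thus the frictional resistance to sliding."""
    effective_normal = normal - pore_pressure
    return shear > cohesion + mu * effective_normal

# Same fault, same shear load: injecting fluid tips it into slipping.
before = fault_slips(shear=50.0, normal=100.0, pore_pressure=0.0)    # stable
after = fault_slips(shear=50.0, normal=100.0, pore_pressure=40.0)    # slips
```

Note that the shear load never changes between the two calls; only the pore pressure does, which is exactly the book-stack intuition quoted above.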

A similar mechanism may be behind earthquakes induced by large water reservoirs. In those instances, the artificial lake behind a dam causes water to seep into the pore spaces in the ground. In 1967, India’s Koyna Dam caused a 6.5 earthquake that killed 177 people, injured more than 2000, and left 50,000 homeless. Unprecedented seasonal fluctuations in water level behind a dam in Oroville, Calif., are believed to be behind the magnitude-6.1 earthquake that occurred there in 1975.

Extracting a fluid from the ground can also contribute to triggering a quake. “Think about filling a balloon with water and burying it at the beach,” Ellsworth says. “If you let the water out, the sand will collapse inward.” Similarly, when humans remove large amounts of oil and natural gas from the ground, it can put additional stress on a fault line. “In this case it may be the shear stresses that are being increased, rather than normal stresses,” Ellsworth says.

Take the example of the Gazli gas field in Uzbekistan, thought to be located in a seismically inactive area when drilling began in 1962. As drillers removed the natural gas, the pressure in the gas field dropped from 1030 psi in 1962 to 515 psi in 1976, then down to 218 psi in 1985. Meanwhile, three large magnitude-7.0 earthquakes struck: two in 1976 and one in 1984. Each quake had an epicenter within 12 miles of Gazli and caused a surface uplift of some 31 inches. Because the quakes occurred in Soviet-era Uzbekistan, information about the exact locations, magnitudes, and causes is not available. However, a report by the National Research Council concludes that “observations of crustal uplift and the proximity of these large earthquakes to the Gazli gas field in a previously seismically quiet region strongly suggest that they were induced by hydrocarbon extraction.” Extraction of oil is believed to have caused at least three big earthquakes in California, with magnitudes of 5.9, 6.1, and 6.5.

Some people worry that hydraulic fracturing, or fracking, wherein high-pressure fluids are used to crack through rock layers to extract oil and natural gas, will lead to an increased risk of earthquakes. However, the National Research Council report points out that there are tens of thousands of hydrofracking wells in existence today, and there has only been one case in which a “felt” tremor was linked to fracking. That was a 2.3 earthquake in Blackpool, England, in 2011, which didn’t cause any significant damage. Although scientists have known since the 1920s that humans trigger earthquakes, experts caution that it’s not always easy to determine whether a specific event was induced.

Are Human Activities Making Quakes More Common?

Human activities have been linked to increased earthquake frequencies in certain areas. For instance, researchers have shown a strong correlation between the volume of fluid injected into the Rocky Mountain Arsenal well and the frequency of earthquakes in that area.

Geothermal-energy sites can also induce many earthquakes, possibly due to pressure, heat, and volume changes. The Geysers in California is the largest geothermal field in the U.S., generating 725 megawatts of electricity using steam from deep within the earth. Before The Geysers began operating in 1960, seismic activity was low in the area. Now the area experiences hundreds of earthquakes per year. Researchers have found correlations between the volume of steam production and the number of earthquakes in the region. In addition, as the area of the steam wells increased over the years, so did the spatial distribution of earthquakes.

Whether or not human activity is increasing the magnitude of earthquakes, however, is more of a gray area. When it comes to injection wells, evidence suggests that earthquake magnitudes rise along with the volume of injected wastewater, and possibly injection pressure and rate of injection as well, according to a statement from the Department of the Interior.

The vast majority of earthquakes caused by The Geysers are considered to be microseismic events—too small for humans to feel. However, researchers from Lawrence Berkeley National Laboratory note that magnitude-4.0 earthquakes, which can cause minor damage, seem to be increasing in frequency.

The new study says that though earthquakes with a magnitude of 5.0 or greater are rare east of the Rockies, scientists have observed an 11-fold increase between 2008 and 2011, compared with 1976 through 2007. But the increase hasn’t been tied to human activity. “We do not really know what is causing this increase, but it is remarkable,” Abers says. “It is reasonable that at least some may be natural.”

Futuristic predictions from 1988 LA Times Magazine come true… mostly (Singularity Hub)

Posted: 03/28/13 8:52 AM

In 2013, a day in the life of a Los Angeles family of four is an amazing testament to technological progress and the idealistic society that can be achieved…or at least that’s what the Los Angeles Times Magazine was hoping for 25 years ago. Back in April 1988, the magazine ran a special cover story called “L.A. 2013” and presented what a typical day would be like for a family living in the city.

The author of the story, Nicole Yorkin, spoke with over 30 experts and futurists to forecast daily life in 2013 and then wove their predictions into a story akin to those “World of Tomorrow” MGM cartoons from the mid-20th century. But unlike the cartoons, which often included far-fetched technologies for humor, what’s most remarkable about the 1988 article is just how many of the predictions have actually come to pass, allowing some leeway in how accurately the future can be imagined.

For anyone considering what will happen in the next 25 years, the article is worth a read as it serves as an amazing window into how well the future can be predicted in addition to what technology is able to achieve in a short period of time.

Just consider the section on ‘smart cars’ speculated to be “smaller, more efficient, more automated and more personalized” than cars 25 years ago. While experts envisioned that cars would have more Transformer-like abilities to change from a sports car to a beach buggy, the key development in automobile technology will be “a central computer in the car that will control a number of devices.” Furthermore, cars were expected to be equipped with “electronic navigation or map systems,” or GPS systems. Although modern cars don’t have a ‘sonar shield’ that would cause a car to slow down when it came closer to another, parking sensors are becoming common and rearview cameras may soon be required by law.

Though the article doesn’t explicitly predict the Internet and all its consequences per se, computers were implicit to some of the predictions, such as telecommuting, virtual shopping, smart cards for health monitoring, a personalized ‘home newspaper,’ and video chatting. Integrated computers were also expected in the form of smart appliances, wall-to-ceiling computer displays in classrooms, and 3D video conferencing. These technologies exist today thanks to the networked computer revolution that was amazingly only in its infancy in 1988.

‘The Ultimate Appliance’ is the mobile robot expected to be a ‘fixture’ in today’s homes.

But of all the technologies expected to be part of daily life in 2013, the biggest miss by the article comes with robots.

In fact, the mobile robot “Billy Rae” is depicted as an integral component to the household, much like Rosie The Robot was in The Jetsons. In the story, the family communicates with Billy Rae naturally as the mother reads a list of chores for cleaning the house and preparing meals. There’s even a pet canine robot named Max that helps the son learn to read and do math. The robots aren’t necessarily depicted as being super intelligent, but they were still expected to be vital, even being referred to as the “ultimate appliance.”

In recent years, great strides have been made with robots and artificial intelligence, but we are years away from the maid-like robot that the article hoped for. We’re all familiar with cleaning robots like the Roomba, and hospitals are starting to utilize healthcare robots. Personal assistants like Siri show that we’re getting closer to the day when people and computers can communicate verbally. But bringing all these technologies together is one of the most challenging problems to be solved, even given the high expectations and huge market potential these bots enjoy.

In light of this, it’s interesting to compare the predictions in this article to those in French illustrations drawn around 1900, which also include a fair share of robotic automation.

The piece is peppered with utopian speculation, but already on the radar were concerns about the shifting job market, increasing pollution, and the need for quality schooling, public transportation, and affordable housing, issues that have reached or are nearing crisis levels. It’s comforting to know that many of the problems that modern cities face were understood fairly well a quarter of a century ago, but it is sobering to recognize how technologies have been slow in some cases at handling these problems.

Perhaps the greatest lesson from reading the article is that few of the predictions are completely wrong, but the timescale was ambitious. Almost all of the technologies described will get here sooner or later. The real issue then is, what is preventing rapid innovation or broad-scale adoption of technologies?

Not surprisingly, the answers today are the same as they were 25 years ago: time and money.

[images: kla4067/Flickr, LA Times]

Brain scans can now tell who you’re thinking about (Singularity Hub)

Posted: 03/23/13 7:48 AM

[Source: Listal]

Beware, stalkers: these neuroscientists can tell who you’re thinking of. Or, at least, the kind of personality he or she might have.

As a social species humans are highly attuned to the behavior of others around them. It’s a survival mechanism, helping us to safely navigate the social world. That awareness involves both evaluating people and predicting how they will behave in different situations in the future (“Uh oh, don’t get him started!”). But just how does the brain represent another person’s personality?

To answer this question a group of scientists at Cornell’s College of Human Ecology (whatever that means) used functional magnetic resonance imaging (fMRI) to measure neuronal activity while people thought about different types of personalities. The 19 participants – all young adults – learned about four protagonists, all of whom had considerably different personalities, based on agreeableness (e.g., “Likes to cooperate with others”) and extraversion (“Is sometimes shy”). They were then presented different scenarios (such as sitting on a bus with no empty seats and watching an elderly person get on) and asked to imagine how each of the four protagonists would react.

Varying degrees of a person’s deemed “agreeableness” and “extraversion” combine to produce distinct activation patterns in the brain. [Source: Cerebral Cortex]

The study’s lead author, Nathan Spreng, said they were “shocked” when they saw the results. The brain scans revealed that each of the four distinct personalities elicited four distinct activity patterns in the medial prefrontal cortex, an area at the front of the brain known to be involved in decision making. In essence, the researchers had succeeded in extracting mental pictures – the personalities of others – that people were thinking of. The study was published in the March 5 issue of Cerebral Cortex.

Sizing up the personality of another, or thinking what they’re thinking, is unique to social animals – and in fact was until recently thought to be uniquely human. But there’s now reason to believe the network involved – called the ‘default network’ – is a fundamental feature of social mammals in general. As Spreng explained in an email, “Macaque [monkeys] clearly have a similar network, observable even in the rat. All of these mammalian species are highly social.”

The fact that the mental snapshot of others was seen in the neurons of the medial prefrontal cortex means the current study may have implications for autism, Spreng said in a Cornell University news release. “Prior research has implicated the anterior mPFC in social cognition disorders such as autism, and our results suggest people with such disorders may have an inability to build accurate personality models. If further research bears this out, we may ultimately be able to identify specific brain activation biomarkers not only for diagnosing such diseases, but for monitoring the effects of interventions.”

Previous work has shown that brain scans can tell us a lot about what a person is thinking. With an array of electrodes placed directly on the brain, researchers were able to decode specific words that people were thinking. In another experiment, fMRI scans of the visual cortex were used to reconstruct movie trailers that participants were watching.

Much of neuroscience explores how the brain processes the sensory information that guides us through our physical environment. But, for many species, navigating the social environment can be just as important to survival. “For me, an important feature of the work is that our emotions and thoughts about other people are felt to be private experiences,” Spreng said. “In our life, we may choose to share our thoughts and feelings with peers, friends and loved ones. However, [thoughts and feelings] are also physical and biological processes that can be observed. Considering how important our social world is, we know very little about the brain processes that support social knowledge. The objective of this work is to understand the physical mechanisms that allow us to have an inner world, and a part of that is how we represent other people in our mind.”

Everybody Knows. Climate Denialism has peaked. Now what are we going to do? (EcoEquity)

– Tom Athanasiou (toma@ecoequity.org).  April 2, 2013.

It was never going to be easy to face the ecological crisis.  Even back in the 1970s, before climate took center stage, it was clear that we the prosperous were walking far too heavily.  And that “environmentalism,” as it was called, was only going to be a small beginning.  But it was only when the climate crisis pushed fossil energy into the spotlight that the real stakes were widely recognized.  Fossil fuels are the meat and potatoes of industrial civilization, and the need to rapidly and radically reduce their emissions cut right through to the heart of the great American dream.  And the European dream.  And, inevitably, the Chinese dream as well.

Decades later, 81% of global energy is still supplied by the fossil fuels: coal, gas, and oil.[1]  And though the solar revolution is finally beginning, the day is late.  The Arctic is melting, and, soon, as each year the northern ocean lies bare beneath the summer sun, the warming will accelerate.  Moreover, our plight is becoming visible.  We have discovered, to our considerable astonishment, that most of the fossil fuel on the books of our largest corporations is “unburnable” – in the precise sense that, if we burn it, we are doomed.[2]  Not that we know what to do with this rather strange knowledge.  Also, even as China rises, it’s obvious that it’s not the last in line for the promised land.  Billions of people, all around the world, watch the wealthy on TV, and most all of them want a drink from the well of modern prosperity.  Why wouldn’t they?  Life belongs to us all, as does the Earth.

The challenge, in short, is rather daunting.

The denial of the challenge, on the other hand, always came ready-made.  As Francis Bacon said so long ago, “what a man would rather were true, he more readily believes.”  And we really did want to believe that ours was still a boundless world.  The alternative – an honest reckoning – was just too challenging.  For one thing, there was no obvious way to reconcile the Earth’s finitude with the relentless expansion of the capitalist market.  And as long as we believed in a world without limits, there was no need to see that economic stratification would again become a fatal issue.  Sure, our world was bitterly riven between haves and have-nots, but this problem, too, would fade in time.  With enough growth – the universal balm – redistribution would never be necessary.  In time, every man would be a king.

The denial had many cheerleaders.  The chemical-company flacks who derided Rachel Carson as a “hysterical woman” couldn’t have known that they were pioneering a massive trend.  Also, and of course, big money always has plenty of mouthpieces.  But it’s no secret that, during the 20th Century, the “engineering of consent” reached new levels of sophistication.  The composed image of benign scientific competence became one of its favorite tools, and somewhere along the way tobacco-industry science became a founding prototype of anti-environmental denialism.  On this front, I’m happy to say that the long and instructive history of today’s denialist pseudo-science has already been expertly deconstructed.[3]  Given this, I can safely focus on the new world, the post-Sandy world of manifest climatic disruption in which the denialists have lost any residual aura of scientific legitimacy, and have ceased to be a decisive political force.  A world in which climate denialism is increasingly seen, and increasingly ridiculed, as the jibbering of trolls.

To be clear, I’m not claiming that the denialists are going to shut up anytime soon.  Or that they’ll call off their suicidal, demoralizing campaigns.  Or that their fogs and poisons are not useful to the fossil-fuel cartel.  But the battle of the science is over, at least as far as the scientists are concerned.  And even on the street, hard denialism is looking pretty ridiculous.  To be sure, the core partisans of the right will fight on, for the win and, of course, for the money.[4]  And they’ll continue to have real weight too, for just as long as people do not believe that life beyond carbon is possible.  But for all this, their influence has peaked, and their position is vulnerable.  They are – and visibly now – agents of a mad and dangerous ideology.  They are knaves, and often they are fools.[5]

As for the rest of us, we can at least draw conclusions, and make plans.

As bad as the human prospect may be – and it is quite bad – this is not “game over.”  We have the technology we need to save ourselves, or most of it in any case; and much of it is ready to go.  Moreover, the “clean tech” revolution is going to be disruptive indeed.  There will be cascades of innovation, delivering opportunities of all kinds, all around the world.  Also, our powers of research and development are strong.  Also, and contrary to today’s vogue for austerity and “we’re broke” political posturing, we have the money to rebuild, quickly and on a global scale.  Also, we know how to cooperate, at least when we have to.  All of which is to say that we still have options.  We are not doomed.

But we are in extremely serious danger, and it is too late to pretend otherwise.  So allow me to tip my hand by noting Jorgen Randers’ new book, 2052: A Global Forecast for the Next Forty Years.[6]  Randers is a Norwegian modeler, futurist, professor, executive, and consultant who made his name as co-author of 1972’s landmark The Limits to Growth.  Limits, of course, was a global blockbuster; it remains the best-selling environmental title of all time.  Also, Limits has been relentlessly ridiculed (the early denialists cut their teeth by distorting it[7]) so it must be said that – very much contrary to the mass-produced opinions of the denialist age – its central, climate-related projections are holding up depressingly well.[8]

By 2012 (when he published 2052) Randers had decided to step away from the detached exploration of multiple scenarios that was the methodological core of Limits, and to make actual predictions.  After a lifetime of frustrated efforts, these predictions are vivid, pessimistic and bitter.  In a nutshell, Randers doesn’t expect anything beyond what he calls “progress as usual,” and while he expects it to yield a “light green” buildout (e.g., solar on a large scale) he doesn’t think it will suffice to stabilize the climate system.  Such stabilization, he grants, is still possible, but it would require concerted global action on a scale that neither he nor Dennis Meadows, the leader of the old Limits team, see on today’s horizon.  Let’s call that kind of action global emergency mobilization.  Meadows, when he peers forwards, sees instead “many decades of uncontrolled climatic disruption and extremely difficult decline.”[9]  Randers is more precise, and predicts that we will by 2052 wake to find ourselves on a dark and frightening shore, knowing full well that our planet is irrevocably “on its way towards runaway climate change in the last third of the twenty-first century.”

This is an extraordinary claim, and it requires extraordinary evidence.[10]  Such evidence, unfortunately, is readily available, but for the moment let me simply state the public secret of this whole discussion.  To wit: we (and I use this pronoun advisedly) can still avoid a global catastrophe, but it’s not at all obvious that we will do so.  What is obvious is that stabilizing the global climate is going to be very, very hard.  Which is a real problem, because we don’t do hard anymore.  Rather, when confronted with a serious problem, we just do what we can, hoping that it will be enough and trying our best not to offend the rich.  In truth, and particularly in America, we count ourselves lucky if we can manage governance at all.

This essay is about climate politics after legitimate skepticism.  Climate politics in a world where, as Leonard Cohen put it, “everybody knows.”  What does this mean?  In the first place, it means that we’ve reached the end of what might be called “environmentalism-as-usual.”  This point is widely understood and routinely granted, as when people say something like “climate is not a merely environmental problem,” but my concern is a more particular one.  As left-green writer Eddie Yuen astutely noted in a recent book on “catastrophism,” the problems of the environmental movement are to a very large degree rooted in “the pairing of overwhelmingly bleak analysis with inadequate solutions.”[11]  This is exactly right.

The climate crisis demands a “new environmentalism,” and such a thing does seem to be emerging.  Its final shape is unknowable, but one thing is certain – the environmentalism that we need will only exist when its solutions and strategies stand up to its own analyses.  The problem is that this requires us to take our “overwhelmingly bleak” analyses straight, rather than soft-pedaling them so that our “inadequate solutions” might look good.  Pessimism, after all, is closely related to realism.  It cannot just be wished away.

Soft-pedaling, alas, has long been standard practice, on both the scientific and the political sides of the climate movement.  Examples abound, but the best would have to be the IPCC itself, the U.N.’s Intergovernmental Panel on Climate Change.  The world’s premier climate-science clearinghouse, the IPCC is often attacked from the right, and has developed a shy and reticent culture.  Even more important, though, and far more rarely noted, is that the IPCC is conservative by definition and by design.[12]  It almost has to be conservative to do its job, which is to herd the planet’s decision makers towards scientific realism.  The wrinkle is that, at this point, this isn’t even close to being good enough, not at least in the larger scheme.  At this point, we need strategic realism as well as baseline scientific realism, and it demands a brutal honesty in which underlying scientific and political truths are clearly drawn and publicly expressed.

Yet when it comes to strategic realism, we balk.  The first impulse of the “messaging” experts is always to repeat their perennial caution that sharp portraits of the danger can be frightening, and disempowering, and thus lead to despair and passivity.  This is an excellent point, but it’s only the beginning of the truth, not the end.  The deeper problem is that the physical impacts of climate disruption – the destruction and the suffering – will continue to escalate.  “Superstorm Sandy” was bad, but the future will be much worse.  Moreover, the most severe suffering will be far away, and easy for the good citizens of the wealthy world to ignore.  Imagine, for example, a major failure of the Indian Monsoon, and a subsequent South Asian famine.  Imagine it against a drumbeat background in which food is becoming progressively more expensive.  Imagine the permanence of such droughts, and increasing evidence of tipping points on the horizon, and a world in which ever more scientists take it upon themselves to deliver desperate warnings.  The bottom line will not be the importance of communications strategies, but rather the manifest reality, no longer distant and abstract, and the certain knowledge that we are in deep trouble.  And this is where the dangers of soft-pedaling lie.  For as people come to see the scale of the danger, and then to look about for commensurate strategies and responses, the question will be if such strategies are available, and if they are known, and if they are plausible.  If they’re not, then we’ll all go, together, down the road “from aware to despair.”

Absent the public sense of a future in which human resourcefulness and cooperation can make a decisive difference, we assuredly face an even more difficult future in which denial fades into a sense of pervasive hopelessness.  The last third of the century (when Randers is predicting “runaway climate change”) is not so very far away.  Which is to say that, as denialism collapses – and it will – the challenge of working out a large and plausible response to the climate crisis will become overwhelmingly important.  If we cannot imagine such a response, and explain how it would actually work, then people will draw their own conclusions.  And, so far, it seems that we cannot.  Even those of us who are now climate full-timers don’t have a shared vision, not in any meaningful detail, nor do we have a common sense of the strategic initiatives that could make such a vision cohere.

The larger landscape is even worse.  For though many scientists are steeling themselves to speak, the elites themselves are still stiff and timid, and show few signs of rising to the occasion.  Each month, it seems, there’s another major report on the approaching crisis – the World Bank, the National Intelligence Council, and the International Energy Agency have all recently made hair-raising contributions – but they never quite get around to the really important questions.  How should we contrive the necessary global mobilization?  What conditions are needed to absolutely maximize the speed of the clean-tech revolution?  By what strategy will we actually manage to keep the fossil-fuels in the ground?  What kind of international treaties are necessary, and how shall we establish them?  What would a fast-enough global transition cost, and how shall we pay for it?  What about all those who are forced to retreat from rising waters and drying lands?  How shall they live, and where?  How shall we talk about rights and responsibilities in the Greenhouse Century?  And what about the poor?  How shall they find futures in a climate-constrained world?  Can we even imagine a world in which they do?

In the face of such questions, you have a choice.  You can conclude that we’ll just have to do the best we can, and then you can have a drink.  Or maybe two.  Or you can conclude that, despite all evidence to the contrary, enough of us will soon awaken to reality.  What’s certain is that, all around us, there is a vast potentiality – for reinvention, for resistance, for redistribution, and for renewal of all kinds – and that it could at any time snap into solidity.  And into action.

Forget about “hope.”  What we need now is intention.

***

About a decade ago, in San Francisco, I was on a PBS talk show with, among others, Myron Ebell, chief of climate propaganda at the Competitive Enterprise Institute.  Ebell is an aggressive professional, and given the host’s commitment to phony balance he was easily able to frame the conversation.[13]  The result was a travesty, but not entirely a waste of time, at least not for me.  It was instructive to speak, tentatively, of the need for global climate justice, and to hear, in response, that I was a non-governmental fraud who was only in it for the money.  Moreover, as the hour wore on, I came to appreciate the brutal simplicity of the denialist strategy.  The whole point is to suck the oxygen out of the room, to weave such a tangle of confusionism and pseudo-debate that the Really Big Question – What is to be done? – becomes impossible to even ask, let alone discuss.

When Superstorm Sandy slammed into the New York City region, Ebell’s style of hard denialism took a body blow, though obviously it has not dropped finally to the mat.  Had it done so, the Big Question, in all its many forms, would be buzzing constantly around us.  Clearly, that great day has not yet come.  Still, back in November of 2012, when Bloomberg’s Business Week blared “It’s Global Warming, Stupid” from its front cover, this was widely welcomed as an overdue milestone.  It may even be that Michael Tobis, the editor of the excellent Planet 3.0, will prove correct in his long-standing, half-facetious prediction that 2015 will be the date when “the Wall Street Journal will acknowledge the indisputable and apparent fact of anthropogenic climate change; the year in which it will simply be ridiculous to deny it.”[14]  Or maybe not.  Maybe that day will never come.  Maybe Ebell’s style of well-funded, front-group denialism will live on, zombie-like, forever.  Or maybe (and this is my personal prediction) hard climate denialism will soon go the way of creationism and far-right Christianity, becoming a kind of political lifestyle choice, one that’s dangerous but contained.  One that’s ultimately more dangerous to the right than it is to the reality-based community.

If so, then at some point we’re going to have to ask ourselves if we’ve been so long distracted by the hard denialists that we’ve missed the parallel danger of a “soft denialism.”  By which I mean the denialism of a world in which, though the dangers of climate change are simply too ridiculous to deny, they still – somehow – are not taken to imply courage, and reckoning, and large-scale mobilization.  This is a long story, but the point is that, now that the Big Question is finally on the table, we’re going to have to answer it.  Which is to say that we’re going to have to face the many ways in which political timidity and small-bore realism have trained us to calibrate our sense of what must be done by our sense of what can be done, which these days is inadequate by definition.

And not just because of the denialists.

George Orwell once said that “To see what is in front of one’s nose needs a constant struggle.”[15]  As we hurtle forward, this struggle will rage as never before.  The Big Question, after all, changes everything.  Another way of saying this is that our futures will be shaped by the effort to avoid a full-on global climate catastrophe.  Despite all the rest of the geo-political and geo-economic commotion that will mark the 21st Century (and there’ll be plenty) it will be most fundamentally the Greenhouse Century.  We know this now, if we care to, though still only in preliminary outline.  The details, inevitably, will surprise us all.

The core problem, of course, will be “ambition” – action on the scale that’s actually necessary, rather than the scale that is or appears to be possible.  And here, the legacies of the denialist age – the long-ingrained habits of soft-pedaling and strained optimism – will weigh heavily.  Consider the quasi-official global goal (codified, for example, in the Copenhagen Accord) to hold total planetary warming to 2°C (Earth surface average) above pre-industrial levels.  This is the so-called “2°C target.”  What are we to do with it in the post-denialist age?  Let me count the complications: One, all sorts of Very Important People are now telling us it’s going to be all but impossible to avoid overshooting 2°C.[16]  Two, in so doing, they are making a political and not a scientific judgment, though they’re not always clear on this point.  (It’s probably still technically possible to hold the 2°C line – if we’re not too unlucky – though it wouldn’t be easy under the best of circumstances.)[17]  Three, the 2°C line, which was once taken to be reasonably safe, is now widely seen (at least among the scientists) to mark the approximate point of transition from “dangerous” to “extremely dangerous,” and possibly to altogether unmanageable levels of warming.[18]  Four, and finally, it’s now widely recognized that any future in which we approach the 2°C line (which we will do) is one in which we also have a real possibility of pushing the average global temperature up by 3°C, and if this were to come to pass we’d be playing a very high-stakes game indeed, one in which uncontrolled positive feedbacks and worst-case scenarios were surrounding us on every side.

The bottom line is today as it was decades ago.  Greenhouse-gas emissions were increasing then, and they are increasing now.  In late 2012, the authoritative Global Carbon Project reported that, since 1990, they had risen by an astonishing 58 percent.[19]  The climate system has unsurprisingly responded with storms, droughts, ice-melt, conflagrations and floods.  The weather has become “extreme,” and may finally be getting our attention.  In Australia, according to the acute Mark Thomson of the Institute for Backyard Studies in Adelaide, the crushing heatwave of early 2013 even pushed aside “the idiot commentariat” and cleared the path for a bit of 11th-hour optimism: “Another year of this trend will shift public opinion wholesale.  We’re used to this sort of temperature now and then and even take a perverse pride in dealing with it, but there seems to be a subtle shift in mood that ‘This Could Be Serious.’”  Let’s hope he’s right.  Let’s hope, too, that the mood shift that swept through America after Sandy lasts, and leads us to conclude that ‘This Could Be Serious.’  Not that this alone would be enough to support a real mobilization – the “moral equivalent of war” that we need – but it would be something.  It might even lead us to wonder about our future, and about the influence of money and power on our lives, and to ask how serious things will have to get before it becomes possible to imagine a meaningful change of direction.
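
For a sense of scale, the Global Carbon Project’s 58 percent figure can be converted into an average annual growth rate with one line of arithmetic (a back-of-envelope calculation of my own, not a number from the report itself):

```python
# A 58% rise in emissions over the 22 years from 1990 to 2012 implies a
# compound annual growth rate r satisfying (1 + r) ** 22 = 1.58.
total_growth = 1.58          # 58% increase, as reported
years = 2012 - 1990          # 22 years
annual_rate = total_growth ** (1 / years) - 1
print(f"Implied average annual emissions growth: {annual_rate:.2%}")
```

That works out to roughly 2.1 percent per year, compounding steadily in the wrong direction.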

The wrinkle is that, before we can advocate for a meaningful change of direction, we have to have one we believe in, one that we’re willing to explain in global terms that actually scale to the problem.  None of which is going to be easy, given that we’re fast approaching a point where only tales of existential danger ring true.  (cf. the zombie apocalypse.)  The Arctic ice, as noted above, offers an excellent marker.  In fact, the first famous photos of Earth from space – the “blue marble” photos taken in 1972 by the crew of Apollo 17 – allow us to anchor our predicament in time and in memory.  For these are photos of an old Earth now passed away; they must be, because they show great expanses of ice that are nowhere to be found.  By August of 2012 the Arctic sea-ice cover had declined by 40%,[20] a melt that’s easily large enough to be visible from space.  Moreover, beneath the surface, ice volume is dropping even more precipitously.  The polar researchers who are now feverishly evaluating the great melting haven’t yet pushed the entire scientific community to the edge of despair, though they have managed to inspire a great deal of dark muttering about positive feedbacks and tipping points.  Soon, it seems, that muttering will become louder.  Perhaps as early as 2015, the Arctic Ocean will become virtually ice free for the first time in recorded history.[21]  When it does, the solar absorptivity of the Arctic waters will increase, and shift the planetary heat balance by a surprisingly large amount, and by so doing increase the rate of planetary warming.  And this, of course, will not be the end of it.  The feedbacks will continue.  The cycles will go on.
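
The mechanism is simple enough to sketch.  A toy calculation shows why replacing ice with open water matters (every number below is an illustrative assumption of mine, not a measurement from the article or any dataset):

```python
# Ice-albedo feedback in miniature: sea ice reflects most incoming
# sunlight, open ocean absorbs most of it, so each square meter of lost
# ice increases the power the surface takes up.
albedo_ice = 0.6      # assumed reflectivity of sea ice
albedo_ocean = 0.1    # assumed reflectivity of open water
insolation = 100.0    # assumed mean Arctic insolation, in W/m^2

absorbed_ice = (1 - albedo_ice) * insolation
absorbed_ocean = (1 - albedo_ocean) * insolation
extra_watts = absorbed_ocean - absorbed_ice
print(f"Extra absorbed power per square meter of lost ice: {extra_watts:.0f} W")
```

Under these assumed numbers the absorbed power more than doubles, which is the sense in which a large ice loss can shift the heat balance “by a surprisingly large amount.”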

Should we remain silent about such matters, for fear of inflaming the “idiot commentariat?”  It’s absurd to even ask.  The suffering is already high, and if you know the science, you also know that the real surprise would be an absence of positive feedbacks.  The ice melt, the methane plumes, the drying of the rainforests – they’re all real.  Which is to say that there are obviously tipping points before us, though we do not and cannot know how much time will pass before they force themselves upon our attention.  The real question is what we must do if we would talk of them in good earnest, while at the same time speaking, without despair and effectively, about the human future.


[1] Jorgen Randers, 2052: A Global Forecast for the Next Forty Years, Chelsea Green, 2012, page 99.

[2] Begin at the Carbon Tracker Initiative’s website.  http://www.carbontracker.org/

[3] Two excellent examples: Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, Bloomsbury Press, 2011; and Chris Mooney, The Republican War on Science, Basic Books, 2006.

[4] See, for example, Suzanne Goldenberg, “Secret funding helped build vast network of climate denial thinktanks,” February 14, 2013, The Guardian.

[5] “Lord Monckton,” in particular, is fantastic.  See http://www.youtube.com/watch?v=w833cAs9EN0

[6] Randers, 2012.  See also Randers’ essay and video at the University of Cambridge 2013 “State of Sustainability Leadership,” at http://www.cpsl.cam.ac.uk/About-Us/What-is-Sustainability-Leadership/The-State-of-Sustainability-Leadership.aspx

[7] Ugo Bardi, in The Limits to Growth Revisited (Springer Briefs, 2011) offers this summary:

“If, at the beginning, the debate on LTG had seemed to be balanced, gradually the general attitude on the study became more negative. It tilted decisively against the study when, in 1989, Ronald Bailey published a paper in “Forbes” where he accused the authors of having predicted that the world’s economy should have already run out of some vital mineral commodities whereas that had not, obviously, occurred.

Bailey’s statement was only the result of a flawed reading of the data in a single table of the 1972 edition of LTG. In reality, none of the several scenarios presented in the book showed that the world would be running out of any important commodity before the end of the twentieth century and not even of the twenty-first. However, the concept of the “mistakes of the Club of Rome” caught on. With the 1990s, it became commonplace to state that LTG had been a mistake if not a joke designed to tease the public, or even an attempt to force humankind into a planet-wide dictatorship, as it had been claimed in some earlier appraisals (Golub and Townsend 1977; Larouche 1983). By the end of the twentieth century, the victory of the critics of LTG seemed to be complete. But the debate was far from being settled.”

[8] See, for example, Graham Turner, “A Comparison of The Limits to Growth with Thirty Years of Reality,” Global Environmental Change, Volume 18, Issue 3, August 2008, Pages 397–411.  An unprotected copy (without the graphics) can be downloaded at www.csiro.au/files/files/plje.pdf.

[9] In late 2012, Dennis Meadows said that “In the early 1970s, it was possible to believe that maybe we could make the necessary changes.  But now it is too late.  We are entering a period of many decades of uncontrolled climatic disruption and extremely difficult decline.”  See Christian Parenti, “‘The Limits to Growth’: A Book That Launched a Movement,” The Nation, December 24, 2012.

[11] Eddie Yuen, “The Politics of Failure Have Failed: The Environmental Movement and Catastrophism,” in Catastrophism: The Apocalyptic Politics of Collapse and Rebirth, Sasha Lilley, David McNally, Eddie Yuen, James Davis, with a foreword by Doug Henwood. PM Press 2012.  Yuen’s whole line is “the main reasons that [it] has not led to more dynamic social movements; these include catastrophe fatigue, the paralyzing effects of fear; the pairing of overwhelmingly bleak analysis with inadequate solutions, and a misunderstanding of the process of politicization.” 

[12] See Glenn Scherer, “Special Report: IPCC, assessing climate risks, consistently underestimates,” The Daily Climate, December 6, 2012.   More formally (and more interestingly) see Brysse, Oreskes, O’Reilly, and Oppenheimer, “Climate change prediction: Erring on the side of least drama?,” Global Environmental Change 23 (2013), 327-337.

[13] KQED-FM, Forum, July 22, 2003.

[14] Michael Tobis, editor of Planet 3.0, is amusing on this point.  He notes that “many data-driven climate skeptics are reassessing the issue,” that “In 1996 I defined the turning point of the discussion about climate science (the point where we could actually start talking about policy) as the date when the Wall Street Journal would acknowledge the indisputable and apparent fact of anthropogenic climate change; the year in which it would simply be ridiculous to deny it.  My prediction was that this would happen around 2015… I’m not sure the WSJ has actually accepted reality yet.  It’s just starting to squint in its general direction.  2015 still looks like a good bet.”  See http://planet3.org/2012/08/07/is-the-tide-turning/

[15] The Collected Essays, Journalism and Letters of George Orwell: In Front of Your Nose, 1945-1950, Sonia Orwell and Ian Angus, Editors / Paperback / Harcourt Brace Jovanovich, 1968, p. 125.

[16] See for example, Fatih Birol and Nicholas Stern, “Urgent steps to stop the climate door closing,” The Financial Times, March 9, 2011.  And see Sir Robert Watson’s Union Frontiers of Geophysics Lecture at the 2012 meeting of the American Geophysical Union, at http://fallmeeting.agu.org/2012/events/union-frontiers-of-geophysics-lecture-professor-sir-bob-watson-cmg-frs-chief-scientific-adviser-to-defra/

[17] I just wrote “probably still technically possible.”  I could have written “Excluding the small probability of a very bad case, and the even smaller probability of a very good case, it’s probably still technically possible to hold the 2°C line, though it wouldn’t be easy.”  This, however, is a pretty ugly sentence.  I could also have written “Unless we’re unlucky, and the climate sensitivity turns out to be on the high side of the expected range, it’s still technically possible to hold the 2°C line, though it wouldn’t be easy, unless we’re very lucky, and the climate sensitivity turns out to be on the low side.”  Saying something like this, though, kind of puts the cart before the horse, since I haven’t said anything about “climate sensitivity,” or about how the scientists think about probability – and of course it’s even uglier.  The point, at least for now, is that climate projections are probabilistic by nature, which does not mean that they are merely “uncertain.”  We know a lot about the probabilities.

[18] See Kevin Anderson, a former director of Britain’s Tyndall Center, who has been unusually frank on this point.  His views are clearly laid out in a (non-peer-reviewed) essay published by the Dag Hammarskjold Foundation in Sweden.  See “Climate change going beyond dangerous – Brutal numbers and tenuous hope” in Development Dialog #61, September 2012, available at http://www.dhf.uu.se/wordpress/wp-content/uploads/2012/10/dd61_art2.pdf.  For a peer-reviewed paper, see Anderson and Bows, “Beyond ‘dangerous’ climate change: emission scenarios for a new world.”  Philosophical Transactions of The Royal Society, (2011) 369, 20-44 and for a lecture, see “Are climate scientists the most dangerous climate skeptics?” a Tyndall Centre video lecture (September 2010) at http://www.tyndall.ac.uk/audio/are-climate-scientist-most-dangerous-climate-sceptics.

[19] Glen P. Peters et al., “The challenge to keep global warming below 2°C,” Nature Climate Change 3, 4–6 (2013), doi:10.1038/nclimate1783.  December 2, 2012.  This figure might actually be revised upward, as 2012 saw the second-largest annual concentration increase on record (http://climatedesk.org/2013/03/large-rise-in-co2-emissions-sounds-climate-change-alarm/)

[20] The story of the photos is on Wikipedia – see “blue marble.”  For the latest on the Arctic ice, see the “Arctic Sea Ice News and Analysis” page of the National Snow and Ice Data Center — http://nsidc.org/arcticseaicenews/

[21] Climate Progress is covering the “Arctic Death Spiral” in detail.  See for example Joe Romm, “NOAA: Climate Change Driving Arctic Into A ‘New State’ With Rapid Ice Loss And Record Permafrost Warming,” Climate Progress, Dec 6, 2012.  Give yourself a few hours and follow the links.

Climate Maverick to Retire From NASA (N.Y.Times)

Michael Nagle for The New York Times. James E. Hansen of NASA, retiring this week, reflected in a window at his farm in Pennsylvania.

Published: April 1, 2013

His departure, after a 46-year career at the space agency’s Goddard Institute for Space Studies in Manhattan, will deprive federally sponsored climate research of its best-known public figure.

At the same time, retirement will allow Dr. Hansen to press his cause in court. He plans to take a more active role in lawsuits challenging the federal and state governments over their failure to limit emissions, for instance, as well as in fighting the development in Canada of a particularly dirty form of oil extracted from tar sands.

“As a government employee, you can’t testify against the government,” he said in an interview.

Dr. Hansen had already become an activist in recent years, taking vacation time from NASA to appear at climate protests and allowing himself to be arrested or cited a half-dozen times.

But those activities, going well beyond the usual role of government scientists, had raised eyebrows at NASA headquarters in Washington. “It was becoming clear that there were people in NASA who would be much happier if the ‘sideshow’ would exit,” Dr. Hansen said in an e-mail.

At 72, he said, he feels a moral obligation to step up his activism in his remaining years.

“If we burn even a substantial fraction of the fossil fuels, we guarantee there’s going to be unstoppable changes” in the climate of the earth, he said. “We’re going to leave a situation for young people and future generations that they may have no way to deal with.”

His departure, on Wednesday, will end a career of nearly half a century working not just for a single agency but also in a single building, on the edge of the Columbia University campus.

From that perch, seven floors above the diner made famous by “Seinfeld,” Dr. Hansen battled the White House, testified dozens of times in Congress, commanded some of the world’s most powerful computers and pleaded with ordinary citizens to grasp the basics of a complex science.

His warnings and his scientific papers have drawn frequent attack from climate-change skeptics, to whom he gives no quarter. But Dr. Hansen is a maverick, just as likely to vex his allies in the environmental movement. He supports nuclear power and has taken stands that sometimes undercut their political strategy in Washington.

In the interview and in subsequent e-mails, Dr. Hansen made it clear that his new independence would allow him to take steps he could not have taken as a government employee. He plans to lobby European leaders — who are among the most concerned about climate change — to impose a tax on oil derived from tar sands. Its extraction results in greater greenhouse emissions than conventional oil.

Dr. Hansen’s activism of recent years dismayed some of his scientific colleagues, who felt that it backfired by allowing climate skeptics to question his objectivity. But others expressed admiration for his willingness to risk his career for his convictions.

Initially, Dr. Hansen plans to work out of a converted barn on his farm in Pennsylvania. He has not ruled out setting up a small institute or taking an academic appointment.

He said he would continue publishing scientific papers, but he will no longer command the computer time and other NASA resources that allowed him to track the earth’s rising temperatures and forecast the long-run implications.

Dr. Hansen, raised in small-town Iowa, began his career studying Venus, not the earth. But as concern arose in the 1970s about the effects of human emissions of greenhouse gases, he switched gears, publishing pioneering scientific papers.

His initial estimate of the earth’s sensitivity to greenhouse gases was somewhat on the high side, later work showed. But he was among the first scientists to identify the many ways the planet is likely to respond to rising temperatures and to show how those effects would reinforce one another to produce immense changes in the climate and environment, including a sea level rise that could ultimately flood many of the world’s major cities.

“He’s done the most important science on the most important question that there ever was,” said Bill McKibben, a climate activist who has worked closely with Dr. Hansen.

Around the time Dr. Hansen switched his research focus, in the 1970s, a sharp rise in global temperatures began. He labored in obscurity over the next decade, but on a blistering June day in 1988 he was called before a Congressional committee and testified that human-induced global warming had begun.

Speaking to reporters afterward in his flat Midwestern accent, he uttered a sentence that would appear in news reports across the land: “It is time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here.”

Given the natural variability of climate, it was a bold claim to make after only a decade of rising temperatures, and to this day some of his colleagues do not think he had the evidence.

Yet subsequent events bore him out. Since the day he spoke, not a single month’s temperatures have fallen below the 20th-century average for that month. Half the world’s population is now too young to have lived through the last colder-than-average month, February 1985.

In worldwide temperature records going back to 1880, the 19 hottest years have all occurred since his testimony.

Again and again, Dr. Hansen made predictions that were ahead of the rest of the scientific community and, arguably, a bit ahead of the evidence.

“Jim has a real track record of being right before you can actually prove he’s right with statistics,” said Raymond T. Pierrehumbert, a planetary scientist at the University of Chicago.

Dr. Hansen’s record has by no means been spotless. Even some of his allies consider him prone to rhetorical excess and to occasional scientific error.

He has repeatedly called for trying the most vociferous climate-change deniers for “crimes against humanity.” And in recent years, he stated that excessive carbon dioxide emissions might eventually lead to a runaway greenhouse effect that would boil the oceans and render earth uninhabitable, much like Venus.

His colleagues pointed out that this had not happened even during exceedingly warm episodes in the earth’s ancient past. “I have huge respect for Jim, but in this particular case, he overstated the risk,” said Daniel P. Schrag, a geochemist and the head of Harvard’s Center for the Environment, who is nonetheless deeply worried about climate change.

Climate skeptics have routinely accused Dr. Hansen of alarmism. “He consistently exaggerates all the dangers,” Freeman Dyson, the famed physicist and climate contrarian, told The New York Times Magazine in 2009.

Perhaps the biggest fight of Dr. Hansen’s career broke out in late 2005, when a young political appointee in the administration of George W. Bush began exercising control over Dr. Hansen’s statements and his access to journalists. Dr. Hansen took the fight public and the administration backed down.

For all his battles with conservatives, however, he has also been hard on environmentalists. He was a harsh critic of a failed climate bill they supported in 2009, on the grounds that it would have sent billions into the federal government’s coffers without limiting emissions effectively.

Dr. Hansen agrees that a price is needed on carbon dioxide emissions, but he wants the money returned to the public in the form of rebates on tax bills. “It needs to be done on the basis of conservative principles — not one dime to make the government bigger,” said Dr. Hansen, who is registered as a political independent.

In the absence of such a broad policy, Dr. Hansen has been lending his support to fights against individual fossil fuel projects. Students lured him to a coal protest in 2009, and he was arrested for the first time. That fall he was cited again after sleeping overnight in a tent on the Boston Common with students trying to pressure Massachusetts into passing climate legislation.

“It was just humbling to have that solidarity and support from this leader, this lion among men,” said Craig S. Altemose, an organizer of the Boston protest.

Dr. Hansen says he senses the beginnings of a mass movement on climate change, led by young people. Once he finishes his final papers as a NASA employee, he intends to give it his full support.

“At my age,” he said, “I am not worried about having an arrest record.”

UN Secretary-General Urges Urgency in Creating Global Climate Targets (G1/Globo Natureza)

JC e-mail 4699, April 5, 2013.

Ban Ki-moon said it will be too late if nothing is done by 2015.  That date is the deadline for a global agreement to reduce greenhouse-gas emissions.

UN Secretary-General Ban Ki-moon declared on Wednesday (3) in Monaco that it will be “too late” to save the environment if binding climate measures are not adopted by 2015.

“Words have not been followed by actions.  Soon it will be too late.  Our consumption patterns are incompatible with the health of the planet,” Ban Ki-moon said before an audience of dignitaries.  “We must act now if we want the planet to remain habitable in 2050 for its nine billion people,” he argued.

He was referring to the creation of a new treaty (or protocol) expected to be signed in 2015 and to enter into force in 2020, when the Kyoto Protocol expires.  All countries would then have to meet targets to reduce greenhouse gases and contain the rise in the planet’s temperature.

Of the ninety objectives related to environmental issues adopted by the international community over the past 20 years, only four have seen significant progress, the UN secretary lamented.

Environmental problems

According to the news agency France Presse, he highlighted as current problems the decline of biodiversity, the depletion of fisheries, the growing acidity of the oceans and the rise in greenhouse-gas emissions.  “We have to accelerate our momentum.  We need to develop what we have been testing in test tubes for 40 years.  To do so, we must adopt effective incentive measures, and above all put a price on carbon emissions,” he declared.

“We must also adopt, by 2015, a universal and legally binding instrument on climate, so that all countries take additional measures to reduce the effects of climate change,” the UN secretary-general urged.

Tributes in Monaco

Ban also paid tribute to the Prince Albert II of Monaco Foundation, which “is respected throughout the world for its work on biodiversity, water and the fight against climate change.”

“At a moment when the land and the oceans are under unprecedented pressure, in particular from global population growth and climate change, it is our responsibility to act decisively to prepare for the future,” Prince Albert of Monaco declared in turn.

For the small principality, Ban Ki-moon’s official visit marks the 20th anniversary of Monaco’s entry into the United Nations, on May 28, 1993.  “I fondly remember the pride he felt at that recognition,” the sovereign said, referring to his father, Prince Rainier III.

Ban Ki-moon, who began a European tour this week with a visit to the small principalities of San Marino and Andorra, will also visit Spain and the Netherlands.  On Thursday (4) he will meet with Monaco’s head of government.

Survey Shows Many Republicans Feel America Should Take Steps to Address Climate Change (Science Daily)

Apr. 2, 2013 — In a recent survey of Republicans and Republican-leaning Independents conducted by the Center for Climate Change Communication (4C) at George Mason University, a majority of respondents (62 percent) said they feel America should take steps to address climate change. More than three out of four survey respondents (77 percent) said the United States should use more renewable energy sources, and of those, most believe that this change should begin immediately.

The national survey, conducted in January 2013, asked more than 700 people who self-identified as Republicans and Republican-leaning Independents about energy and climate change.

“Over the past few years, our surveys have shown that a growing number of Republicans want to see Congress do more to address climate change,” said Mason professor Edward Maibach, director of 4C. “In this survey, we asked a broader set of questions to see if we could better understand how Republicans, and Independents who have a tendency to vote Republican, think about America’s energy and climate change situation.”

Other highlights from the survey include the following:

  • Republicans and Republican-leaning Independents prefer clean energy as the basis of America’s energy future and say the benefits of clean energy, such as energy independence (66 percent), saving resources for our children and grandchildren (57 percent), and providing a better life for our children and grandchildren (56 percent), outweigh the costs, such as more government regulation (42 percent) or higher energy prices (31 percent).
  • By a margin of 2 to 1, respondents say America should take action to reduce its fossil fuel use.
  • Only one third of respondents agree with the Republican Party’s position on climate change, while about half agree with the party’s position on how to meet America’s energy needs.
  • A large majority of respondents say their elected representatives are unresponsive to their views about climate change.

“The findings from this survey suggest there is considerable support among conservatives for accelerating the transition away from fossil fuels and toward clean renewable forms of energy, and for taking steps to address climate change,” said Maibach. “Perhaps the most surprising finding, however, is how few of our survey respondents agreed with the Republican Party’s current position on climate change.”

The report can be downloaded at: http://climatechangecommunication.org

The report is based on findings from a nationally representative survey conducted by the George Mason University Center for Climate Change Communication. A total of 726 adults (18+) were interviewed between January 12th and January 27th, 2013. The average margin of error for the survey is +/- 4 percentage points at the 95% confidence level.
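
That figure is consistent with the standard worst-case margin-of-error formula for a simple random sample (the check below is my own arithmetic, not the survey center’s):

```python
import math

# 95% margin of error for a proportion, at the worst case p = 0.5:
# moe = z * sqrt(p * (1 - p) / n), with z = 1.96 for 95% confidence.
n = 726      # respondents, as reported
p = 0.5      # worst-case proportion
z = 1.96     # z-score for 95% confidence
margin = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {margin:.1%}")
```

This gives about ±3.6 percentage points, which rounds to the reported ±4.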

The Tar Sands Disaster (N.Y.Times)

OP-ED CONTRIBUTOR

By THOMAS HOMER-DIXON

Published: March 31, 2013

WATERLOO, Ontario

Rick Froberg

IF President Obama blocks the Keystone XL pipeline once and for all, he’ll do Canada a favor.

Canada’s tar sands formations, landlocked in northern Alberta, are a giant reserve of carbon-saturated energy — a mixture of sand, clay and a viscous low-grade petroleum called bitumen. Pipelines are the best way to get this resource to market, but existing pipelines to the United States are almost full. So tar sands companies, and the Alberta and Canadian governments, are desperately searching for export routes via new pipelines.

Canadians don’t universally support construction of the pipeline. A poll by Nanos Research in February 2012 found that nearly 42 percent of Canadians were opposed. Many of us, in fact, want to see the tar sands industry wound down and eventually stopped, even though it pumps tens of billions of dollars annually into our economy.

The most obvious reason is that tar sands production is one of the world’s most environmentally damaging activities. It wrecks vast areas of boreal forest through surface mining and subsurface production. It sucks up huge quantities of water from local rivers, turns it into toxic waste and dumps the contaminated water into tailing ponds that now cover nearly 70 square miles.

Also, bitumen is junk energy. A joule, or unit of energy, invested in extracting and processing bitumen returns only four to six joules in the form of crude oil. In contrast, conventional oil production in North America returns about 15 joules. Because almost all of the input energy in tar sands production comes from fossil fuels, the process generates significantly more carbon dioxide than conventional oil production.
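
The energy-return comparison can be made concrete in a few lines (my own arithmetic using the figures above; taking 5 as the midpoint of the 4-to-6 range is an assumption):

```python
# Energy return on investment (EROI): E joules of crude per joule of
# input energy, so the fossil-energy input per joule delivered is 1 / E.
def input_energy_fraction(eroi):
    """Input energy burned per joule of oil delivered."""
    return 1 / eroi

tar_sands = input_energy_fraction(5)        # assumed midpoint of the 4-6 range
conventional = input_energy_fraction(15)    # conventional North American oil
print(f"Tar sands input fraction:    {tar_sands:.1%}")
print(f"Conventional input fraction: {conventional:.1%}")
```

On these numbers, each delivered joule of tar sands crude carries roughly three times the production-energy overhead of conventional oil (20 percent versus about 6.7 percent), which is one reason the process emits significantly more carbon dioxide per barrel.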

There is a less obvious but no less important reason many Canadians want the industry stopped: it is relentlessly twisting our society into something we don’t like. Canada is beginning to exhibit the economic and political characteristics of a petro-state.

Countries with huge reserves of valuable natural resources often suffer from economic imbalances and boom-bust cycles. They also tend to have low-innovation economies, because lucrative resource extraction makes them fat and happy, at least when resource prices are high.

Canada is true to type. When demand for tar sands energy was strong in recent years, investment in Alberta surged. But that demand also lifted the Canadian dollar, which hurt export-oriented manufacturing in Ontario, Canada’s industrial heartland. Then, as the export price of Canadian heavy crude softened in late 2012 and early 2013, the country’s economy stalled.

Canada’s record on technical innovation, except in resource extraction, is notoriously poor. Capital and talent flow to the tar sands, while investments in manufacturing productivity and high technology elsewhere languish.

But more alarming is the way the tar sands industry is undermining Canadian democracy. By suggesting that anyone who questions the industry is unpatriotic, tar sands interest groups have made the industry the third rail of Canadian politics.

The current Conservative government holds a large majority of seats in Parliament but was elected in 2011 with only 40 percent of the vote, because three other parties split the center and left vote. The Conservative base is Alberta, the province from which Prime Minister Stephen Harper and many of his allies hail. As a result, Alberta has extraordinary clout in federal politics, and tar sands influence reaches deep into the federal cabinet.

Both the cabinet and the Conservative parliamentary caucus are heavily populated by politicians who deny mainstream climate science. The Conservatives have slashed financing for climate science, closed facilities that do research on climate change, told federal government climate scientists not to speak publicly about their work without approval and tried, unsuccessfully, to portray the tar sands industry as environmentally benign.

The federal minister of natural resources, Joe Oliver, has attacked “environmental and other radical groups” working to stop tar sands exports. He has focused particular ire on groups getting money from outside Canada, implying that they’re acting as a fifth column for left-wing foreign interests. At a time of widespread federal budget cuts, the Conservatives have given Canada’s tax agency extra resources to audit registered charities. It’s widely assumed that environmental groups opposing the tar sands are a main target.

This coercive climate prevents Canadians from having an open conversation about the tar sands. Instead, our nation behaves like a gambler deep in the hole, repeatedly doubling down on our commitment to the industry.

President Obama rejected the pipeline last year but now must decide whether to approve a new proposal from TransCanada, the pipeline company. Saying no won’t stop tar sands development by itself, because producers are busy looking for other export routes — west across the Rockies to the Pacific Coast, east to Quebec, or south by rail to the United States. Each alternative faces political, technical or economic challenges as opponents fight to make the industry unviable.

Mr. Obama must do what’s best for America. But stopping Keystone XL would be a major step toward stopping large-scale environmental destruction, the distortion of Canada’s economy and the erosion of its democracy.

Thomas Homer-Dixon, who teaches global governance at the Balsillie School of International Affairs, is the author of “The Upside of Down: Catastrophe, Creativity and the Renewal of Civilization.”

Unearthed: The Fracking Facade (Top Documentary Films)

A video exposing a flawed claim often repeated in sales pitches promoting shale gas development around the world:

“With a history of 60 years, after nearly a million wells drilled, there are no documented cases that hydraulic fracturing (fracking) has led to the contamination of groundwater.”

Brought to you by the team behind Unearthed, an upcoming South African feature documentary investigating natural gas development, and the controversial extraction method known as fracking, from a global perspective. Should South Africa and other countries drill down?

Watch the full documentary now

The Mathematics of Averting the Next Big Network Failure (Wired)

BY NATALIE WOLCHOVER, SIMONS SCIENCE NEWS

03.19.13 – 9:30 AM

Data: Courtesy of Marc Imhoff of NASA GSFC and Christopher Elvidge of NOAA NGDC; Image: Craig Mayhew and Robert Simmon of NASA GSFC

Gene Stanley never walks down stairs without holding the handrail. For a fit 71-year-old, he is deathly afraid of breaking his hip. In the elderly, such breaks can trigger fatal complications, and Stanley, a professor of physics at Boston University, thinks he knows why.

“Everything depends on everything else,” he said.

Original story reprinted with permission from Simons Science News, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Three years ago, Stanley and his colleagues discovered the mathematics behind what he calls “the extreme fragility of interdependency.” In a system of interconnected networks like the economy, city infrastructure or the human body, their model indicates that a small outage in one network can cascade through the entire system, touching off a sudden, catastrophic failure.

First reported in 2010 in the journal Nature, the finding spawned more than 200 related studies, including analyses of the nationwide blackout in Italy in 2003, the global food-price crisis of 2007 and 2008, and the “flash crash” of the United States stock market on May 6, 2010.

“In isolated networks, a little damage will only lead to a little more,” said Shlomo Havlin, a physicist at Bar-Ilan University in Israel who co-authored the 2010 paper. “Now we know that because of dependency between networks, you can have an abrupt collapse.”

While scientists remain cautious about using the results of simplified mathematical models to re-engineer real-world systems, some recommendations are beginning to emerge. Based on data-driven refinements, new models suggest interconnected networks should have backups, mechanisms for severing their connections in times of crisis, and stricter regulations to forestall widespread failure.

“There’s hopefully some sweet spot where you benefit from all the things that networks of networks bring you without being overwhelmed by risk,” said Raissa D’Souza, a complex systems theorist at the University of California, Davis.

Power, gas, water, telecommunications and transportation networks are often interlinked. When nodes in one network depend on nodes in another, node failures in any of the networks can trigger a system-wide collapse. (Illustration: Leonardo Dueñas-Osorio)

To understand the vulnerability in having nodes in one network depend on nodes in another, consider the “smart grid,” an infrastructure system in which power stations are controlled by a telecommunications network that in turn requires power from the network of stations. In isolation, removing a few nodes from either network would do little harm, because signals could route around the outage and reach most of the remaining nodes. But in coupled networks, downed nodes in one automatically knock out dependent nodes in the other, which knock out other dependent nodes in the first, and so on. Scientists model this cascading process by calculating the size of the largest cluster of connected nodes in each network, where the answer depends on the size of the largest cluster in the other network. With the clusters interrelated in this way, a decrease in the size of one of them sets off a back-and-forth cascade of shrinking clusters.

When damage to a system reaches a “critical point,” Stanley, Havlin and their colleagues find that the failure of one more node drops all the network clusters to zero, instantly killing connectivity throughout the system. This critical point will vary depending on a system’s architecture. In one of the team’s most realistic coupled-network models, an outage of just 8 percent of the nodes in one network — a plausible level of damage in many real systems — brings the system to its critical point. “The fragility that’s implied by this interdependency is very frightening,” Stanley said.
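The cascade described in the two paragraphs above can be sketched in a few dozen lines. The following is a minimal, illustrative simulation, not the published model: it assumes two sparse random networks on the same 2,000 nodes, with node i in each network depending on node i in the other, and it iterates the giant-cluster calculation until the mutually connected cluster stops shrinking.

```python
import random

def giant_component(adj, alive):
    """Largest connected cluster of the subgraph induced by `alive` (DFS)."""
    alive = set(alive)
    seen, best = set(), set()
    for start in alive:
        if start in seen:
            continue
        comp, stack = set(), [start]
        seen.add(start)
        while stack:
            u = stack.pop()
            comp.add(u)
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    stack.append(v)
        if len(comp) > len(best):
            best = comp
    return best

def mutual_cluster(n, adj_a, adj_b, failed):
    """Cascade in two interdependent networks: a node survives only if it
    sits in the largest cluster of BOTH networks; iterate to a fixed point."""
    alive = set(range(n)) - set(failed)
    while True:
        next_alive = giant_component(adj_b, giant_component(adj_a, alive))
        if next_alive == alive:
            return alive
        alive = next_alive

def random_graph(n, avg_degree, rng):
    """Sparse random (Erdos-Renyi-style) graph as an adjacency list."""
    adj = [set() for _ in range(n)]
    for _ in range(n * avg_degree // 2):
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            adj[u].add(v)
            adj[v].add(u)
    return adj

rng = random.Random(0)
n = 2000
net_power = random_graph(n, 4, rng)    # e.g. a power grid
net_telecom = random_graph(n, 4, rng)  # e.g. the telecom net controlling it

results = {}
for frac in (0.05, 0.30, 0.60):
    failed = rng.sample(range(n), int(frac * n))
    results[frac] = len(mutual_cluster(n, net_power, net_telecom, failed)) / n
    print(f"initial failures {frac:.0%} -> surviving mutual cluster {results[frac]:.0%}")
```

With an average degree of about four, the surviving cluster shrinks gracefully for small initial failures and then collapses to essentially nothing once the damage pushes the coupled system past its critical point, the abrupt transition Stanley and Havlin describe.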

However, in another model recently studied by D’Souza and her colleagues, sparse links between separate networks actually help suppress large-scale cascades, demonstrating that network models are not one-size-fits-all. To assess the behavior of smart grids, financial markets, transportation systems and other real interdependent networks, “we have to start from the data-driven, engineered world and come up with the mathematical models that capture the real systems instead of using models because they are pretty and analytically tractable,” D’Souza said.

In a series of papers in the March issue of Nature Physics, economists and physicists used the science of interconnected networks to pinpoint risk within the financial system. In one study, an interdisciplinary group of researchers including the Nobel Prize-winning economist Joseph Stiglitz found inherent instabilities within the highly complex, multitrillion-dollar derivatives market and suggested regulations that could help stabilize it.

Irena Vodenska, a professor of finance at Boston University who collaborates with Stanley, custom-fit a coupled network model to data from the 2008 financial crisis. The analysis she and her colleagues published in February in Scientific Reports showed that modeling the financial system as a network of two networks — banks and bank assets, with each bank linked to the assets it held in 2007 — correctly predicted which banks would fail 78 percent of the time.

“We consider this model as potentially useful for systemic risk stress testing for financial systems,” said Vodenska, whose research is financially supported by the European Union’s Forecasting Financial Crisis program. As globalization further entangles financial networks, she said, regulatory agencies must monitor “sources of contagion” — concentrations in certain assets, for example — before they can cause epidemics of failure. To identify these sources, “it’s imperative to think in the sense of networks of networks,” she said.
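A toy version of this bank-asset idea can be written down directly. The four banks, the 95 percent failure threshold and the proportional fire-sale rule below are illustrative assumptions, not the calibrated model from the Scientific Reports paper: a bank fails once its marked-to-market portfolio loses more than 5 percent of its value, and its dumped holdings depress the price of every asset it held.

```python
def cascade(holdings, price, threshold=0.95):
    """Bank-asset contagion sketch (illustrative assumptions throughout).

    holdings[b][a] : value bank b initially holds of asset a (at price 1.0)
    price[a]       : current unit price of asset a (shock one to start)

    A bank fails when its marked-to-market portfolio drops below
    `threshold` times its initial value; a failed bank's holdings are
    dumped, cutting each asset's price by the dumped share of that asset.
    """
    n_banks, n_assets = len(holdings), len(holdings[0])
    total = [sum(h[a] for h in holdings) for a in range(n_assets)]
    floor = [threshold * sum(h) for h in holdings]
    price = list(price)  # copy so the caller's shock list is untouched
    failed = set()
    while True:
        newly = [b for b in range(n_banks) if b not in failed and
                 sum(h * p for h, p in zip(holdings[b], price)) < floor[b]]
        if not newly:
            return failed
        for b in newly:  # fire sale: dumping depresses prices further
            failed.add(b)
            for a in range(n_assets):
                if holdings[b][a] and total[a]:
                    price[a] *= 1 - holdings[b][a] / total[a]

# Four banks, three assets; bank 0 is concentrated in asset 0.
banks = [[10, 0, 0], [5, 5, 0], [0, 5, 5], [0, 0, 10]]
print(cascade(banks, [0.99, 1.0, 1.0]))  # 1% shock to asset 0: no bank fails
print(cascade(banks, [0.90, 1.0, 1.0]))  # 10% shock: all four banks fail
```

Even at this scale the two-network coupling produces an abrupt transition: a 1 percent shock to a single asset propagates nowhere, while a 10 percent shock to the same asset takes down every bank through the shared-asset links.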

Leonardo Dueñas-Osorio, a civil engineer at Rice, visited a damaged high-voltage substation in Chile after a major earthquake in 2010 to gather information about the power grid’s response to the crisis. (Photo: Courtesy of Leonardo Dueñas-Osorio)

Scientists are applying similar thinking to infrastructure assessment. Leonardo Dueñas-Osorio, a civil engineer at Rice University, is analyzing how lifeline systems responded to recent natural disasters. When a magnitude 8.8 earthquake struck Chile in 2010, for example, most of the power grid was restored after just two days, aiding emergency workers. The swift recovery, Dueñas-Osorio's research suggests, occurred because Chile's power stations immediately decoupled from the centralized telecommunications system that usually controlled the flow of electricity through the grid but was down in some areas. Power stations were operated locally until the damage in other parts of the system subsided.

“After an abnormal event, the majority of the detrimental effects occur in the very first cycles of mutual interaction,” said Dueñas-Osorio, who is also studying New York City’s response to Hurricane Sandy last October. “So when something goes wrong, we need to have the ability to decouple networks to prevent the back-and-forth effects between them.”

D’Souza and Dueñas-Osorio are collaborating to build accurate models of infrastructure systems in Houston, Memphis and other American cities in order to identify system weaknesses. “Models are useful for helping us explore alternative configurations that could be more effective,” Dueñas-Osorio explained. And as interdependency between networks naturally increases in many places, “we can model that higher integration and see what happens.”

Scientists are also looking to their models for answers on how to fix systems when they fail. “We are in the process of studying what is the optimal way to recover a network,” Havlin said. “When networks fail, which node do you fix first?”

The hope is that networks of networks might be unexpectedly resilient for the same reason that they are vulnerable. As Dueñas-Osorio put it, “By making strategic improvements, can we have what amounts to positive cascades, where a small improvement propagates much larger benefits?”

These open questions have the attention of governments around the world. In the U.S., the Defense Threat Reduction Agency, an organization tasked with safeguarding national infrastructure against weapons of mass destruction, considers the study of interdependent networks its “top mission priority” in the category of basic research. Some defense applications have emerged already, such as a new design for electrical network systems at military bases. But much of the research aims at sorting through the mathematical subtleties of network interaction.

“We’re not yet at the ‘let’s engineer the internet differently’ level,” said Robin Burk, an information scientist and former DTRA program manager who led the agency’s focus on interdependent networks research. “A fair amount of it is still basic science — desperately needed science.”


Climate Change Disrupts the Astronomical Forecasts of Amazonian Indians (UOL Notícias)

Carlos A. Moreno

From EFE, in Rio de Janeiro

31/03/2013 – 11h59

Children of a Ticuna village play in the Solimões River, in Amazonas; the Ticuna are one of the tribes affected by climate change. Patrícia Santos – 30.nov.1999/Folhapress

The forecasts that the Indians of the Brazilian Amazon make with the help of the stars, to determine the best moment to plant or fish, among other activities, are being disrupted by climate change, according to a study carried out with different indigenous ethnic groups in Brazil.

"The shamans began to complain that their forecasts were losing accuracy, and from those questions we discovered that some phenomena caused by climate change were affecting their calculations," the astronomer Germano Afonso, coordinator of the study, told Agência Efe.

According to the specialist, who holds a doctorate in Astronomy and Celestial Mechanics from the Université Pierre et Marie Curie in France, the Indians of the Amazon still use ancestral astronomical knowledge to set their calendar and schedule, among other things, the best dates to plant, harvest, hunt, fish and even perform their religious rituals.

Afonso, who built and operates – with the help of the Indians – a solar observatory in the Amazon, explained that the visibility of different constellations, along with their movement across the sky, lets the shamans predict the periods of rain and drought, the flooding of the rivers, the fertility of the land and the spawning of fish.

"However, in the tribes we worked with, the shamans themselves admit that their forecasts were no longer accurate, since the rains arrived early or late and the rivers dried up before the predicted time. The curious thing is that they themselves blamed climate change," said the astronomer, a professor at the Universidade do Estado do Paraná and author of several works on the subject, such as "O Céu dos Índios Tembé".

The team coordinated by Afonso and commissioned by the Fundação de Apoio à Pesquisa no Estado do Amazonas (Fapeam) to study the issue decided to compare the indigenous knowledge of different ethnic groups – Tukano, Tupé, Dessana, Baré, Tuyuka, Baniwa and Tikuna – with the region's meteorological measurements, to try to identify the flaws in the forecasts.

"With this analysis we realized that some phenomena caused by climate change were distorting the forecasts, since the rain arrived late or early because of phenomena such as El Niño and deforestation," said the specialist, who moved to São Gabriel da Cachoeira, an Amazonian city where several ethnic groups converge and where he built the Observatório Solar Indígena.

Afonso clarified that the problem cannot be attributed directly to global warming alone, but also to the phenomena that cause the greenhouse effect and to those it provokes, such as the deforestation of the Amazon, environmental pollution and the construction of dams in the forest.

These phenomena, according to the specialists, alter the periods of rain and of river flooding in the Amazon, which can no longer be predicted from the astronomical knowledge accumulated over centuries and passed down orally among the Indians.

Once the problem was confirmed, the researchers responsible for the study began a project to pass on some scientific knowledge to the shamans and thereby help them correct their forecasts.

"We are using modern astronomical calculations and the information collected by the region's weather stations to help them refine their calculations," Afonso explained.

"We recover the astronomical knowledge that they transmit orally and compare it with scientific data to make some adjustments and allow the forecasts to be more precise," he added.

According to Afonso, with more accurate forecasts the Indians will keep trusting their ability to read the stars and the precision of their knowledge – and, best of all, without moving away from their culture.

"But we only pass on the data that can help them. We go no further than that. We do not want to intrude on, delegitimize or modify anything in their culture. The project has two clear goals: to recover the Indians' astronomical knowledge and to help them improve their forecasts. It is an exchange," the researcher stressed.

According to the astronomer, this exchange was well received because most of his collaborators on the project are indigenous university students, some of them children or grandchildren of chiefs and shamans of the tribes where they were born.

Latour: "It was not written that ecology would become a party" (El País)

INTERVIEW

"It was not written that ecology would become a party"

Sociologist, anthropologist, philosopher and scientific director of the Paris Institute of Political Studies.

Bruno Latour casts an acid, provocative eye on society and the environment.

MIGUEL MORA, 25 MAR 2013 – 11:52 CET

Bruno Latour. / MANUEL BRAUN

Has ecological activism accomplished anything? Have the greens forged a common politics? Do politicians listen to scientists when they warn about climate change? Can the Earth withstand further assaults? The French sociologist, anthropologist and philosopher Bruno Latour (Beaune, 1947) has been reflecting on these questions for more than 20 years, and his prognosis is bleak. In his view, the arrival of ecologists in politics has been a failure because the greens have given up on intelligent debate, while politicians merely apply old recipes without realizing that the revolution has already happened, and it was "a catastrophe": it occurred in 1947, when the world's population exceeded the number that guaranteed access to resources. According to Latour, it is urgent to launch a new way of doing political ecology, based on a constitution that commits governments, scientists and citizens to guaranteeing the Earth's future. This idea is one of the proposals of his book Politics of Nature, published in France in 1999 and now issued in Spanish by RBA.

Latour, with the air of an absent-minded sage, receives El País Semanal in his enormous, chaotic office at the Paris Institute of Political Studies, where he has been scientific director and deputy director since 2007.

QUESTION: This book was published in France 14 years ago. Do you still stand by what you wrote?

ANSWER: Almost all of it, yes. But things have not improved. I have kept working on the same subject, though in a different tone. Today I must be the only one dealing with these questions, with a political philosophy that demands a genuine ecological politics. What did not work out is that I thought it would be a founding book for the ecologists. And it has been a total failure! The ecologists have disappeared.

Q: In France, at least, there are greens in the government.

A: Yes, but they have a very narrow vision of ecology; they reflect neither on the economy nor on society. Ecology is confined to questions of nature, when in reality it has nothing to do with that. One has to choose between nature and politics. Unfortunately, the attempt at an ecological politics has produced nothing good because it was based on the traditional struggle, whose aim was to torpedo politics or, rather, to subjugate it; in a way, the greens act like a tribunal trying to define a kind of sovereignty.

Q: Of moral or natural superiority?

A: Yes, but above all of stupidity. Obviously, taking nature as an end has only weakened the position of the ecologists, who have never been able to do politics; real politics, that is, in the sense of the socialist tradition, from which they should have drawn inspiration. They have not done the work that socialism first, then Marxism, and later social democracy did. There has been no work of intellectual invention, of exploration, at all; they have preferred "the shop window." Perhaps there was no other option, since it was not written that ecology would become a party.

"There is a deep ecology that plays a major role in the US and Germany"

Q: So is ecologism today a kind of activism with no scientific connection?

A: There have been interesting movements built on very concrete cases, important where animals, plants, elephant tusks, water, rivers and so on are concerned. They have also shown great energy on local issues, but without confronting the questions of politics, of life in common. That is why ecologism remains marginal, just when ecological questions have become everyone's business. And there is a paradox: ecology deals with minuscule issues of nature and society, while the question of the Earth, the presence of the Earth in politics, grows ever more pressing. That urgency, already acute 10 or 15 years ago, is far greater now.

Q: Was what was missing perhaps a Green International?

A: It was never formed, because the ecologists thought the Earth would unify all these movements. A host of networks has emerged, built around concrete cases, such as Greenpeace. There are associations, but nothing at the political level. The international arena remains the classic geopolitics of nation states. There has been no reflection on the new situation. There is a deep ecology, practically nonexistent in France, that has played an important role in Germany, in the Scandinavian countries and in North America. But it is very little politicized.

Q: We face a political failure alongside a heightened awareness among scientists. And the citizens?

A: Paradoxically, that painful quarrel over the climate has allowed us to progress. In a way, the dispute has played an important role in a "renewed understanding" of scientific reality on the public's part. The problem is that we try to fit ecological questions into the old "science and politics" model. From this standpoint, even the most advanced scientists keep placing these questions within the framework of that outdated situation I am trying to criticize. That is the theme of the book, and in that sense it is still current.

Q: In France, ecologism is identified with territory. José Bové, for example, is a die-hard protectionist. This evolution of ecology toward nationalism is odd, isn't it?

A: Yes, but at the same time it is useful and interesting to rethink what territory is, the terroir, to use the French word. The ecologists have always been hesitant about whether their attachment to the land is progressive or reactionary, because the French expression can mean very different things. But it matters, because it is one of the dimensions of the ecological question, of the progressive as much as the archaic. That was one of the book's fundamental aims: to find out whether we have ever really been modern. There are regressive aspects in the attachment to the terroir, and at the same time very important ones concerning the definition of limits, of the environments in which we live, which are decisive for the future. Once again, the greens have failed to work on that question. But the problem of orientation, of the difference between a reactionary and a progressive attachment to the land, is fundamental. Looking at movements like Slow Food, we wonder whether they are ahead of their time or behind it, because they have regressive aspects. But think about distribution chains: why should English lasagne be made with Romanian horse meat and pass through 25 intermediaries? It is not a trivial question: once we consider French, Romanian or Turkish horse meat, questions of belonging and of limits become progressive questions.

The iconoclastic anthropologist

Bruno Latour was born in Burgundy, source of the most expensive wines on the planet. His father was a winegrower, hence his peculiar analyses of terroir and tradition. He studied anthropology and sociology. His training is as varied as the institutions where he has taught, from the École des Mines in Paris to the London School of Economics and the chair of History at Harvard.

A tireless writer, he is the author of some thirty books of essays, all the most recent ones published by Harvard, ranging across the earth, society, war, energy, science, technology, modernity and the media.

His latest project is connected to the so-called médialab, a space for developing connections between digital technologies, sociology and science studies.

P: Su libro llama a superar los esquemas de izquierda y derecha. Pero no parece que eso haya cambiado mucho.

R: El debate afronta un gran problema. Hay una inversión de las relaciones entre el marco geográfico y la política: el marco ha cambiado mucho más que la política. Las grandes negociaciones internacionales manifiestan esa inercia de la organización económica, legal y política, mientras que el marco, lo que antes llamábamos la Tierra, la geografía, cambia a velocidad asombrosa. Esa mutación es difícil de comprender por la gente acostumbrada a la historia de antes, en la cual había humanos que se peleaban, como en el siglo XX: hombres haciéndose la guerra dentro de un marco geográfico estable desde la última glaciación. Es una razón demasiado filosófica. Así que preferimos pensar que tenemos tiempo, que todo está en su sitio, que la economía es así, que el derecho internacional es así, etcétera. Pero incluso los términos para señalar las aceleraciones rápidas han cambiado, volcándose hacia la naturaleza y los glaciares. El tiempo que vivimos es el del antropoceno, y las cosas ya no son como antes. Lo que ha cambiado desde que escribí el libro es que en aquel momento no teníamos la noción del antropoceno. Fue una invención muy útil de Crutzen, un climatólogo, pero no existía entonces, me habría ayudado mucho.

P: ¿Y qué fue de su propuesta de aprobar una constitución ecológica?

R: Intenté construir una asociación de parlamentarios y lanzar una constitución para que las cuestiones de la energía empezaran a ser tratadas de otro modo. Intentaba abrir un debate, que naturalmente no ha tenido lugar. El debate sobre la Constitución empezó bien, se consideró una gran invención de la democracia europea. El problema es que ya no se trata de la cuestión de la representación de los humanos, sino que ese debate atañe a los innumerables seres que viven en la Tierra. Me parecía necesario en aquel momento, y ahora más incluso, hacer un debate constitucional. ¿Cómo sería un Parlamento dedicado a la política ecológica? Tendrá que crearse, pero no reflexionamos lo suficiente sobre las cuestiones de fondo.

P: ¿Las grandes conferencias medioambientales resuelven algo?

R: El problema es que la geopolítica organizada en torno a una nación, con sus propios intereses y nivel de agregación, está mal adaptada a las cuestiones ecológicas, que son transnacionales. Todo el mundo sabe eso, los avances no pueden plasmarse ya a base de mapas, no jugamos en territorios clásicos. Así, desde Copenhague 2009 hay una desafección por las grandes cumbres, no solo porque no se consigue decidir nada, sino también porque nos damos cuenta de que el nivel de decisión y agregación política no es el correcto. De hecho, las ciudades, las regiones, las naciones, las provincias, toman a menudo más iniciativas que los Estados.

P: Francia es uno de los países más nuclearizados del mundo. Los ecologistas braman. ¿Le parece bien?

R: Los ecologistas se han obstinado en la cuestión nuclear, pero nadie ha venido a explicarnos por qué lo nuclear es antiecológico, mientras mucha gente seria considera que el átomo es una de las soluciones, a largo plazo no, pero a corto plazo sí. De nuevo estamos ante la ausencia total de reflexión política por parte de los ecologistas, que militan contra lo nuclear sin explicar por qué. Por consiguiente, no hemos avanzado un centímetro. De hecho, en este momento hay un gran debate público sobre la transición energética, y los verdes siguen siendo incapaces de comprender nada, incluso de discutir, porque han moralizado la cuestión nuclear. Cuando se hace ética, no hay que hacer política, hay que hacer religión.

P: ¿Está realmente en cuestión la supervivencia de la especie?

R: La especie humana se las apañará. Nadie piensa que vaya a desaparecer, ¿pero la civilización? No se sabe lo que es una Tierra a seis u ocho grados, no lo hemos conocido. Hay que remontarse centenares de millones de años. El problema no se abordaba con la misma urgencia cuando escribí el libro en 1999, se hablaba aún de las generaciones futuras. Ahora hablamos de nuestros hijos. No hay una sola empresa que haga un cálculo más allá de 2050, es el horizonte más corto que ha habido nunca. La mutación de la historia es increíblemente rápida. Ahora se trata de acontecimientos naturales, mucho más rápidos que los humanos. Es inimaginable para la gente formada en el siglo XX, una novedad total.

P: ¿Es la globalización? ¿O más que eso?

R: Tiene relación con la globalización, pero no por la extensión de las conexiones entre los humanos. Se trata de la llegada de un mundo desagradable que impide la globalización real: es un conflicto entre globos. Nos hemos globalizado, y eso resulta tranquilizador porque todo está conectado y hace de la Tierra un planeta pequeño. Pero que un gran pueblo sea aplastado al chocar con otra cosa tranquiliza menos.

La especie humana se las apañará. nadie piensa que va a desaparecer”

P: ¿Y el malestar que sentimos, la indignación, tiene que ver con ese miedo?

R: Ese catastrofismo siempre ha existido; siempre ha habido momentos de apocalipsis, de literatura de la catástrofe; pero al mismo tiempo existe un sentimiento nuevo: no se trata del apocalipsis de los humanos, sino del final de recursos, en un sentido, creo, literal.

P: ¿Nos hemos zampado el planeta?

R: La gente que analiza el antropoceno dibuja esquemas de este tipo (muestra un famoso gráfico de población y recursos). Esto se llama “la gran aceleración”, ocurrió en 1947. La revolución ya ha tenido lugar, y es una de las causas de esa nueva ansiedad. La gente sigue hablando de la revolución, desesperándose porque no llega, pero ya está aquí. Es un acontecimiento pasado y de consecuencias catastróficas. Eso también nubla la mente de progresistas y reaccionarios. ¿Qué significa vivir en una época en la cual la revolución ha ocurrido ya y cuyos resultados son catastróficos?

P: ¿No querrá decir que la austeridad es la solución?

R: Ya existe el concepto del decrecimiento feliz, no sé si la tienen en España… ¡Sí! Ustedes están muy adelantados sobre decrecimiento.

P: Estamos en plena vanguardia, pero del infeliz.

R: Es uno de los grandes temas del momento, la crisis económica es decrecimiento no deseado, desigualmente repartido; y hay algo más: austeridad no es necesariamente la palabra, sino ascetismo. Sería la visión religiosa, o espiritual, de la austeridad. Eso se mezcla con las nuevas visiones geológicas de los límites que debemos imponernos…

P: ¿Habla del regreso al campo o de reconstruir el planeta?

R: No me refiero a volver al campo, sino a otra Tierra.

P: ¿La tecnología es la única brújula?

R: La tecnología se encuentra en esa misma situación. Existe una solución muy importante de la geoingeniería, que considera que la situación es reversible, que se pueden recrear artificialmente unas condiciones favorables tras haberlas destruido sin saberlo. Así ha surgido un inmenso movimiento de geoingeniería en todas partes. Ya que es la energía de la Tierra, podemos mandar naves espaciales, modificar la acidez de las aguas del mar, etcétera. Hacer algo que contrarreste lo que se hizo mal. Si hemos podido modificar la Tierra, podemos modificarla en el otro sentido, lo que es un argumento peligroso, porque la podemos destrozar por segunda vez.

Q: Won't it regenerate on its own?

A: Yes, but without humans! It will regenerate on its own as long as there are no humans. It can get rid of us, that is one of the hypotheses, by becoming unlivable, but that would not be very positive. The era of limits may go as far as extinction.

Q: Will we come to a bad end?

A: History is not full of favorable examples. We don't know. There is nothing in human nature that favors reflection, so the solution can only be a bad one.

Q: Some fear we will end up devoured by the Chinese.

A: The Chinese have more problems than we do, and they run the risk of eating themselves, because of their soil, their water, and their air. They do not threaten us; they will disappear before we do.

Q: Žižek says our problems stem from the intellectual mediocrity of Germany and France, and that this is the main reason for the current decline. What do you think?

A: That is nonsense. Far more is happening intellectually in Europe than in America, infinitely more. In art, in philosophy, in the sciences, in urbanism, for example. It is senseless to say such things, but then Žižek is an old cretin, a kind of far-left relic, a product of the exhaustion of the far left, of its final decadence, of which he is the symptom. On the other hand, he is a very nice fellow. The far left has been so wrong about the world that in the end all these far-left old men have nothing left to do but vomit on the world, as Alain Badiou does in France.

Q: Do you prefer Marine Le Pen?

A: I am not a politician; I cannot answer that question, it does not interest me.

Q: You don't like talking about politics?

A: I do talk about politics; I have written a book on politics, as far as I know! Las políticas de la naturaleza.

Q: Aren't you interested in everyday politics?

A: Everyday politics, yes, but not party politics; those are superficial agitations, above all in France, where there is no longer truly any politics.

Q: You criticize the far left. Nothing about the far right?

A: It agitates, it tries to clutch at straws, but it does not matter much. That is not where things are really at stake.

Q: Do you think it is residual?

A: No, it is not residual; it can grow and cause damage, as much as the far left. Not thinking always causes damage, but that is not what will solve the problems of the Earth, the economy, cities, transport, and technology.

Q: What scenario do you foresee for 2050? What Earth, what humanity?

A: That is not my job. My job is to prepare us for the wars. The ecological wars are going to be very important, and we have to prepare our armies intellectually and humanly. That is my job.

Q: Will there be violent wars over the climate?

A: The very definition of war is going to change. We are in a situation in which we cannot win against the Earth; it is an asymmetric war: if we win, we lose, and if we lose, we win. This situation therefore creates obligations for a multitude of people, and first of all for intellectuals.

Q: Is that the main battle?

A: If we have no world, we cannot do much, not even the revolution. When you read Marx, you are struck by what he says about humans. In this era, the question of science and of geographical margin, plus the presence of billions of people, forms a crucial scenario. We used to have other problems, but not this one.

Q: So it is a matter of to be or not to be?

A: In every scientific report, the forecasts get worse; the most pessimistic scenario always appears. We have to take that into account. They are extreme forecasts, but for the moment they are the only valid ones. It is not a question of one world war, but of an accumulation of world wars. It is similar to the nuclear winter of the Cold War, a situation of cataclysm, but with some advantages: it is more radical, but slower; we have a great capacity for invention, 9 billion people, and many intelligent minds. But it is also a challenge. It is therefore a matter of high politics, not of nature. Politics comes first.

Q: Do you feel alone?

A: What was complicated in this book was creating the link between science and politics, and I cannot say I have convinced many people. If on top of that you make the link between religion and the arts, it is even more difficult. People like Sloterdijk would be quite capable of understanding it. However, many intellectuals remain in the twentieth century, like Žižek. They stay within a context, within a revolutionary ideal, of disappointment. They are disappointed with humans.

Q: Do you think humans will let themselves be helped?

A: First we have to help the Earth. In the Anthropocene, you can no longer draw the distinction between humans and the Earth.

Q: And are your students ready for the fight?

A: At my school I am the only one who teaches courses on questions that fall outside politics in the classical sense. There are one or two courses on ecological questions. It is my fault; I have not worked hard enough to change things. We are far behind.