Tag archive: Economia

‘Beautiful Game’ becomes ‘Pricey Game’ with World Cup, Confed Cup changing Brazilian soccer (Washington Post/AP)

By Associated Press, Published: May 28

RIO DE JANEIRO — It’s an image as Brazilian as Carnival or Rio’s Christ the Redeemer statue.

Drummers pound out a samba rhythm. Swaying to the beat, fans sing and saunter up and down the aisles waving flags the size of bedsheets, seeming oblivious to the match below.

Little by little this picturesque mayhem in Brazilian soccer stadiums is disappearing, and ticket prices are soaring despite the toned-down version being sold.

The “Beautiful Game” has become the “Pricey Game.”

This year’s Confederations Cup and next year’s World Cup, the first in this South American country in 64 years, are speeding the changes. The national game is getting a different look with the use of numbered seating, a transformation that’s been going on for several years.

This might seem like a small thing, but it’s big in Brazil.

For decades, Brazilians simply raced into the stadiums and grabbed the best spots — some sitting, others standing in a crush amid thousands of others. At the Confederations Cup and World Cup, the seats will be assigned, and they won’t come cheaply. As an example, the least expensive seats for Sunday’s exhibition game between Brazil and England — the first major test event at Rio de Janeiro’s renovated Maracana Stadium — will be 90 reals ($45).

That’s 30 times more than the cheapest seat eight years ago at the historic stadium.

The Brazil-England match comes only days before the opening of the Confederations Cup, the eight-team warmup for the World Cup that starts on June 15. Maracana is the venue for the title game June 30 — and the World Cup final.

“The giant price change means there is a shift concerning the kind of people that are going to the new stadiums,” said Erick Omena de Melo, a native of Rio de Janeiro who is working on a doctorate in city planning at Oxford University in England. “It was previously a much more diverse place in the stadiums. But as the economy in Brazil changes, they are converting these stadiums to a much more middle-class, upper-middle class or even upper-class place that is much less for the lower-middle class and poor.”

Traditional general admission is being eliminated with luxury boxes and modern seating taking over at the six stadiums being used for the Confederations Cup, and the additional six that are to be ready for the World Cup. This change has already filtered down to the country’s heavily indebted club teams and is sure to take some of the spontaneity out of what Brazilians call “futebol” (pronounced foo-chee-BOHL).

Brazilian fans used to play a major role in the drama. These days they’re staying away. Average attendance for matches in Major League Soccer in the United States is higher than attendance for first-division matches in Brazil, which likes to call itself the “Home of Football.”

“What’s being done so far is transferring a European model to Brazil,” said Omena de Melo, who is working on a book about the social history of Maracana. “But Brazil is really different. It’s a totally different atmosphere at a football game. The changes are seen by many as a huge aggression against the traditional fans, the traditional crowds at football matches.”

Officials counter that ticket prices in Brazil are still below European levels, and that new and refurbished stadiums will improve safety in a country where soccer-related crime and violence are common. In addition, Brazil would never have been awarded the World Cup — and the 2016 Olympics in Rio de Janeiro — without a pledge to upgrade crumbling stadiums and tighten security.

The South American country is spending an estimated $3.5 billion on new stadiums and refurbishments, though most of the project has run behind schedule. The need to work 24-7 to finish the venues will run up the costs by millions more. FIFA has complained openly about the delays, acknowledging the Confederations Cup will be a maze of unfinished work.

FIFA Secretary General Jerome Valcke has admitted that “not all operational arrangements will be 100 percent,” then warned “this will be impossible to repeat for the FIFA World Cup.”

The new national stadium in Brasilia opened at a cost of more than $590 million, the most expensive of the 12 World Cup venues. But it has no local team to call it home, and many say it’s a “white elephant.”

It will host the opening of the Confederations Cup on June 15 with Brazil facing Japan.

Another stadium is going up in Manaus in the northern state of Amazonas — again with no local team. It’s the same in the southwestern city of Cuiaba, also without a team in Brazil’s top league.

Brazilian Sports Minister Aldo Rebelo — a Brazilian Communist Party member — defends the stadiums as “centers for sports and nonsports events,” and suggested they would be good places for businesses to hold conventions, shows and fairs.

Omena de Melo countered that the “gentrification” eliminates the diversity.

“Football in Brazil has been a kind of antenna that captures all the different values in Brazilian culture and correlates them into one,” Omena de Melo said. “This sort of informality has existed for a century in these stadiums.”

He used the example of Maracana to show how prices have soared.

The stadium has been closed twice for refurbishment in the last decade. When it was closed in 2005 to be redone for the 2007 Pan American Games, Omena de Melo’s research showed the cheapest ticket was about $1.50.

In 2010, when it was closed again to be refurbished for next year’s World Cup, the cheapest ticket was about $20.

The Maracana was opened again a few weeks ago. Its capacity has been reduced to just under 79,000 — it held more than 170,000 for the final match of the 1950 World Cup — and plans call for it to be eventually shared by Brazilian clubs Flamengo and Fluminense.

In a country where the official minimum monthly salary is $339, the cheapest ticket for the Brazil-England match will be about $45 — 30 times the price of the cheapest ticket only eight years ago and out of reach for most Cariocas, the term for residents of Rio.
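The figures in the paragraph above can be cross-checked with simple arithmetic. A minimal sketch (all numbers come from the article; the 2005 base price is implied by the stated 30-times multiple):

```python
# Back-of-the-envelope check of the ticket-price figures quoted in the article.
cheapest_2013 = 45.0        # USD, cheapest seat for the Brazil-England match
multiple = 30               # "30 times the price ... eight years ago"
cheapest_2005 = cheapest_2013 / multiple

min_monthly_salary = 339.0  # USD, official minimum monthly salary
share_of_salary = cheapest_2013 / min_monthly_salary

print(f"Implied 2005 cheapest ticket: ${cheapest_2005:.2f}")        # $1.50
print(f"Ticket as share of monthly minimum wage: {share_of_salary:.1%}")
```

The second figure — a single cheap seat costing roughly 13% of a month's minimum wage — is what underlies the article's claim that the price is out of reach for many Cariocas.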

Rio de Janeiro sports journalist Telmo Zanini defended the rising prices and said adjusting to the seating changes will be easy in Rio and Sao Paulo in the prosperous southeast, but more difficult in provincial cities.

He cited a recent case in the city of Belo Horizonte “where people took seats and didn’t want to give them up when the ticketholders arrived. So police or stewards had to be called in.”

He said ticket prices had been rising for a long time, and declined to blame the World Cup. Rio de Janeiro and Sao Paulo are two of the world’s most expensive cities. A kilogram (2.2 pounds) of tomatoes recently sold for $6.50 at some Rio de Janeiro supermarkets, where a standard can of shaving cream costs $12. Shaving gel goes for $15.

“Poor people also can’t buy tickets in England or the United States,” Zanini said. “It’s a question of the market. You don’t see poor people buying tickets for Los Angeles Lakers games. The World Cup is not the only reason. Ticket prices have been going up for a long time. But with the World Cup stadiums we will have better quality stadiums. Some people have not gone to games previously because they did not feel safe.”

Marcello Campos, a 29-year-old fan of Rio club Flamengo who goes to at least one match a week, called the changes “a little difficult.”

“It’s going to be a challenge for the people who are used to the low prices; people who don’t have money to buy a ticket for 80 reals ($40) or 100 reals ($50). It’s expensive now.”

He said getting people to stay in numbered seats would be even tougher.

“It’s impossible for me to watch a football game sitting,” Campos said. “I’m too nervous to be sitting. I’ll need to fix that in my mind, to concentrate on sitting.”

He said the changes would be beneficial, imposing organization on chaos.

“We need to change the culture. It kind of gives everyone equal rights, not just those who show up first.”

Benefiting from many of the changes is a multinational consortium that won a contract in May from the state of Rio de Janeiro to run Maracana for 35 years. The consortium is made up of Brazilian construction conglomerate Odebrecht, Los Angeles-based sports and entertainment company AEG, and the sport and entertainment company IMX, which is owned by Brazilian billionaire Eike Batista.

Critics say the deal gives the Rio de Janeiro state government less money than it invested in the venue and will lead to the demolition of an indigenous museum, a public school and some athletics facilities in the area. A public prosecutor estimated that $615 million in public money has been spent on Maracana since 2005, raising questions why a private consortium should reap most of the profits from taxpayer money.

The Brazilian soccer great Pele has come out against the privatization, saying the famous stadium “must be of the people, for the Brazilian people.” Others have also questioned selling off what has been traditionally a public space to private interests.

Omena de Melo cautioned that the new stadiums will not eliminate soccer-related violence.

“Violence tied to football could still be there, even after the gentrification,” he said. “If people can’t get inside the stadiums, they are going to get violent outside. You can’t isolate the stadium from the society where it exists. Brazilian society has a lot of problems caused by inequality, and violence is one of them.”

Geoengineering: Can We Save the Planet by Messing with Nature? (Democracy Now!)

Video: http://www.democracynow.org/2013/5/20/geoengineering_can_we_save_the_planet

Clive Hamilton, professor of public ethics at Charles Sturt University in Canberra, Australia. He is the author of the new book, Earthmasters: The Dawn of the Age of Climate Engineering.

Overheated rhetoric on climate change doesn’t make for good policies (Washington Post)

By Lamar Smith, Published: May 19, 2013

Lamar Smith, a Republican, represents Texas’s 21st District in the U.S. House and is chairman of the House Committee on Science, Space and Technology.

Climate change is an issue that needs to be discussed thoughtfully and objectively. Unfortunately, claims that distort the facts hinder the legitimate evaluation of policy options. The rhetoric has driven some policymakers toward costly regulations and policies that will harm hardworking American families and do little to decrease global carbon emissions. The Obama administration’s decision to delay, and possibly deny, the Keystone XL pipeline is a prime example.

The State Department has found that the pipeline will have minimal impact on the surrounding environment and no significant effect on the climate. Recent expert testimony before the House Committee on Science, Space and Technology confirms this finding. In fact, even if the pipeline is approved and is used at maximum capacity, the resulting increase in carbon dioxide emissions would be a mere 12 one-thousandths of 1 percent (0.012 percent). There is scant scientific or environmental justification for refusing to approve the pipeline, a project that the State Department has also found would generate more than 40,000 U.S. jobs.

Contrary to the claims of those who want to strictly regulate carbon dioxide emissions and increase the cost of energy for all Americans, there is a great amount of uncertainty associated with climate science. These uncertainties undermine our ability to accurately determine how carbon dioxide has affected the climate in the past. They also limit our understanding of how anthropogenic emissions will affect future warming trends. Further confusing the policy debate, the models that scientists have come to rely on to make climate predictions have greatly overestimated warming. Contrary to model predictions, data released in October from the University of East Anglia’s Climate Research Unit show that global temperatures have held steady over the past 15 years, despite rising greenhouse gas emissions.

Among the facts that are clear, however, are that U.S. emissions contribute very little to global concentrations of greenhouse gas, and that even substantial cuts in these emissions are likely to have no effect on temperature. Data from the Energy Information Administration show, for example, that the United States cut carbon dioxide emissions by 12 percent between 2005 and 2012 while global emissions increased by 15 percent over the same period.

Using data from the Intergovernmental Panel on Climate Change (IPCC), a Science and Public Policy Institute paper published last month found that if the United States eliminated all carbon dioxide emissions, the overall impact on global temperature rise would be only 0.08 degrees Celsius by 2050.

Further confounding the debate are unscientific and often hyperbolic claims about the potential effects of a warmer world. In his most recent State of the Union address, President Obama said that extreme weather events have become “more frequent and intense,” and he linked Superstorm Sandy to climate change.

But experts at the National Oceanic and Atmospheric Administration have told the New York Times that climate change had nothing to do with Superstorm Sandy. This is underscored by last year’s IPCC report stating that there is “high agreement” among leading experts that trends in weather disasters, floods, tornados and storms cannot be attributed to climate change. While these claims may make for good political theater, their effect on recent public policy choices hurts the economy.

Last spring the Environmental Protection Agency proposed emissions standards that virtually prohibit new coal-fired power plants. As we await implementation of these strict new rules, additional regulations that will affect existing power plants, refineries and other manufacturers are sure to follow. Analyses of these measures by the American Council for Capital Formation, which studies economic and environmental policy, show that they will raise both electricity rates and gas prices — costing jobs and hurting the economy — even as the EPA admits that these choices will have an insignificant impact on global climate change (a point former EPA administrator Lisa Jackson confessed during a Senate hearing in 2009).

Instead of pursuing heavy-handed regulations that imperil U.S. jobs and send jobs (and their emissions) overseas, we should take a step back from the unfounded claims of impending catastrophe and think critically about the challenge before us. Designing an appropriate public policy response to this challenge will require that we fully assess the facts and the uncertainties surrounding this issue, and that we set aside the hyped rhetoric.

Political Motivations May Have Evolutionary Links to Physical Strength (Science Daily)

May 15, 2013 — Men’s upper-body strength predicts their political opinions on economic redistribution, according to new research published in Psychological Science, a journal of the Association for Psychological Science.

The principal investigators of the research — psychological scientists Michael Bang Petersen of Aarhus University, Denmark and Daniel Sznycer of University of California, Santa Barbara — believe that the link may reflect psychological traits that evolved in response to our early ancestral environments and continue to influence behavior today.

“While many think of politics as a modern phenomenon, it has — in a sense — always been with our species,” says Petersen.

In the days of our early ancestors, decisions about the distribution of resources weren’t made in courthouses or legislative offices, but through shows of strength. With this in mind, Petersen, Sznycer and colleagues hypothesized that upper-body strength — a proxy for the ability to physically defend or acquire resources — would predict men’s opinions about the redistribution of wealth.

The researchers collected data on bicep size, socioeconomic status, and support for economic redistribution from hundreds of people in the United States, Argentina, and Denmark.

In line with their hypotheses, the data revealed that wealthy men with high upper-body strength were less likely to support redistribution, while less wealthy men of the same strength were more likely to support it.

“Despite the fact that the United States, Denmark and Argentina have very different welfare systems, we still see that — at the psychological level — individuals reason about welfare redistribution in the same way,” says Petersen. “In all three countries, physically strong males consistently pursue the self-interested position on redistribution.”

Men with low upper-body strength, on the other hand, were less likely to support their own self-interest. Wealthy men of this group showed less resistance to redistribution, while poor men showed less support.

“Our results demonstrate that physically weak males are more reluctant than physically strong males to assert their self-interest — just as if disputes over national policies were a matter of direct physical confrontation among small numbers of individuals, rather than abstract electoral dynamics among millions,” says Petersen.

Interestingly, the researchers found no link between upper-body strength and redistribution opinions among women. Petersen argues that this is likely due to the fact that, over the course of evolutionary history, women had less to gain, and also more to lose, from engaging in direct physical aggression.

Together, the results indicate that an evolutionary perspective may help to illuminate political motivations, at least those of men.

“Many previous studies have shown that people’s political views cannot be predicted by standard economic models,” Petersen explains. “This is among the first studies to show that political views may be rational in another sense, in that they’re designed by natural selection to function in the conditions recurrent over human evolutionary history.”

Co-authors on this research include Aaron Sell, Leda Cosmides, and John Tooby of the University of California, Santa Barbara.

This research was supported by a grant from the Danish Research Council and a Director’s Pioneer Award from the National Institutes of Health.

Journal Reference:

  1. M. B. Petersen, D. Sznycer, A. Sell, L. Cosmides, J. Tooby. The Ancestral Logic of Politics: Upper-Body Strength Regulates Men’s Assertion of Self-Interest Over Economic Redistribution. Psychological Science, 2013; DOI: 10.1177/0956797612466415

For Insurers, No Doubts on Climate Change (N.Y.Times)

Master Sgt. Mark Olsen/U.S. Air Force, via Associated Press. Damage in Mantoloking, N.J., after Hurricane Sandy. Natural disasters caused $35 billion in private property losses last year.

By EDUARDO PORTER

Published: May 14, 2013

If there were one American industry that would be particularly worried about climate change it would have to be insurance, right?

From Hurricane Sandy’s devastating blow to the Northeast to the protracted drought that hit the Midwest Corn Belt, natural catastrophes across the United States pounded insurers last year, generating $35 billion in privately insured property losses, $11 billion more than the average over the last decade.

And the industry expects the situation will get worse. “Numerous studies assume a rise in summer drought periods in North America in the future and an increasing probability of severe cyclones relatively far north along the U.S. East Coast in the long term,” said Peter Höppe, who heads Geo Risks Research at the reinsurance giant Munich Re. “The rise in sea level caused by climate change will further increase the risk of storm surge.” Most insurers, including the reinsurance companies that bear much of the ultimate risk in the industry, have little time for the arguments heard in some right-wing circles that climate change isn’t happening, and are quite comfortable with the scientific consensus that burning fossil fuels is the main culprit of global warming.

“Insurance is heavily dependent on scientific thought,” Frank Nutter, president of the Reinsurance Association of America, told me last week. “It is not as amenable to politicized scientific thought.”

Yet when I asked Mr. Nutter what the American insurance industry was doing to combat global warming, his answer was surprising: nothing much. “The industry has really not been engaged in advocacy related to carbon taxes or proposals addressing carbon,” he said. While some big European reinsurers like Munich Re and Swiss Re support efforts to reduce CO2 emissions, “in the United States the household names really have not engaged at all.” Instead, the focus of insurers’ advocacy efforts is zoning rules and disaster mitigation.

Last week, scientists announced that the concentration of heat-trapping carbon dioxide in the atmosphere had reached 400 parts per million — its highest level in at least three million years, before humans appeared on the scene. Back then, mastodons roamed the earth, the polar ice caps were smaller and the sea level was as much as 60 to 80 feet higher.

The milestone puts the earth nearer a point of no return, many scientists think, when vast, disruptive climate change is baked into our future. Pieter P. Tans, who runs the monitoring program at the National Oceanic and Atmospheric Administration, told my colleague Justin Gillis: “It symbolizes that so far we have failed miserably in tackling this problem.” And it raises a perplexing question: why hasn’t corporate America done more to sway its allies in the Republican Party to try to avert a disaster that would clearly be devastating to its own interests?

Mr. Nutter argues that the insurance industry’s reluctance is born of hesitation to become embroiled in controversies over energy policy. But perhaps its executives simply don’t feel so vulnerable. Like farmers, who are largely protected from the ravages of climate change by government-financed crop insurance, insurers also have less to fear than it might at first appear.

The federal government covers flood insurance, among the riskiest kind in this time of crazy weather. And insurers can raise premiums or even drop coverage to adjust to higher risks. Indeed, despite Sandy and drought, property and casualty insurance in the United States was more profitable in 2012 than in 2011, according to the Property Casualty Insurers Association of America.

But the industry’s analysis of the risks it faces is evolving. One sign of that is how some top American insurers responded to a billboard taken out by the conservative Heartland Institute, a prominent climate change denier that has received support from the insurance industry.

The billboard had a picture of Theodore Kaczynski, the Unabomber, who asked: “I still believe in global warming. Do you?”

Concerned about global warming and angry to be equated with a murderous psychopath, insurance companies like Allied World, Renaissance Re, State Farm and XL Group dropped their support for Heartland.

Even more telling, Eli Lehrer, a Heartland vice president who at the time led an insurance-financed project, left the group and helped start the R Street Institute, a standard conservative organization in all respects but one: it believes in climate change and supports a carbon tax to combat it. And it is financed largely with insurance industry money.

Mr. Lehrer points out that a carbon tax fits conservative orthodoxy. It is a broad and flat tax, whose revenue can be used to do away with the corporate income tax — a favorite target of the right. It provides a market-friendly signal, forcing polluters to bear the cost imposed on the rest of us and encouraging them to pollute less. And it is much preferable to a parade of new regulations from the Environmental Protection Agency.

“We are having a debate on the right about a carbon tax for the first time in a long time,” Mr. Lehrer said.

Bob Inglis, formerly a Republican congressman from South Carolina who lost his seat in the 2010 primary to a Tea Party-supported challenger, is another member of this budding coalition. Before he left Congress, he proposed a revenue-neutral bill to create a carbon tax and cut payroll taxes.

Changing the political economy of a carbon tax remains an uphill slog, especially in a stagnant economy. But Mr. Inglis notices a thaw. “The best way to do this is in the context of a grand bargain on tax reform,” he said. “It could happen in 2015 or 2016, but probably not before.”

He lists a dozen Republicans in the House and eight in the Senate who would be open to legislation to help avert climate change. He notes that Exelon, the gas and electricity giant, is sympathetic to his efforts — perhaps not least because a carbon tax would give an edge to gas over its dirtier rival, coal. Exxon, too, has said a carbon tax would be the most effective way to reduce emissions. So why hasn’t the insurance industry come on board?

Robert Muir-Wood is the chief research officer of Risk Management Solutions, one of two main companies the insurance industry relies on to crunch data and model future risks. He argues that insurers haven’t changed their tune because — with the exception of 2004 and 2005, when a string of hurricanes from Ivan to Katrina caused damage worth more than $200 billion — they haven’t yet experienced hefty, sustained losses attributable to climate change.

“Insurers were ready to sign up to all sorts of actions against climate change,” Mr. Muir-Wood told me from his office in London. Then the weather calmed down.

Still, Mr. Muir-Wood notes that the insurance industry faces a different sort of risk: political action. “That is the biggest threat,” he said. When insurers canceled policies and raised premiums in Florida in 2006, politicians jumped on them. “Insurers in Florida,” he said, “became Public Enemy No. 1.”

And that’s the best hope for those concerned about climate change: that global warming isn’t just devastating for society, but also bad for business.

Climate slowdown means extreme rates of warming ‘not as likely’ (BBC)

19 May 2013 Last updated at 17:31 GMT

By Matt McGrath – Environment correspondent, BBC News

The impacts of rising temperature are being felt particularly keenly in the polar regions

Scientists say the recent downturn in the rate of global warming will lead to lower temperature rises in the short-term.

Since 1998, there has been an unexplained “standstill” in the heating of the Earth’s atmosphere.

Writing in Nature Geoscience, the researchers say this will reduce predicted warming in the coming decades.

But long-term, the expected temperature rises will not alter significantly.

“The most extreme projections are looking less likely than before” – Dr Alexander Otto, University of Oxford

The slowdown in the expected rate of global warming has been studied for several years now. Earlier this year, the UK Met Office lowered their five-year temperature forecast.

But this new paper gives the clearest picture yet of how any slowdown is likely to affect temperatures in both the short-term and long-term.

An international team of researchers looked at how the last decade would impact long-term equilibrium climate sensitivity and the shorter-term climate response.

Transient nature

Climate sensitivity describes what would happen if concentrations of CO2 in the atmosphere were doubled and the Earth’s oceans and ice sheets were allowed to respond over several thousand years.

Transient climate response is a much shorter-term calculation, again based on a doubling of CO2.

The Intergovernmental Panel on Climate Change reported in 2007 that the short-term temperature rise would most likely be 1-3C (1.8-5.4F).

But this new analysis, based only on temperatures from the last decade, puts the projected range at 0.9-2.0C.

The report suggests that warming in the near term will be less than forecast

“The hottest of the models in the medium-term, they are actually looking less likely or inconsistent with the data from the last decade alone,” said Dr Alexander Otto from the University of Oxford.

“The most extreme projections are looking less likely than before.”

The authors calculate that over the coming decades global average temperatures will warm about 20% more slowly than expected.

But when it comes to the longer term picture, the authors say their work is consistent with previous estimates. The IPCC said that climate sensitivity was in the range of 2.0-4.5C.

Ocean storage

This latest research, including the decade of stalled temperature rises, produces a range of 0.9-5.0C.

“It is a bigger range of uncertainty,” said Dr Otto.

“But it still includes the old range. We would all like climate sensitivity to be lower but it isn’t.”

The researchers say the difference between the lower short-term estimate and the more consistent long-term picture can be explained by the fact that the heat from the last decade has been absorbed into and is being stored by the world’s oceans.

Not everyone agrees with this perspective.

Prof Steven Sherwood, from the University of New South Wales, says the conclusion about the oceans needs to be taken with a grain of salt for now.

“There is other research out there pointing out that this storage may be part of a natural cycle that will eventually reverse, either due to El Nino or the so-called Atlantic Multidecadal Oscillation, and therefore may not imply what the authors are suggesting,” he said.

The authors say there are ongoing uncertainties surrounding the role of aerosols in the atmosphere and around the issue of clouds.

“We would expect a single decade to jump around a bit but the overall trend is independent of it, and people should be exactly as concerned as before about what climate change is doing,” said Dr Otto.

Is there any succour in these findings for climate sceptics who say the slowdown over the past 14 years means global warming is not real?

“None. No comfort whatsoever,” he said.

World Bank turns to hydropower to square development with climate change (Washington Post)

Michael Reynolds/European Pressphoto Agency – World Bank President Jim Yong Kim attends the Fragility Forum this month in Washington. The forum discussed ways for fragile nations to improve their economies, their infrastructure and the well-being of their citizens.

Published: May 8, 2013

The World Bank is making a major push to develop large-scale hydropower projects around the globe, something it had all but abandoned a decade ago but now sees as crucial to resolving the tension between economic development and the drive to tame carbon use.

Major hydropower projects in Congo, Zambia, Nepal and elsewhere — all of a scale dubbed “transformational” to the regions involved — are a focus of the bank’s fundraising drive among wealthy nations. Bank lending for hydropower has scaled up steadily in recent years, and officials expect the trend to continue amid a worldwide boom in water-fueled electricity.

Such projects were shunned in the 1990s, in part because they can be disruptive to communities and ecosystems. But the World Bank is opening the taps for dams, transmission lines and related infrastructure as its president, Jim Yong Kim, tries to resolve a quandary at the bank’s core: how to eliminate poverty while adding as little as possible to carbon emissions.

“Large hydro is a very big part of the solution for Africa and South Asia and Southeast Asia. . . . I fundamentally believe we have to be involved,” said Rachel Kyte, the bank’s vice president for sustainable development and an influential voice among Kim’s top staff members. The earlier move out of hydro “was the wrong message. . . . That was then. This is now. We are back.”

It is a controversial stand. The bank backed out of large-scale hydropower because of the steep trade-offs involved. Big dams produce lots of cheap, clean electricity, but they often uproot villages in dam-flooded areas and destroy the livelihoods of the people the institution is supposed to help. A 2009 World Bank review of hydropower noted the “overwhelming environmental and social risks” that had to be addressed but also concluded that Africa and Asia’s vast and largely undeveloped hydropower potential was key to providing dependable electricity to the hundreds of millions of people who remain without it.

“What’s the one issue that’s holding back development in the poorest countries? It’s energy. There’s just no question,” Kim said in an interview.

Advocacy groups remain skeptical, arguing that large projects, such as Congo’s long-debated network of dams around Inga Falls, may be of more benefit to mining companies or industries in neighboring countries than poor communities.

“It is the old idea of a silver bullet that can modernize whole economies,” said Peter Bosshard, policy director of International Rivers, a group that has organized opposition to the bank’s evolving hydro policy and argued for smaller projects designed around communities rather than mega-dams meant to export power throughout a region.

“Turning back to hydro is being anything but a progressive climate bank,” said Justin Guay, a Sierra Club spokesman on climate and energy issues. “There needs to be a clear shift from large, centralized projects.”

The major nations that support the World Bank, however, have been pushing it to identify such projects — complex undertakings that might happen only if an international organization is involved in sorting out the financing, overseeing the performance and navigating the politics.

The move toward big hydro comes amid Kim’s stark warning that global warming will leave the next generation with an “unrecognizable planet.” That dire prediction, however, has left him struggling to determine how best to respond and frustrated by some of the bank’s inherent limitations.

In his speeches, Kim talks passionately about the bank’s ability to “catalyze” and “leverage” the world to action by mobilizing money and ideas, and he says he is hunting for ideas “equal to the challenge” of curbing carbon use. He has criticized the “small bore” thinking that he says has hobbled progress on the issue.

However, the bank remains in the business of financing traditional fossil-fuel plants, including those that use the dirtiest form of coal, as well as cleaner but carbon-based natural gas infrastructures.

Among the projects likely to cross Kim’s desk in coming months, for example, is a 600-megawatt power plant in Kosovo that would be fired by lignite coal, the bottom of the barrel when it comes to carbon emissions.

The plant has strong backing from the United States, the World Bank’s major shareholder. It also meshes with one of the bank’s other long-standing imperatives: Give countries what they ask for. The institution has 188 members to keep happy and can go only so far in trying to impose its judgment over that of local officials. Kim, who in his younger days demonstrated against World Bank-enforced “orthodoxy” in economic policy, now may be hard-pressed to enforce an energy orthodoxy of his own.

Kosovo’s domestic supplies of lignite are ample enough to free the country from imported fuel. Kim said there is little question that Kosovo needs more electricity, and the new plant will allow an older, more polluting facility to be shut down.

“I would just love to never sign a coal project,” Kim said. “We understand it is much, much dirtier, but . . . we have 188 members. . . . We have to be fair in balancing the needs of poor countries . . . with this other bigger goal of tackling climate change.”

The bank is working on other ideas. Kim said he is considering how it might get involved in creating a more effective world market for carbon, allowing countries that invest in renewable energy or “climate friendly” agriculture to be paid for their carbon savings by industries that need to use fossil fuels. Existing carbon markets have been plagued with volatile pricing — Europe’s cost of carbon has basically collapsed — or rules that prevent carbon trading with developing countries.

“We’ve got to figure out a way to establish a stable price of carbon,” Kim said. “Everybody knows that.”

He has also staked hope for climate progress on developments in agriculture.

Hydropower projects, however, seem notably inside what Kim says is the bank’s sweet spot — complex, high-impact, green and requiring the sort of joint public and private financing Kim says the bank can attract.

The massive hydropower potential of the Congo River, estimated at about 40,000 megawatts, is such a target. Its development is on a list of top world infrastructure priorities prepared by the World Bank and other development agencies for the Group of 20 major economic powers.

Two smaller dams on the river have been plagued by poor performance and are being rehabilitated with World Bank assistance. A third being planned would represent a quantum jump — a 4,800-megawatt, $12 billion giant that would move an entire region off carbon-based electricity.

The African Development Bank has begun negotiations over the financing, and the World Bank is ready to step in with tens of millions of dollars in technical-planning help.

“In an ideal world, we start building in 2016. By 2020, we switch on the lights,” said Hela Cheikhrouhou, energy and environment director for the African Development Bank.

It is the sort of project that the World Bank had stayed away from for many years — not least because of instability in the country. But as the country tries to move beyond its civil war and the region intensifies its quest for the power to fuel economic growth, the bank seems ready to move. Kim will visit Congo this month for a discussion about development in fragile and war-torn states.

Kyte, the World Bank vice president, said the Inga project will be high on the agenda.

“People have been looking at the Inga dam for as long as I have been in the development business,” she said. “The question is: Did the stars align? Did you have a government in place? Did people want to do it? Are there investors interested? Do you have the ability to do the technical work? The stars are aligned now. Let’s go.”

European carbon market in trouble (Washington Post)

Published: May 5

LONDON — As the centerpiece of Europe’s pledge to lead the global battle against climate change, the region’s market for carbon emissions effectively turned pollution into a commodity that could be traded like gold or oil. But the once-thriving pollution trade here has turned into a carbon bust.

Under the system, 31 nations slapped emission limits on more than 11,000 companies and issued carbon credits that could be traded by firms to meet their new pollution caps. More efficient ones could sell excess carbon credits, while less efficient ones were compelled to buy more. By August 2008, the price for carbon emission credits had soared above $40 per ton — high enough to become an added incentive for some companies to increase their use of cleaner fuels, upgrade equipment and take other steps to reduce carbon footprints.

Europe’s carbon-trading market

That system, however, is in deep trouble. A drastic drop in industrial activity has sharply reduced the need for companies to buy emission rights, causing a gradual fall in the price of carbon allowances since the region slipped into a multi-year economic crisis in the latter half of 2008. In recent weeks, however, the price has appeared to have entirely collapsed — falling below $4 as bickering European nations failed to agree on measures to shore up the program.

The collapsing price of carbon in Europe is darkening the outlook for a greener future in a part of the world that was long the bright spot in the struggle against climate change. It is also presenting new challenges for those who once saw Europe’s program as the natural anchor for what would eventually be a linked network of cap-and-trade systems worldwide.

Carbon “started as the commodity of the future, but it has now deteriorated,” said Matthew Gray, a trader at Jefferies Bache in London and one of a diminishing breed of carbon dealers in Europe. “Its future is uncertain.”

The problems plaguing Europe’s cap-and-trade system underscore the uphill battle for international cooperation in the global-warming fight. After middling progress at various summits, officials from more than 190 countries have been charged with forging a global accord by 2015 aimed at cutting carbon emissions. But critics point to the inability of even the European Union — a largely progressive region bound by open borders and a shared bureaucracy — to come together on a fix for its cap-and-trade system as evidence of how difficult consensus building on climate change has become.

Negotiations to launch a similar system across the United States collapsed in 2010, replaced with a regional approach in which California, for instance, moved forward with its own program. Aided by a boom in cheaper and cleaner shale gas as well as the spread of more renewable energies, including wind and solar, the United States has — like Europe — nevertheless seen a continuing drop in its overall emission levels.

But there are also signs that years of increasing investment in clean energies are ebbing on both sides of the Atlantic. In 2012, overall clean-energy investment in the United States fell 37 percent, to $35.6 billion, compared with a year earlier, according to a new report by the Pew Charitable Trusts. European countries, including green leaders such as Germany, also saw declines, leading analysts to call the problems with the region’s cap-and-trade system that much more troubling.

“Obviously, what’s happening now is very disheartening for people who have been involved in trying to cut carbon emissions,” said Agustin Silvani, managing director of carbon finance at Conservation International in Arlington, Va. “The European system was at the center of the global fight, and the fact that it is collapsing is definitely a blow. Maybe a moral one more than anything else.”

Lost incentive

The cap-and-trade program is based on a system of carbon allowances for large emitters such as utilities and manufacturers, with some bought and others awarded for free. Companies are allowed to draw on global mitigation projects — such as planting trees in tropical rain forests — to offset a small portion of their emissions. But for the most part, they must meet targets through carbon credits issued by European authorities.

A number of other factors, including mandates and subsidies for renewable energy, have coaxed European companies to reduce their emissions in recent years. But in the early stages of the cap-and-trade program, “higher carbon prices were a big incentive for companies to take action,” said Marcus Ferdinand, senior market analyst for Thomson Reuters Point Carbon. “Now, they’ve lost that incentive.”

At the core of the problem is a massive oversupply of carbon allowances. Demand for carbon began to fade in the late 2000s as a recession set in and factories across Europe dramatically curbed production. But there were also built-in flaws. Unlike newer cap-and-trade programs such as the one in California, Europe’s system never established a price floor that could have prevented a market collapse. In addition, too many free allowances were given to too many companies. Some, in fact, never had to pay for allowances at all, allowing them to hoard the credits or even sell them at a profit.
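The interplay described above — oversupply driving prices toward zero unless a floor binds — can be illustrated with a toy sketch. This is not a model of the actual EU ETS or California program; the demand curve and all numbers below are hypothetical, chosen only to show the mechanism.

```python
# Toy illustration (hypothetical numbers, not the actual EU ETS):
# a linear demand curve for allowances, with and without a price floor.

def clearing_price(supply, demand_at_zero, slope, price_floor=0.0):
    """Price at which demand for allowances equals a fixed supply.

    Demand falls linearly with price: demand(p) = demand_at_zero - slope * p.
    Without a floor, a large oversupply drives the price toward zero;
    a floor (as in California's design) caps how far it can fall.
    """
    p = (demand_at_zero - supply) / slope  # solve demand(p) == supply
    return max(p, price_floor)

# Hypothetical figures: recession cuts demand while issued supply stays high.
supply = 2_000          # allowances issued (millions of tons)
demand_at_zero = 2_200  # tons firms would emit if allowances were free
slope = 50              # tons of demand lost per euro of price

print(clearing_price(supply, demand_at_zero, slope))        # 4.0 (depressed price)
print(clearing_price(supply - 800, demand_at_zero, slope))  # 20.0 (surplus withdrawn)
print(clearing_price(supply, demand_at_zero, slope, price_floor=10.0))  # 10.0 (floor binds)
```

The middle call mirrors the "backloading" idea discussed later in the article: withdrawing part of the surplus raises the clearing price; the last call shows how a minimum price prevents collapse even when the surplus remains.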

On April 16, the European Parliament was on the verge of temporarily tightening the supply of allowances to boost the price of carbon and shore up the ailing market. But opposition by countries led by Poland — a nation strongly dependent on heavy-emitting coal power plants — defeated the measure. The rejection sent the price of carbon plummeting to a historic low of roughly $3.60.

Shoring up prices

A bright future for cap-and-trade systems may yet exist. Promising new programs, for instance, are being rolled out in California, Australia, Quebec and a few provinces in China, with officials in some areas setting a minimum price for carbon credits to prevent the kind of market collapse seen in Europe.

But if Europe is unable to shore up the price for carbon credits here, observers say, it could complicate hopes down the line of linking various programs together. The price per ton in California, for instance, is above $10 — about two and a half times the price in Europe.

Large emitters such as the steel industry, however, say the system is working just fine. With a price determined by supply and demand, industry groups say, it is only fitting for the price to be low now. Also, given the region’s weaker economic activity, they note that the European Union is still virtually assured of meeting its pledge to cut carbon emissions — a reduction of 20 percent by 2020 compared with 1990 levels — even with the cap-and-trade system faltering.

Yet critics argue that the low price of carbon has removed the incentive for European companies to reduce their carbon footprints. They point to a boom in the use of cheap imported American coal in European power plants. In addition, many fear that the lack of an incentive to make more green upgrades will create a boom in emissions if and when European economies recover.

As the regional plan falters, some countries are going it alone on domestic initiatives. This year, for instance, Britain introduced a carbon tax on emissions that British manufacturers say has put them at a competitive disadvantage with their counterparts on the continent. It suggests the potential pitfalls ahead as countries and even smaller jurisdictions such as states, provinces and cities introduce a disparate patchwork of climate-change measures.

Optimists point to hope that the European Parliament will once again vote on a measure to tighten the supply of carbon credits in the coming months, thus shoring up the price. They also note that the European Commission is studying more ambitious proposals for a bigger overhaul of the region’s cap-and-trade system.

But given the growing resistance in some European countries to anything that might drive energy costs up further, others wonder whether Europe’s leaders still have the political will to take aggressive action.

“We’re risking the credibility of European politicians by not fixing this system,” said Johannes Teyssen, chief executive of German energy giant E.ON. “How can they travel to world climate-change conferences claiming others should do more when our own system is on its deathbed and they do nothing?”

Eliza Mackintosh contributed to this report.

*   *   *

In Europe, Paid Permits for Pollution Are Fizzling (N.Y.Times)

Andrew Testa for The International Herald Tribune. The trading floor at CF Partners in West London. The market for carbon permits is more volatile than its founders envisioned.

By STANLEY REED and MARK SCOTT

Published: April 21, 2013

LONDON — On a showery afternoon last week in West London, a ripple of enthusiasm went through the trading floor of CF Partners, a privately owned financial company. The price of carbon allowances, shown in green lights on a board hanging from the ceiling, was creeping up toward three euros.

[Chart note: The Emissions Trading System began with a test phase that ended in 2007. Data are for the futures contract expiring in mid-December each year; the Phase 2 price was initially for the December 2008 futures contract.]

That is pretty small change — $3.90, or only about 10 percent of what the price was in 2008. But to the traders it came as a relief after the market had gone into free fall to record lows two days earlier, after the European Parliament spurned an effort to shore up prices by shrinking the number of allowances.

“The market still stands,” said Thomas Rassmuson, a native of Sweden who founded the company with Jonathan Navon, a Briton, in 2006.

Still, Europe’s carbon market, a pioneering effort to use markets to regulate greenhouse gases, is having a hard time staying upright. This year has been stomach-churning for the people who make their living in the arcane world of trading emissions permits. The most recent volatility comes on top of years of uncertainty during which prices have fluctuated from $40 to nearly zero for the right to emit one ton of carbon dioxide.

More important, though, than lost jobs and diminished payouts for traders and bankers, the penny ante price of carbon credits means the market is not doing its job: pushing polluters to reduce carbon emissions, which most climate scientists believe contribute to global warming.

The market for these credits, officially called European Union Allowances, or E.U.A.’s, has been both unstable and under sharp downward pressure this year because of a huge oversupply and a stream of bad political and economic news. On April 16, for instance, after the European Parliament voted down the proposed reduction in the number of credits, prices dropped about 50 percent, to 2.63 euros from nearly 5, in 10 minutes.

“No one was going to buy” on the way down, said Fred Payne, a trader with CF Partners.

Europe’s troubled experience with carbon trading has also discouraged efforts to establish large-scale carbon trading systems in other countries, including the United States, although California and a group of Northeastern states have set up smaller regional markets.

Traders do not mind big price swings in any market — in fact, they can make a lot of money if they play them right.

But over time, the declining prices for the credits have sapped the European market of value, legitimacy and liquidity — the ease with which the allowances can be traded — making it less attractive for financial professionals.

A few years ago, analysts thought world carbon markets were heading for the $2 trillion mark by the end of this decade.

Today, the reality looks much more modest. Total trading last year was 62 billion euros, down from 96 billion in 2011, according to Thomson Reuters Point Carbon, a market research firm based in Oslo. Close to 90 percent of that activity was in Europe, while North American trading represented less than 1 percent of worldwide market value.

Financial institutions that had rushed to increase staff have shrunk their carbon desks. Companies have also laid off other professionals who helped set up greenhouse gas reduction projects in developing countries like China and India.

When the emissions trading system was started in 2005, the goal was to create a global model for raising the costs of emitting greenhouse gases and for prodding industrial polluters to switch from burning fossil fuels to using clean-energy alternatives like wind and solar.

When carbon prices hit their highs of more than 30 euros in 2008 and companies spent billions to invest in renewables, policy makers hailed the market as a success. But then prices began to fall. And at current levels, they are far too low to change companies’ behaviors, analysts say. Emitting a ton of carbon dioxide costs about the same as a hamburger.

“At the moment, the carbon price does not give any signal for investment,” said Hans Bünting, chief executive of RWE, one of the largest utilities in Germany and Europe.

This cap-and-trade system in Europe places a ceiling on emissions. At the end of each year, companies like electric utilities or steel manufacturers must hand over to the national authorities permits equivalent to the amount of gases they emitted.

Until the end of 2012, these credits were given to companies free according to their estimated output of greenhouse gases. Policy makers wanted to jump-start the trading market and avoid higher costs for consumers.

Beginning this year, energy companies must buy an increasing proportion of their credits in national auctions. Industrial companies like steel plants will follow later this decade.

Companies and other financial players like banks and hedge funds can also acquire and trade the allowances on exchanges like the IntercontinentalExchange, based in Atlanta. Over time the number of credits is meant to fall gradually, theoretically raising prices and cutting pollution.
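The compliance cycle just described — surrender one allowance per ton emitted, with short firms buying and long firms selling or banking — reduces to simple position accounting. The sketch below is purely illustrative; the firm names and figures are invented, not drawn from any real compliance data.

```python
# Toy compliance sketch (invented figures): at year end each firm must
# surrender one allowance per ton of CO2 emitted. Firms holding fewer
# allowances than their emissions must buy; firms with a surplus can
# sell on an exchange or bank the excess for later years.

firms = {
    # name: (tons emitted, allowances held) -- hypothetical
    "utility":    (950, 800),
    "steel_mill": (400, 600),
    "cement":     (300, 300),
}

def net_positions(firms):
    """Allowances each firm must buy (negative) or may sell/bank (positive)."""
    return {name: held - emitted for name, (emitted, held) in firms.items()}

positions = net_positions(firms)
print(positions)  # {'utility': -150, 'steel_mill': 200, 'cement': 0}

# Total allowances that must be bought on the market this compliance year:
to_buy = -sum(v for v in positions.values() if v < 0)
print(to_buy)  # 150
```

When free allocation is generous across the board — the "too many credits" flaw the article describes — nearly every firm ends up with a positive position, and there are few buyers to absorb the surplus.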

The reality has been far different because of serious flaws in the design of the system. To win over companies and skeptical countries like Poland, which burn a lot of coal, far too many credits have been handed out.

At the same time, Europe’s debilitating economic slowdown has sharply curtailed industrial activity and reduced the Continent’s overall carbon emissions.

Steel making in Europe, for instance, has fallen about 30 percent since 2007, while new car registrations were at their lowest level last year since 1995.

Big investments in renewable energy sources like wind and solar also reduced carbon emissions, which have fallen about 10 percent in Europe since 2007.

As a result, there is a vast surplus of permits — about 800 million tons’ worth, according to Point Carbon. That has caused prices to plunge.

The cost of carbon is far too low to force electric utilities in Europe to switch from burning coal, a major polluter, to much cleaner natural gas. Just the opposite: Britain increased coal burning for electricity more than 30 percent last year, while cutting back gas use a similar amount, and other West European nations increased their coal use as well.

“The European energy scene is not a good one,” said Andrew Brown, head of exploration and production at Royal Dutch Shell. “They haven’t got the right balance in terms of promoting gas.”

Fearing that prices might go to zero because of the huge oversupply, the European authorities proposed a short-term solution known as backloading, which would have delayed the scheduled auctioning of a large portion of the credits that were supposed to be sold over the next three years. But the European Parliament in Strasbourg voted the measure down on April 16.

Lawmakers were worried about tampering with the market as well as doing anything that might increase energy costs in the struggling economy.

“It was the worst possible moment to try to implement something like that,” said Francesco Starace, chief executive of Enel Green Power, one of the largest European green-energy companies, which is based in Rome.

The European authorities, led by Connie Hedegaard, the European commissioner for climate change, have not given up on fixing the system. But analysts like Stig Scholset, at Point Carbon, say that there is not much the authorities can do in the short term and that prices may slump for months, if not years.

That means more tough times for financial institutions. Particularly troubled is the business of investing in greenhouse gas abatement projects like wind farms or hydroelectric dams in developing countries like China. JPMorgan Chase paid more than $200 million for one of the largest investors in these projects, EcoSecurities, in 2009.

Financiers say these projects used to be gold mines, generating credits that industrial companies could use to offset their emissions elsewhere. But so many credits have been produced by these projects — on top of the existing oversupply of credits in Europe — that they are trading at about a third of a euro.

Market participants say they see many rivals pulling back from world carbon markets. Deutsche Bank, the largest bank in Germany, has cut back its carbon trading. Smaller outfits like Mabanaft, based in Rotterdam, have also left the business.

Anthony Hobley, a lawyer in London and president of the Climate Market and Investors Association, an industry group, estimates that among the traders, analysts and bankers who flocked to the carbon markets in the early days, half may now be gone.

But carbon trading is unlikely to fade completely.

For one thing, European utilities and other companies now must buy the credits to comply with the rules. And they can buy credits to save for later use, when their emissions increase and the price of credits rises.

Despite Europe’s sputters, carbon trading is beginning to gain traction in places like China, Australia and New Zealand.

In London, Mr. Rassmuson concedes that the business has turned out to be more up-and-down than he anticipated when he and his partner set up their firm in a tiny two-man office in 2006.

But he said his firm was benefiting from others’ dropping out. He is also branching out into trading electric power and natural gas.

Like many in the carbon markets, he says what he is doing is not just about money.

“Trying to make the world more sustainable is important to us,” he said. “It is a good business opportunity that makes us proud.”

A version of this article appeared in print on April 22, 2013, on page B1 of the New York edition with the headline: In Europe, Paid Permits For Pollution Are Fizzling.

Conservative Koch Brothers Turning Focus to Newspapers (N.Y.Times)

Tannen Maury/European Pressphoto Agency. Tribune’s newspapers, including The Chicago Tribune, have caught the interest of a number of suitors.

By AMY CHOZICK

Published: April 20, 2013

Three years ago, Charles and David Koch, the billionaire industrialists and supporters of libertarian causes, held a seminar of like-minded, wealthy political donors at the St. Regis Resort in Aspen, Colo. They laid out a three-pronged, 10-year strategy to shift the country toward a smaller government with less regulation and taxes.

Kevork Djansezian/Getty Images. The Los Angeles Times is the fourth-largest paper in the country.

The first two pieces of the strategy — educating grass-roots activists and influencing politics — were not surprising, given the money they have given to policy institutes and political action groups. But the third one was: media.

Other than financing a few fringe libertarian publications, the Kochs have mostly avoided media investments. Now, Koch Industries, the sprawling private company of which Charles G. Koch serves as chairman and chief executive, is exploring a bid to buy the Tribune Company’s eight regional newspapers, including The Los Angeles Times, The Chicago Tribune, The Baltimore Sun, The Orlando Sentinel and The Hartford Courant.

By early May, the Tribune Company is expected to send financial data to serious suitors in what will be among the largest sales of newspapers by circulation in the country. Koch Industries is among those interested, said several people with direct knowledge of the sale who spoke on the condition they not be named. Tribune emerged from bankruptcy on Dec. 31 and has hired JPMorgan Chase and Evercore Partners to sell its print properties.

The papers, valued at roughly $623 million, would be a financially diminutive deal for Koch Industries, the energy and manufacturing conglomerate based in Wichita, Kan., with annual revenue of about $115 billion.

Politically, however, the papers could serve as a broader platform for the Kochs’ laissez-faire ideas. The Los Angeles Times is the fourth-largest paper in the country and The Tribune is No. 9; others are in several battleground states, including two of the largest newspapers in Florida, The Orlando Sentinel and The Sun Sentinel in Fort Lauderdale. A deal could include Hoy, the second-largest Spanish-language daily newspaper, which speaks to the pivotal Hispanic demographic.

One person who attended the Aspen seminar who spoke on the condition of anonymity described the strategy as follows: “It was never ‘How do we destroy the other side?’ ”

“It was ‘How do we make sure our voice is being heard?’ ”

Guests at the Aspen seminar included Philip F. Anschutz, the Republican oil mogul who owns the companies that publish The Washington Examiner, The Oklahoman and The Weekly Standard, and the hedge fund executive Paul E. Singer, who sits on the board of the political magazine Commentary. Attendees were asked not to discuss details about the seminar with the press.

A person who has attended other Koch Industries seminars, which have taken place since 2003, says Charles and David Koch have never said they want to take over newspapers or other large media outlets, but they often say “they see the conservative voice as not being well represented.” The Kochs plan to host another conference at the end of the month, in Palm Springs, Calif.

At this early stage, the thinking inside the Tribune Company, the people close to the deal said, is that Koch Industries could prove the most appealing buyer. Others interested, including a group of wealthy Los Angeles residents led by the billionaire Eli Broad and Ronald W. Burkle, both prominent Democratic donors, and Rupert Murdoch’s News Corporation, would prefer to buy only The Los Angeles Times.

The Tribune Company has signaled it prefers to sell all eight papers and their back-office operations as a bundle. (Tribune, a $7 billion media company that also owns 23 television stations, could also decide to keep the papers if they do not attract a high enough offer.)

Koch Industries is one of the largest sponsors of libertarian causes — including the financing of policy groups like the Cato Institute in Washington and the formation of Americans for Prosperity, the political action group that helped galvanize Tea Party organizations and their causes. The company has said it has no direct link to the Tea Party.

This month a Koch representative contacted Eddy W. Hartenstein, publisher and chief executive of The Los Angeles Times, to discuss a bid, according to a person briefed on the conversation who spoke on the condition of anonymity because the conversation was private. Mr. Hartenstein declined to comment.

Koch Industries recently brought on Angela Redding, a consultant based in Salt Lake City, to analyze the media environment and assess opportunities. Ms. Redding, who previously worked at the Charles G. Koch Charitable Foundation, did not respond to requests for comment.

“As an entrepreneurial company with 60,000 employees around the world, we are constantly exploring profitable opportunities in many industries and sectors. So, it is natural that our name would come up in connection with this rumor,” Melissa Cohlmia, a spokeswoman for Koch Companies Public Sector, said in a statement last month.

“We respect the independence of the journalistic institutions referenced in the news stories,” Ms. Cohlmia continued. “But it is our longstanding policy not to comment on deals or rumors of deals we may or may not be exploring.”

One person who has previously advised Koch Industries said the Tribune Company papers were considered an investment opportunity, and were viewed as entirely separate from Charles and David Koch’s lifelong mission to shrink the size of government.

At least in politically liberal Los Angeles, a conservative paper could be tricky. David H. Koch, who lives in New York and serves as executive vice president of Koch Industries, has said he supports gay marriage and could align with many residents on some social issues, Reed Galen, a Republican consultant in Orange County, Calif., said.

Koch Industries’ main competitor for The Los Angeles Times is a group of mostly Democratic local residents. In the 2012 political cycle, Mr. Broad gave $477,800, either directly or through his foundation, to Democratic candidates and causes, according to the Center for Responsive Politics. Mr. Burkle has long championed labor unions. President Bill Clinton served as an adviser to Mr. Burkle’s money management firm, Yucaipa Companies, which in 2012 gave $107,500 to Democrats and related causes. The group also includes Austin Beutner, a Democratic candidate for mayor of Los Angeles, and an investment banker who co-founded Evercore Partners.

“This will be a bipartisan group,” Mr. Beutner said. “It’s not about ideology, it’s about a civic interest.” (The Los Angeles consortium is expected to also include Andrew Cherng, founder of the Panda Express Chinese restaurant chain and a Republican.)

“It’s a frightening scenario when a free press is actually a bought and paid-for press and it can happen on both sides,” said Ellen Miller, executive director of the Sunlight Foundation, a nonpartisan watchdog group.

Last month, shortly after L.A. Weekly first reported on Koch Industries’ interest in the Tribune papers, the liberal Web site Daily Kos and Courage Campaign, a Los Angeles-based liberal advocacy group, collected thousands of signatures protesting such a deal. Conservatives, meanwhile, welcomed the idea of a handful of prominent papers spreading the ideas of economic “freedom” from taxes and regulation that the Kochs have championed.

Seton Motley, president of Less Government, an organization devoted to shrinking the role of the government, said the 2012 presidential election reinforced the view that conservatives needed a broader media presence.

“A running joke among conservatives as we watched the G.O.P. establishment spend $500 million on ineffectual TV ads is ‘Why don’t you just buy NBC?’ ” Mr. Motley said. “It’s good the Kochs are talking about fighting fire with a little fire.”

Koch Industries has for years felt the mainstream media unfairly covered the company and its founding family because of its political beliefs. KochFacts.com, a Web site run by the company, disputes perceived press inaccuracies. The site, which asserts liberal bias in the news media, has published private e-mail conversations between company press officers and journalists, including the Politico reporter Kenneth P. Vogel and editors at The New Yorker in response to an article about the Kochs by Jane Mayer.

“So far, they haven’t seemed to be particularly enthusiastic about the role of the free press,” Ms. Mayer said in an e-mail, “but hopefully, if they become newspaper publishers, they’ll embrace it with a bit more enthusiasm.”

A Democratic political operative who spoke on the condition of anonymity said he admired how, over decades, the brothers have assembled a complex political infrastructure that supports their agenda. A media company seems like a logical next step.

This person said, “If they get some bad press that Darth Vader is buying Tribune, they don’t care.”

Hydroelectric plants could affect the Pantanal’s hydrological system (Fapesp)

A project to build 87 more small hydroelectric plants in the Upper Paraguay Basin could affect the connectivity between the highland and lowland areas of the Pantanal biome and hinder the migration of fish and other aquatic species, researchers warn (Walfrido Tomas)

April 23, 2013

By Elton Alisson

Agência FAPESP – The plan to build 87 more Small Hydroelectric Plants (PCHs) in the Upper Paraguay Basin, currently under discussion, could affect the connectivity between the highlands – where the Paraguay River and its tributaries rise – and the Pantanal’s flooded plain – through which those rivers’ waters drain – hindering the migration of fish and other aquatic and semi-aquatic species through the hydrological system.

The warning was issued by researchers during the third event of the BIOTA Educação 2013 Conference Cycle, which focused on the Pantanal. The event was held by the BIOTA-FAPESP program on April 18 at FAPESP headquarters.

According to José Sabino, a professor at Universidade Anhanguera-Uniderp, the impact of the PCHs already operating in the Upper Paraguay Basin region is not so great because they generally rely on “run-of-river” technology, which does not require large water reservoirs.

Taken together, however, the roughly 30 existing PCHs and the 87 planned ones could affect the hydrology and connectivity of the basin’s highland and lowland waters and hinder the migratory processes of Pantanal fish species, the specialist warned.

“The creation of these PCHs could break the hydrological connectivity of populations and disrupt reproductive migrations, such as the piracema, of some fish species,” said Sabino.

During the piracema, the spawning period that precedes the summer rains, some fish species, such as the curimbatá (Prochilodus lineatus) and the dourado (Salminus brasiliensis), swim upriver to the headwaters to spawn.

If access to the river headwaters is blocked by an obstacle such as a PCH, the piracema can be impaired. “Building more PCHs in the Pantanal region could have a systemic influence on the channel because, besides changing the hydrological regime, it should also alter the nutrient load carried by the headwater streams flowing from the highlands into the Pantanal plain,” said Walfrido Moraes Tomas, a researcher at the Pantanal Agricultural Research Center (CPAP) of the Brazilian Agricultural Research Corporation (Embrapa) in Mato Grosso do Sul and a speaker at the FAPESP conference.

“This could also affect the habitats of aquatic and semi-aquatic species,” Tomas added. According to the researcher, the Pantanal is one of the most species-rich wetlands in the world, with species distributed abundantly, though not homogeneously, across the plain.

Some of the most recent species surveys indicate that the biome holds 269 species of fish, 44 of amphibians, 127 of reptiles, 582 of birds and 152 of mammals.

More species inventories are needed, however, to fill critical knowledge gaps about other groups, such as the invertebrates – for which there is still no species count – as well as crustaceans, molluscs and lepidopterans (the insect order that includes butterflies), which remain poorly known.

“One initiative that will make a major contribution in this regard is the Biota Mato Grosso do Sul program, whose implementation began three years ago,” said Tomas.

Inspired by BIOTA-FAPESP, the Biota Mato Grosso do Sul program aims to consolidate the infrastructure of collections and holdings in Mato Grosso do Sul’s museums, herbaria, botanical gardens, zoos and germplasm banks in order to fill taxonomic and geographic gaps in knowledge about the state’s biological diversity.

To achieve this goal, researchers plan to digitize the scientific collections and establish a biodiversity information network linking all the institutions involved in biodiversity research and conservation in Mato Grosso do Sul.

“We are now beginning the first species inventories of key regions of the state, and we are preparing a special issue of the journal Biota Neotropica on the biodiversity of Mato Grosso do Sul, which will be a fundamental step in assessing the available information on the Pantanal’s biota and directing our actions,” Tomas told Agência FAPESP.

“Unlike the state of São Paulo, which has enormous collections, Mato Grosso do Sul does not have large collections to support diversity mapping. That is why we will have to go into the field to conduct the inventories,” he explained.

Threatened species

According to Tomas, of the bird species that are threatened, vulnerable or endangered in Brazil, for example, 188 can be found in the Pantanal. In recent years, however, the hunting of species such as the jaguar, the puma, the giant otter, the hyacinth macaw – the Pantanal’s emblematic bird – and the caiman has declined sharply.

And there is no evidence that the region’s main economic activity – cattle ranching, which made human occupation of the biome possible in the first place because the environment is a flooded savanna whose pasture is renewed every year – has harmed the Pantanal’s biota.

“As far as we know so far, no species of the Pantanal’s fauna has been driven toward extinction by ranching,” Tomas said. Fishing – the Pantanal’s second most intensive economic activity – on the other hand, may affect some fish species.

That is because the activity is concentrated on 20 of the biome’s 270 fish species, owing to their size, the flavor of their flesh and regional culture itself.

Among them are the dourado, the curimbatá, the piraputanga (Brycon hilarii), the pacu (Piaractus mesopotamicus) and the cachara (Pseudoplatystoma reticulatum) – a skittish fish found in rivers such as the Prata and Olho D’água, which can reach 1.2 meters in length and weigh 40 kilograms.

“There are indications that, because fishing in the Pantanal targets a handful of species, the activity may be reducing some fish populations,” said Sabino.

Besides Sabino and Tomas, professor Arnildo Pott, of the Federal University of Mato Grosso do Sul (UFMS) in Campo Grande, also gave a lecture, on the origin, evolution and diversity of the Pantanal biome’s vegetation.

Conservation strategies

The researchers also drew attention to the fact that only about 5% of the Pantanal is currently protected by conservation units, and that many of the region’s animal species – the jaguar, the giant otter and the hyacinth macaw, for example – are not effectively protected because they range outside those units.

“Conserving threatened species in the Pantanal requires strategies broader than simply creating or managing conservation units,” Tomas stressed. “Watershed management policies and payment for ecosystem services are needed to ensure the conservation of threatened species.”

Organized by the BIOTA-FAPESP Program, the 2013 Conference Cycle aims to help improve science education. The fourth installment, on May 16, will address the Cerrado biome, followed by conferences on the Caatinga (June 20), the Atlantic Forest (August 22), the Amazon (September 19), Marine and Coastal Environments (October 24) and Biodiversity in Anthropic – Urban and Rural – Environments (November 21).

Carbon Dioxide Removal Can Lower Costs of Climate Protection (Science Daily)

Apr. 12, 2013 — Directly removing CO2 from the air has the potential to alter the costs of climate change mitigation. It could allow emissions from sectors such as transport – which are difficult, and thus expensive, to wean off fossil fuels – to continue for longer. And it may help to constrain the financial burden on future generations, a study now published by the Potsdam Institute for Climate Impact Research (PIK) shows. It focuses on the use of biomass for energy generation, combined with carbon capture and storage (CCS). According to the analysis, carbon dioxide removal could, under certain conditions, be used to offset the most costly components of mitigation, but it would not replace the bulk of actual emissions reductions.

(Image credit: © Jürgen Fälchle / Fotolia)

“Carbon dioxide removal from the atmosphere makes it possible to separate emissions control from the time and location of the actual emissions. This flexibility can be important for climate protection,” says lead author Elmar Kriegler. “You don’t have to prevent emissions in every factory or truck, but could, for instance, plant grasses that suck CO2 out of the air as they grow — and later have them processed in bioenergy plants where the CO2 gets stored underground.”

In economic terms, this flexibility lowers costs by compensating for the emissions that would be most expensive to eliminate. “This means that a phase-out of global emissions by the end of the century — which we would need in order to hold the 2-degree line adopted by the international community — does not necessarily require eliminating each and every source of emissions,” says Kriegler. “Decisions on whether and how to protect future generations from the risks of climate change have to be made today, but the burden of achieving these targets will increase over time. The costs for future generations can be substantially reduced if carbon dioxide removal technologies become available in the long run.”

Balancing the financial burden across generations

The newly published study is the first to quantify this. If bioenergy plus CCS is available, aggregate mitigation costs over the 21st century might be halved. In the absence of such a carbon dioxide removal strategy, costs for future generations rise significantly — up to a quadrupling of mitigation costs in the period 2070 to 2090. The calculation was carried out using a computer simulation of the economic system, energy markets and climate, covering a range of scenarios.
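As a purely illustrative back-of-the-envelope sketch (the baseline figures below are invented placeholders, not output of the PIK model), the two reported relationships – aggregate century costs roughly halved with bioenergy plus CCS, and late-century costs up to four times higher without it – can be written down as:

```python
# Illustrative only: arbitrary baseline numbers, chosen just to express the
# study's reported *relative* effects, not its actual model results.

def century_cost(with_beccs: bool, baseline: float = 100.0) -> float:
    """Aggregate 21st-century mitigation cost (arbitrary units).
    Availability of bioenergy + CCS roughly halves the aggregate cost."""
    return baseline / 2 if with_beccs else baseline

def late_century_cost(with_beccs: bool, baseline: float = 10.0) -> float:
    """Cost borne in 2070-2090 (arbitrary units); without carbon dioxide
    removal it can rise up to fourfold."""
    return baseline if with_beccs else baseline * 4

print(century_cost(True), century_cost(False))            # 50.0 100.0
print(late_century_cost(True), late_century_cost(False))  # 10.0 40.0
```

The point of the sketch is only the ratio between the two scenarios; the study derives the absolute numbers from its coupled economy-energy-climate simulation.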

Options for carbon dioxide removal from the atmosphere include afforestation and chemical approaches like direct air capture of CO2 from the atmosphere or reactions of CO2 with minerals to form carbonates. But the use of biomass for energy generation combined with carbon capture and storage is less costly than chemical options, as long as sufficient biomass feedstock is available, the scientists point out.

Serious concerns about large-scale biomass use combined with CCS

“Of course, there are serious concerns about the sustainability of large-scale biomass use for energy,” says co-author Ottmar Edenhofer, chief economist of PIK. “We therefore considered the bioenergy-with-CCS option only as an example of the role that carbon dioxide removal could play in climate change mitigation.” The exploitation of bioenergy can conflict with land use for food production or ecosystem protection. To account for sustainability concerns, the study restricts bioenergy production to a medium level that could be realized mostly on abandoned agricultural land.

Still, global population growth and changing dietary habits, associated with an increased demand for land, as well as improvements of agricultural productivity, associated with a decreased demand for land, are important uncertainties here. Furthermore, CCS technology is not yet available for industrial-scale use and, due to environmental concerns, is controversial in countries like Germany. Yet in this study it is assumed that it will become available in the near future.

“CO2 removal from the atmosphere could enable humankind to keep the window of opportunity open for low-stabilization targets despite a likely delay in international cooperation, but only under certain conditions,” says Edenhofer. “The risks of scaling up bioenergy use need to be better understood, and safety concerns about CCS have to be thoroughly investigated. Still, carbon dioxide removal technologies are no science fiction and need to be further explored.” In no way should they be seen as a pretext to neglect emissions reductions now, Edenhofer notes. “By far the biggest share of climate change mitigation has to come from a large effort to reduce greenhouse-gas emissions globally.”

Journal Reference:

  1. Elmar Kriegler, Ottmar Edenhofer, Lena Reuster, Gunnar Luderer, David Klein. Is atmospheric carbon dioxide removal a game changer for climate change mitigation? Climatic Change, 2013; DOI: 10.1007/s10584-012-0681-4

Segue o Seco (Rolling Stone)

Issue 77 – February 2013

While Bahia suffers through “the worst drought of the last 50 years,” the inhabitants of the sertão go to great lengths to overcome the hardship. Hope persists, but it is as scant as the rainwater

Photo: Flavio Forner

By MAÍRA KUBÍK MANO

“Stop the car! Stop the car! Look, there, on top of the rocks! See it?” No, I saw nothing. The landscape looked exactly the same as it had for the past half hour. All earth-colored, with the occasional catingueira on the horizon and the mandacaru cacti, always more numerous, following the line of the dirt road. “Remember the scene where Fabiano tries to catch a preá? Look there!” my companion insists, pointing. Window down, eyes at the ready. Two small, brownish, almond-shaped animals with pointed snouts stir and make themselves seen. There they are: the preás. Júlio César Santos is satisfied. After all, he ended up in the sertão precisely because he read Vidas Secas.

“I’m from the Zona da Mata, but when I read Graciliano Ramos I wanted to come here,” says Santos, an agronomist who fell in love with the caatinga while still a student at the Universidade Federal do Recôncavo Baiano (UFRB). Today he heads the office of the EBDA (Empresa Baiana de Desenvolvimento Agrícola) in Ipirá, one of 258 municipalities in Bahia in a state of emergency because of the drought. Together with 17 other agencies and departments of the Jaques Wagner (PT) state government, the EBDA is part of the State Committee for Actions for Living with Drought.

We are on our way to the neighboring town of Pintadas, where the dry spell is even more severe. Along the way we cross four rivers. Three of them are dry. The overcast sky in the distance seems to herald a change. A drizzle had fallen in the early hours, something that had not happened in a long time. The marks were still on the ground, in a few shallow furrows that had probably held threads of running water. Santos seems relieved. “Now it needs to rain more,” he says.

On a curve to the left appears the home of Messias and Ginalva Jesus Pereira. Their plot of palma cactus immediately stands out from the monochrome – it is dark green, without a hint of brown. During the drought, the plant has been an indispensable source of food for the animals, which no longer have any pasture. “People come, visit, admire it. Others look on with envy,” says Ginalva, eyebrows raised, who has lived on that plot for some 20 years.

As was to be expected, the conversation turns to the weather and the drops that fell overnight. “It rained in Ipirá, did it? Ah, here it was just a mist,” counters young Matheus, Ginalva’s middle son. “It really hasn’t rained here in three years. We lost two calves and two umbu trees to the drought. Dad is praying to God for this last bit of palma to take,” he says, referring to a recently planted area farther from the house, where the green is already starting to fade.

Matheus’s reckoning is no exaggeration. Rain usually falls on the caatinga between January and May, precisely the planting season. In 2012, however, the water never came, and one dry spell ran straight into the next, making this the worst drought of the last 50 years, according to Bahia’s Civil Defense Coordination (Cordec). The forecast is that it will last another year or two. “Now, with the rain, things will be different. Everything will change,” says a seasoned Ginalva. Like Fabiano, the protagonist of Graciliano Ramos’s novel, she knows the caatinga comes back to life.

At her house, strategically positioned pipes await the next rainfall to collect water in cisterns. Until then, Ginalva keeps her production going with artificial irrigation – it also includes cowpeas, scallions, cilantro, papaya, sweet potato and okra, as well as sheep, goats and cattle. The recently built well was financed through the emergency line of Pronaf (the National Program for Strengthening Family Farming).

Like Ginalva, another 6,000 farmers in the region have submitted projects to access the program. According to the Banco do Nordeste do Brasil (BNB), R$ 10 million in emergency Pronaf funds had been released by January 2013 for the 17 municipalities around Feira de Santana, among them Pintadas and Ipirá. “What you see here are small farmers requesting financing to plant palma or build watering ponds to recover their pasture,” says José Wilson Junqueira Queiroz, a business manager at the BNB. Across Brazil, between May and December 2012, the federal government authorized R$ 656.2 million in emergency credit lines for those hit by the drought.

“It is these public policies that are keeping families in the countryside,” says Jeane de Almeida Santiago. An agronomist with an NGO called Fundação Apaeba, she provides technical assistance to producers in Pintadas, Ipirá, Riachão do Jacuípe, Pé de Serra, Baixa Grande and Nova Fátima, all in Bahia. “Before, many more people would migrate to São Paulo and other states.”

The account comes from someone who knows the situation firsthand. Jeane was born in Pintadas. She studied at the agricultural school and left for technical training in Juazeiro and college in the Recôncavo Baiano. She came back when she graduated, wanting to pass on what she had learned. Eyes bright and alert, she changes her tone and revises her statement: “Well, this year a lot of young people are leaving. With the drought, the farms’ profitability is zero. And people are not going to stay here with no money. Unfortunately they are forced to leave, brokenhearted, for São Paulo in search of work, to see if they can send money back to the family that stayed behind to keep the herd alive.”

Indeed, the bus stop in Pintadas was crowded that morning. The town still has no bus station, and the asphalt connecting it to the rest of the world was inaugurated only a year ago, as the state government signs at the entrance announce. Everyone was waiting on the sidewalk for the next bus to the São Paulo state capital, suitcases and relatives standing, sun blazing overhead. About three weeks earlier, at that very spot, Ginalva had said goodbye to her eldest son, 18, who decided to try his luck elsewhere. “He called me yesterday saying he has already found a job in a factory. It’s temporary, but it’s a job,” she says. It is the famous Pintadas-São Paulo shuttle.

“The worst part is that the outlook for this year isn’t good,” laments Jeane. She says that even the palma and the mandacaru, also used to feed the herds, have begun to disappear, and that most of the land in the region is in the hands of small subsistence farmers or ranchers. “For more than a year now the municipality has been handing out animal feed because there is no pasture left. But now the feed has run out. You look for it and can’t find it. When you do find it, the price won’t fit in the budget.”

Jeane worries: “There are producers paying off three or four projects. There will come a time when no one will be able to take out any more [credit], they owe so much. And then, I don’t know what will happen. Because the farms aren’t generating enough to pay off the loans they already owe. Without credit, I believe life in the countryside becomes impossible.”

“The cause of this drought is the destruction of the environment,” she declares, citing a recent study finding that 90% of the region’s native vegetation has disappeared. “Nature is responding. The land is bare. And from there come the fires. Many soils have already been lost or are depleted. People don’t have the habit of fertilizing; they just keep exploiting and exploiting. The rivers we had have died. The springs have been deforested.”

In Ipirá, right next door, the reality is similar. In place of the caatinga there are cattle. The most common sight is cattle or horses crowded under the few remaining trees to escape the scorching sun – head in the shade, back in the open. “Ipirá was a municipality full of smallholdings,” explains Orlando Cintra, the city government’s manager of Agriculture and Cooperativism. “The big ranchers began arriving in the 1960s. They bought the land cheap and pushed the men who grew potatoes, cassava and castor beans out to the periphery here, or to São Paulo, Mato Grosso and Paraná.” Many others went off to cut sugarcane. “There were no cattle here, and the small producers didn’t clear the land,” he continues. “What we raised most was goats. It was with the arrival of the big ranchers that the climate in Ipirá began to change faster. They cleared the land to plant grass.”

“The caatinga is not an area for cattle ranching. It is for raising goats, sheep, medium-sized animals. They brought the rancher’s culture from the South, and everyone wanted a cattle ranch here,” adds Meire Oliveira, an adviser to Ipirá’s Department of Agriculture and the Environment.

Meire spent her childhood in the municipality’s rural zone and still remembers the smell of that scrubland. She says that as a child she made donkeys out of umbu fruit, sticking four bits of twig into the fruit for the four legs. “A pity that, often, when I say not to clear the land, not even my father listens to me,” she laments. She seems to know every plant in the caatinga. When she finds a coroa-de-frade cactus, she shows that its tiny red fruit is edible. Walking across the region’s properties, she slips through the barbed-wire fences with ease. She picks a handful of still-green maxixe and explains how to cook it. “Just like okra, you know?” In the sertão, everything can be put to use. “The caatinga has an incredible power of regeneration,” she explains. “The solution would be to let it rest. Some areas around the Rio do Peixe are already in the process of desertification.”

One example of environmental preservation is the D. Mathias settlement, now seven years old. There, the caatinga is slowly being reborn among billy goats, nanny goats and sheep. The trees are pruned only enough to keep them from hurting the animals, which roam freely among the aroeiras, xique-xiques and umbu trees. Organized by the Movimento Luta Camponesa (MLC), the settlement has as its symbol a family of drought migrants drawn in black and red. The line is led by a woman with a scythe in her hands. Behind her comes a man with a hoe on his shoulders. Two children, a boy and a girl, follow hand in hand. Last comes a dog that, who knows, may be named Baleia.

Júlio César Santos, the EBDA official, assists the settlers and explains that the farmers keep a close eye on the public policies and credit lines offered by the state and federal governments. With them, they have already managed to build houses, buy a milk cooler and expand their sheep flock. Among the latest initiatives on the site is high-density palma planting, more productive than the traditional method. At first, the farmers did not trust the technique and kept planting the cacti far apart, as they always had. To get around the resistance, Santos used the “Paulo Freire method”: he planted two plots – on one side, the densely planted palma; on the other, the traditional spacing. Now both are growing, and he hopes soon to prove his point. “Let’s hope the lack of rain doesn’t burn them,” he says.

The settlement’s success inspired, 11 months ago, an encampment on the neighboring estate. Leidinaura Souza Santana, or simply Leila, is one of the residents of the Elenaldo Teixeira encampment. “The biggest problem here is water for drinking and cooking. We went almost 15 days without water. The tanker truck only arrived yesterday,” she complains. “Embasa [the Bahia water and sanitation utility] suspended the tanker because of the river, which was already very low, and also because of a problem with the pump,” explains Meire, who is along on the visit. “We had to drink water that isn’t fit to drink,” Leila murmurs.

Leila was born in Coração de Maria, north of Feira de Santana. Her husband was working as a cowhand in Malhador, a village in the municipality of Ipirá, when they heard rumors of the occupation. They came right away to take part. “We are waiting for the moment to enter the ranch and put an end to the suffering. The area has already been certified as unproductive. The settlement next door is a marvel. It encouraged me to see that those people were once campers like us. I’m not giving up,” she says. Meire takes the chance to offer a shot of encouragement: “I followed the other encampment from the beginning and it was just the same. I think it was even hotter than this one. This one is cooler. And look at them now.”

The conversation takes place at the encampment’s school, where young people and adults learn to read and write. The small straw-and-wood building stands at the head of what has been christened “Avenida Brasil,” a neatly aligned row of about 15 tarpaulin shacks. Leila has just passed into the 4th grade of elementary school and spells out her name for me. “L-E-I-D-I-N-A-U-R-A.” “Isn’t it with an ‘l’?” asks Meire. “No, it’s with a ‘u,’” Leila answers.

In Tamanduá, a village on the outskirts of Ipirá, motorcycles and donkeys go by carrying people and buckets. Everything recalls the drought. Egecivaldo Oliveira Nunes is at the roadside, at the wheel of a tanker truck parked in front of a blue-and-white house. “Only private work; I don’t work for the Army or the city government. We draw water from the dams because the reservoirs were dry,” he says, adding that on the worst days of the drought he “can’t find the time” for all the deliveries requested. Payment is by distance, and the price changes with each kilometer driven: a 5-kilometer trip delivers 9,000 liters and costs R$ 80. Those who cannot pay (like the campers) can wait for the state Civil Defense – which says it has invested R$ 4 million in tanker trucks – or for the Army, which supplies water to 137 municipalities every month.
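The tanker pricing gives a single data point: 9,000 liters delivered over 5 kilometers for R$ 80. Assuming, purely for illustration, that the price scales linearly with distance (the article reports only that the rate changes per kilometer, not the actual schedule), the implied rates work out as follows:

```python
# Hypothetical linear extrapolation from the single price point reported:
# 9,000 liters over 5 km for R$ 80. Real tanker rates may not be linear.

LITERS = 9_000
BASE_KM = 5
BASE_PRICE = 80.0  # reais

price_per_km = BASE_PRICE / BASE_KM     # R$ 16.00 per kilometer driven
price_per_liter = BASE_PRICE / LITERS   # roughly R$ 0.0089 per liter

def delivery_cost(km: float) -> float:
    """Estimated cost of one 9,000-liter delivery over `km` kilometers,
    under the linear assumption above."""
    return price_per_km * km

print(round(price_per_liter, 4))  # 0.0089
print(delivery_cost(10))          # 160.0
```

Even at this rate, a 10-kilometer delivery would cost twice the reported price, which is why families who cannot pay wait for the Civil Defense or the Army.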

“Each year the drought comes on stronger, and the tendency is for it to last longer,” laments Orlando Cintra, Ipirá’s manager of Agriculture and Cooperativism. “The outlook is that in five or six years no one will produce anything here in farming anymore. The climate is changing. Every year it gets worse.”

“We’ve had so many forecasts, and nothing,” says Jeane Santiago. “A rain forecast comes on the news and people say: ‘I have no faith anymore; I’ll only believe it when I see it.’ Country people have their omens, like ‘if the mandacaru flower blooms, it’s a sign it will rain.’ But they have all failed so far. Faith is running out.” The mandacarus have already flowered. The strong red catches the eye. Now all there is to do is wait.

Multiplying the Old Divisions of Class in Britain (N.Y.Times)

By SARAH LYALL

Published: April 3, 2013

LONDON — Class in Britain used to be a relatively simple matter, or at least it used to be treated that way. It came in three flavors — upper, middle and working — and people supposedly knew by some mysterious native sixth sense exactly where they stood. As the very tall John Cleese declared to the less-tall Ronnie Corbett in the famous 1966 satirical television sketch meant to illustrate class attitudes in Britain — or, possibly, attitudes toward class attitudes — “I look down on him, because I am upper class.”

From left: John Cleese, Ronnie Barker and Ronnie Corbett in a video still from a satirical British TV sketch illustrating class. And height.

It is not as easy as all that, obviously. The 2010 election was enlivened at one point by a perfectly serious discussion of whether David Cameron, now the prime minister, counted as upper upper-middle class or lower upper-middle class. But on Wednesday, along came the BBC, muddying the waters with a whole new set of definitions.

Having commissioned what it called The Great British Class Survey, an online questionnaire filled out by more than 161,000 people, the BBC concluded that in today’s complicated world, there are now seven different social classes. (“As if three weren’t annoying enough,” a woman named Laura Phelps said on Twitter.) These range from the “elite” at the top, distinguished by money, connections and rarefied cultural interests, to the “precariat” at the bottom, characterized by lack of money, lack of connections and unrarefied cultural interests.

That might sound kind of familiar, but Fiona Devine, a sociologist who helped devise the study, said, “It’s what’s in the middle which is really interesting and exciting.”

The middle categories, as the study defines them, include the “technical middle class,” a group that has a lot of money but few prestigious social connections and little cultural activity; the “emergent service workers,” a young, urban group that has little money but a high amount of social and cultural capital; and the “new affluent workers,” who score high on social and cultural activity but have only a middling amount of money.

“There’s a much more fuzzy area between the traditional working class and the traditional middle class,” Ms. Devine, a professor of sociology at Manchester University, said in remarks accompanying the research. “The survey has really allowed us to drill down and get a much more complete picture of class in modern Britain.”

Not everyone sees it that way. In a country that is not sure whether it is (a.) obsessed with class, or (b.) merely obsessed with whether it is as obsessed about class as it used to be (if it ever really was), the survey got widespread attention. But some Britons thought the researchers had not considered the correct criteria.

“There are only two classes: those with tattoos, and those without,” said one Daily Mail reader, commenting on the paper’s article about the new categories.

Another wrote: “What are they called in ‘Brave New World’? Alphas, Betas, Gammas and Epsilons? That’s well on the way to becoming a factual book. We already have most of the population on ‘Soma,’ ” a reference to the feel-good drug in the book.

The study was published in the journal Sociology and conducted by Ms. Devine in conjunction with Mike Savage, a professor of sociology at the London School of Economics, and the BBC Lab UK.

Throwing out the old formula by which class was defined according to occupation, wealth and education, it created in its place a definition calculated according to “economic capital,” which includes income and savings; “social capital,” which refers to whom one knows from among 37 different occupations; and “cultural capital,” which is defined as the sorts of cultural interests one pursues, from a list of 27.
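To make the three-capital idea concrete, here is a minimal sketch of how raw survey answers might be reduced to economic, social and cultural scores and then bucketed into a few of the survey's labels. Every scale, threshold and rule below is invented for illustration; the actual study derived its seven classes statistically, via latent class analysis, not from simple cutoffs like these.

```python
# Toy illustration of the BBC survey's three-capital scoring.
# All normalisations, thresholds and category rules here are hypothetical;
# the real study did not use simple cutoffs like these.

def capital_scores(income, savings, contacts_known, interests):
    """Reduce raw answers to three capital scores on a 0-100 scale."""
    economic = min(100, (income + savings) / 2000)   # toy normalisation of money
    social = min(100, contacts_known * (100 / 37))   # out of 37 listed occupations
    cultural = min(100, interests * (100 / 27))      # out of 27 listed interests
    return economic, social, cultural

def toy_class(economic, social, cultural):
    """Illustrative bucketing into a few of the survey's seven labels."""
    if economic > 80 and social > 80 and cultural > 80:
        return "elite"                       # high on all three capitals
    if economic > 60 and social < 40 and cultural < 40:
        return "technical middle class"      # money without connections/culture
    if economic < 30 and social > 60 and cultural > 60:
        return "emergent service workers"    # little money, much social/cultural capital
    if economic < 20 and social < 20 and cultural < 20:
        return "precariat"                   # low on everything
    return "other"

print(toy_class(*capital_scores(90000, 100000, 35, 25)))  # prints "elite"
```

The point of the sketch is only that class here is a position in a three-dimensional space of capitals, not a single ladder rung, which is what lets the middle split into the "fuzzy" categories Ms. Devine describes.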

In the 1950s, the author Nancy Mitford argued that it was possible to tell which class people were in — upper class (“U”) or not upper class (“non-U”) — according to their choice of vocabulary. U-speakers said “rich” and “jam,” she observed, while non-U speakers said “wealthy” and “preserves,” among other things.

(“Almost everyone I know has some personal antipathy which they condemn as middle class quite irrationally,” Evelyn Waugh wrote in response. “My mother-in-law believes it is middle class to decant claret.”)

Mitford was being mischievous, except that she kind of wasn’t, since she was describing the way people actually spoke. In conjunction with today’s study, the BBC offered a modern adaptation of the Mitford test, a handy do-it-yourself online class calculator.

In their report, the researchers acknowledged that their Web survey showed a large amount of bias, in that the type of people who filled it out were the type of people inclined to fill out BBC surveys (well educated, and 90 percent white, for instance). So they conducted a separate face-to-face survey of 1,026 nationally representative people and then combined the two sets of results, arriving at the seven categories.

Cary L. Cooper, a professor at Lancaster University and the chairman of the Academy of Social Sciences, said that what he found intriguing was not what the study said about different social categories, but rather what it said about people’s desire to place themselves in one or another such category.

“People love filling in questionnaires,” Mr. Cooper said in an interview. “From a psychologist’s point of view, it’s very interesting that they love to pigeonhole themselves — ‘I am that kind of person,’ ‘No matter what people like to say, I am an X.’ ”

Britain remains a “status-conscious society,” he said, especially at times of social and economic insecurity. He attributed the public’s love of “Downton Abbey” and other class-related nostalgic entertainment to a yearning for a time when things were simpler, when “even though there was a rigid class system, at least it was stable.”

Back on the Daily Mail Web site, readers continued to debate the conclusions, and the limitations, of the BBC research.

“I couldn’t find ‘awesome’ class,” one commenter complained.

Another wrote: “What rubbish. Only three classes, working, middle and wealthy. You either have money, no money or some money.”

In Big Data, We Hope and Distrust (Huffington Post)

By Robert Hall

Posted: 04/03/2013 6:57 pm

“In God we trust. All others must bring data.” — W. Edwards Deming, statistician, quality guru

Big data helped reelect a president and find Osama bin Laden, and contributed to the meltdown of our financial system. We are in the midst of a data revolution where social media introduces new terms like Arab Spring, Facebook Depression and Twitter anxiety that reflect a new reality: Big data is changing the social and relationship fabric of our culture.

We spend hours installing and learning how to use the latest versions of our ever-expanding technology while enduring a never-ending battle to protect our information. Then we labor to develop practices to rid ourselves of technology — rules for turning devices off during meetings or movies, legislation to outlaw texting while driving, restrictions in classrooms to prevent cheating, and scheduling meals or family time where devices are turned off. Information and technology: We love it, hate it, can’t live with it, can’t live without it, use it voraciously, and distrust it immensely. I am schizophrenic and so am I.

Big data is not only big but growing rapidly. According to IBM, we create 2.5 quintillion bytes a day, and “ninety percent of the data in the world has been created in the last two years.” Vast new computing capacity can analyze Web-browsing trails that track our every click, sensor signals from every conceivable device, GPS tracking and social network traffic. It is now possible to measure and monitor people and machines to an astonishing degree. How exciting, how promising. And how scary.

This is not our first data rodeo. The early stages of the customer relationship management movement were filled with hope and with hype. Large data warehouses were going to provide the kind of information that would make companies masters of customer relationships. There were just two problems. First, getting the data out of the warehouse wasn’t nearly as hard as getting it into the person or device interacting with the customers in a way that added value, trust and expanded relationships. We seem to always underestimate the speed of technology and overestimate the speed at which we can absorb it and socialize around it.

Second, unfortunately the customers didn’t get the memo and mostly decided in their own rich wisdom they did not need or want “masters.” In fact as providers became masters of knowing all the details about our lives, consumers became more concerned. So while many organizations were trying to learn more about customer histories, behaviors and future needs — customers and even their governments were busy trying to protect privacy, security, and access. Anyone attempting to help an adult friend or family member with mental health issues has probably run into well-intentioned HIPAA rules (regulations that ensure privacy of medical records) that unfortunately also restrict the ways you can assist them. Big data gives and the fear of big data takes away.

Big data does not big relationships make. Over the last 20 years, as our data has kept getting stronger, our customer relationships have kept getting weaker. Eighty-six percent of consumers trust corporations less than they did five years ago. Customer retention across industries has fallen about 30 percent in recent years. Is it actually possible that we have unwittingly contributed to the undermining of our customer relationships? How could that be? For one thing, as companies keep getting better at targeting messages to specific groups, those groups keep getting better at blocking those messages. As usual, the power to resist trumps the power to exert.

No matter how powerful big data becomes, if it is to realize its potential, it must build trust on three levels. First, customers must trust our intentions. Data that can be used for us can also be used against us. There is growing fear that institutions will become part of a “surveillance state.” While organizations have gone to great lengths to promote the protection of our data, the numbers reflect a fair amount of doubt. For example, according to MainStreet, “87 percent of Americans do not feel large banks are transparent and 68 percent do not feel their bank is on their side.”

Second, customers must trust our actions. Even if they trust our intentions, they might still fear that our actions put them at risk. Our private information can be hacked, then misused and disclosed in damaging and embarrassing ways. After the Sandy Hook tragedy a New York newspaper published the names and addresses of over 33,000 licensed gun owners along with an interactive map that showed exactly where they lived. In response names and addresses of the newspaper editor and writers were published on-line along with information about their children. No one, including retired judges, law enforcement officers and FBI agents expected their private information to be published in the midst of a very high decibel controversy.

Third, customers must trust the outcome — that sharing data will benefit them. Even with positive intentions and constructive actions, the results may range from disappointing to damaging. Most of us have provided email addresses or other contact data — around a customer service issue or such — and then started receiving email, phone or online solicitations. I know a retired executive who helps hard-to-hire people. She spent one evening surfing the Internet to research expunging criminal records for released felons. Years later, Amazon greets her with books targeted to the felon it believes she is. Even with opt-out options, she felt used. Or, we provide specific information, only to have to repeat it in the next transaction or interaction — never getting the hoped-for benefit of saved time.

It will be challenging to grow trust at anywhere near the rate we grow the data. Information develops rapidly; competence and trust develop slowly. Investing heavily in big data while scrimping on trust will have the opposite of the desired effect. To quote Dolly Parton, who knows a thing or two about big: “It costs a lot of money to look this cheap.”

Everybody Knows. Climate Denialism has peaked. Now what are we going to do? (EcoEquity)

– Tom Athanasiou (toma@ecoequity.org).  April 2, 2013.

It was never going to be easy to face the ecological crisis.  Even back in the 1970s, before climate took center stage, it was clear that we the prosperous were walking far too heavily.  And that “environmentalism,” as it was called, was only going to be a small beginning.  But it was only when the climate crisis pushed fossil energy into the spotlight that the real stakes were widely recognized.  Fossil fuels are the meat and potatoes of industrial civilization, and the need to rapidly and radically reduce their emissions cut right through to the heart of the great American dream.  And the European dream.  And, inevitably, the Chinese dream as well.

Decades later, 81% of global energy is still supplied by the fossil fuels: coal, gas, and oil.[1]  And though the solar revolution is finally beginning, the day is late.  The Arctic is melting, and, soon, as each year the northern ocean lies bare beneath the summer sun, the warming will accelerate.  Moreover, our plight is becoming visible.  We have discovered, to our considerable astonishment, that most of the fossil fuel on the books of our largest corporations is “unburnable” – in the precise sense that, if we burn it, we are doomed.[2]  Not that we know what to do with this rather strange knowledge.  Also, even as China rises, it’s obvious that it’s not the last in line for the promised land.  Billions of people, all around the world, watch the wealthy on TV, and most all of them want a drink from the well of modern prosperity.  Why wouldn’t they?  Life belongs to us all, as does the Earth.

The challenge, in short, is rather daunting.

The denial of the challenge, on the other hand, always came ready-made.  As Francis Bacon said so long ago, “what a man would rather were true, he more readily believes.”  And we really did want to believe that ours was still a boundless world.  The alternative – an honest reckoning – was just too challenging.  For one thing, there was no obvious way to reconcile the Earth’s finitude with the relentless expansion of the capitalist market.  And as long as we believed in a world without limits, there was no need to see that economic stratification would again become a fatal issue.  Sure, our world was bitterly riven between haves and have-nots, but this problem, too, would fade in time.  With enough growth – the universal balm – redistribution would never be necessary.  In time, every man would be a king.

The denial had many cheerleaders.  The chemical-company flacks who derided Rachel Carson as a “hysterical woman” couldn’t have known that they were pioneering a massive trend.  Also, and of course, big money always has plenty of mouthpieces.  But it’s no secret that, during the 20th Century, the “engineering of consent” reached new levels of sophistication.  The composed image of benign scientific competence became one of its favorite tools, and somewhere along the way tobacco-industry science became a founding prototype of anti-environmental denialism.  On this front, I’m happy to say that the long and instructive history of today’s denialist pseudo-science has already been expertly deconstructed.[3]  Given this, I can safely focus on the new world, the post-Sandy world of manifest climatic disruption in which the denialists have lost any residual aura of scientific legitimacy, and have ceased to be a decisive political force.  A world in which climate denialism is increasingly seen, and increasingly ridiculed, as the jibbering of trolls.

To be clear, I’m not claiming that the denialists are going to shut up anytime soon.  Or that they’ll call off their suicidal, demoralizing campaigns.  Or that their fogs and poisons are not useful to the fossil-fuel cartel.  But the battle of the science is over, at least as far as the scientists are concerned.  And even on the street, hard denialism is looking pretty ridiculous.  To be sure, the core partisans of the right will fight on, for the win and, of course, for the money.[4]  And they’ll continue to have real weight too, for just as long as people do not believe that life beyond carbon is possible.  But for all this, their influence has peaked, and their position is vulnerable.  They are – and visibly now – agents of a mad and dangerous ideology.  They are knaves, and often they are fools.[5]

As for the rest of us, we can at least draw conclusions, and make plans.

As bad as the human prospect may be – and it is quite bad – this is not “game over.”  We have the technology we need to save ourselves, or most of it in any case; and much of it is ready to go.  Moreover, the “clean tech” revolution is going to be disruptive indeed.  There will be cascades of innovation, delivering opportunities of all kinds, all around the world.  Also, our powers of research and development are strong.  Also, and contrary to today’s vogue for austerity and “we’re broke” political posturing, we have the money to rebuild, quickly and on a global scale.  Also, we know how to cooperate, at least when we have to.  All of which is to say that we still have options.  We are not doomed.

But we are in extremely serious danger, and it is too late to pretend otherwise.  So allow me to tip my hand by noting Jorgen Randers’ new book, 2052: A Global Forecast for the Next Forty Years.[6]  Randers is a Norwegian modeler, futurist, professor, executive, and consultant who made his name as co-author of 1972’s landmark The Limits to Growth.  Limits, of course, was a global blockbuster; it remains the best-selling environmental title of all time.  Also, Limits has been relentlessly ridiculed (the early denialists cut their teeth by distorting it[7]), so it must be said that – very much contrary to the mass-produced opinions of the denialist age – its central, climate-related projections are holding up depressingly well.[8]

By 2012 (when he published 2052) Randers had decided to step away from the detached exploration of multiple scenarios that was the methodological core of Limits, and to make actual predictions.  After a lifetime of frustrated efforts, his predictions are vivid, pessimistic and bitter.  In a nutshell, Randers doesn’t expect anything beyond what he calls “progress as usual,” and while he expects it to yield a “light green” buildout (e.g., solar on a large scale) he doesn’t think it will suffice to stabilize the climate system.  Such stabilization, he grants, is still possible, but it would require concerted global action on a scale that neither he nor Dennis Meadows, the leader of the old Limits team, sees on today’s horizon.  Let’s call that kind of action global emergency mobilization.  Meadows, when he peers forwards, sees instead “many decades of uncontrolled climatic disruption and extremely difficult decline.”[9]  Randers is more precise, and predicts that we will by 2052 wake to find ourselves on a dark and frightening shore, knowing full well that our planet is irrevocably “on its way towards runaway climate change in the last third of the twenty-first century.”

This is an extraordinary claim, and it requires extraordinary evidence.[10]  Such evidence, unfortunately, is readily available, but for the moment let me simply state the public secret of this whole discussion.  To wit: we (and I use this pronoun advisedly) can still avoid a global catastrophe, but it’s not at all obvious that we will do so.  What is obvious is that stabilizing the global climate is going to be very, very hard.  Which is a real problem, because we don’t do hard anymore.  Rather, when confronted with a serious problem, we just do what we can, hoping that it will be enough and trying our best not to offend the rich.  In truth, and particularly in America, we count ourselves lucky if we can manage governance at all.

This essay is about climate politics after legitimate skepticism.  Climate politics in a world where, as Leonard Cohen put it, “everybody knows.”  What does this mean?  In the first place, it means that we’ve reached the end of what might be called “environmentalism-as-usual.”  This point is widely understood and routinely granted, as when people say something like “climate is not a merely environmental problem,” but my concern is a more particular one.  As left-green writer Eddie Yuen astutely noted in a recent book on “catastrophism,” the problems of the environmental movement are to a very large degree rooted in “the pairing of overwhelmingly bleak analysis with inadequate solutions.”[11]  This is exactly right.

The climate crisis demands a “new environmentalism,” and such a thing does seem to be emerging.  Its final shape is unknowable, but one thing is certain – the environmentalism that we need will only exist when its solutions and strategies stand up to its own analyses.  The problem is that this requires us to take our “overwhelmingly bleak” analyses straight, rather than soft-pedaling them so that our “inadequate solutions” might look good.  Pessimism, after all, is closely related to realism.  It cannot just be wished away.

Soft-pedaling, alas, has long been standard practice, on both the scientific and the political sides of the climate movement.  Examples abound, but the best would have to be the IPCC itself, the U.N.’s Intergovernmental Panel on Climate Change.  The world’s premier climate-science clearinghouse, the IPCC is often attacked from the right, and has developed a shy and reticent culture.  Even more important, though, and far more rarely noted, is that the IPCC is conservative by definition and by design.[12]  It almost has to be conservative to do its job, which is to herd the planet’s decision makers towards scientific realism.  The wrinkle is that, at this point, this isn’t even close to being good enough, at least not in the larger scheme.  At this point, we need strategic realism as well as baseline scientific realism, and it demands a brutal honesty in which underlying scientific and political truths are clearly drawn and publicly expressed.

Yet when it comes to strategic realism, we balk.  The first impulse of the “messaging” experts is always to repeat their perennial caution that sharp portraits of the danger can be frightening, and disempowering, and thus lead to despair and passivity.  This is an excellent point, but it’s only the beginning of the truth, not the end.  The deeper problem is that the physical impacts of climate disruption – the destruction and the suffering – will continue to escalate.  “Superstorm Sandy” was bad, but the future will be much worse.  Moreover, the most severe suffering will be far away, and easy for the good citizens of the wealthy world to ignore.  Imagine, for example, a major failure of the Indian Monsoon, and a subsequent South Asian famine.  Imagine it against a drumbeat background in which food is becoming progressively more expensive.  Imagine the permanence of such droughts, and increasing evidence of tipping points on the horizon, and a world in which ever more scientists take it upon themselves to deliver desperate warnings.  The bottom line will not be the importance of communications strategies, but rather the manifest reality, no longer distant and abstract, and the certain knowledge that we are in deep trouble.  And this is where the dangers of soft-pedaling lie.  For as people come to see the scale of the danger, and then to look about for commensurate strategies and responses, the question will be whether such strategies are available, and whether they are known, and whether they are plausible.  If they’re not, then we’ll all go, together, down the road “from aware to despair.”

Absent the public sense of a future in which human resourcefulness and cooperation can make a decisive difference, we assuredly face an even more difficult future in which denial fades into a sense of pervasive hopelessness.  The last third of the century (when Randers is predicting “runaway climate change”) is not so very far away.  Which is to say that, as denialism collapses – and it will – the challenge of working out a large and plausible response to the climate crisis will become overwhelmingly important.  If we cannot imagine such a response, and explain how it would actually work, then people will draw their own conclusions.  And, so far, it seems that we cannot.  Even those of us who are now climate full-timers don’t have a shared vision, not in any meaningful detail, nor do we have a common sense of the strategic initiatives that could make such a vision cohere.

The larger landscape is even worse.  For though many scientists are steeling themselves to speak, the elites themselves are still stiff and timid, and show few signs of rising to the occasion.  Each month, it seems, there’s another major report on the approaching crisis – the World Bank, the National Intelligence Council, and the International Energy Agency have all recently made hair-raising contributions – but they never quite get around to the really important questions.  How should we contrive the necessary global mobilization?  What conditions are needed to absolutely maximize the speed of the clean-tech revolution?  By what strategy will we actually manage to keep the fossil-fuels in the ground?  What kind of international treaties are necessary, and how shall we establish them?  What would a fast-enough global transition cost, and how shall we pay for it?  What about all those who are forced to retreat from rising waters and drying lands?  How shall they live, and where?  How shall we talk about rights and responsibilities in the Greenhouse Century?  And what about the poor?  How shall they find futures in a climate-constrained world?  Can we even imagine a world in which they do?

In the face of such questions, you have a choice.  You can conclude that we’ll just have to do the best we can, and then you can have a drink.  Or maybe two.  Or you can conclude that, despite all evidence to the contrary, enough of us will soon awaken to reality.  What’s certain is that, all around us, there is a vast potentiality – for reinvention, for resistance, for redistribution, and for renewal of all kinds – and that it could at any time snap into solidity.  And into action.

Forget about “hope.”  What we need now is intention.

***

About a decade ago, in San Francisco, I was on a PBS talk show with, among others, Myron Ebell, chief of climate propaganda at the Competitive Enterprise Institute.  Ebell is an aggressive professional, and given the host’s commitment to phony balance he was easily able to frame the conversation.[13]  The result was a travesty, but not an entirely wasted time, at least not for me.  It was instructive to speak, tentatively, of the need for global climate justice, and to hear, in response, that I was a non-governmental fraud that was only in it for the money.  Moreover, as the hour wore on, I came to appreciate the brutal simplicity of the denialist strategy.  The whole point is to suck the oxygen out of the room, to weave such a tangle of confusionism and pseudo-debate that the Really Big Question – What is to be done? – becomes impossible to even ask, let alone discuss.

When Superstorm Sandy slammed into the New York City region, Ebell’s style of hard denialism took a body blow, though obviously it has not dropped finally to the mat.  Had it done so, the Big Question, in all its many forms, would be buzzing constantly around us.  Clearly, that great day has not yet come.  Still, back in November of 2012, when Bloomberg Businessweek blared “It’s Global Warming, Stupid” from its front cover, this was widely welcomed as an overdue milestone.  It may even be that Michael Tobis, the editor of the excellent Planet 3.0, will prove correct in his long-standing, half-facetious prediction that 2015 will be the date when “the Wall Street Journal will acknowledge the indisputable and apparent fact of anthropogenic climate change; the year in which it will simply be ridiculous to deny it.”[14]  Or maybe not.  Maybe that day will never come.  Maybe Ebell’s style of well-funded, front-group denialism will live on, zombie-like, forever.  Or maybe (and this is my personal prediction) hard climate denialism will soon go the way of creationism and far-right Christianity, becoming a kind of political lifestyle choice, one that’s dangerous but contained.  One that’s ultimately more dangerous to the right than it is to the reality-based community.

If so, then at some point we’re going to have to ask ourselves if we’ve been so long distracted by the hard denialists that we’ve missed the parallel danger of a “soft denialism.”  By which I mean the denialism of a world in which, though the dangers of climate change are simply too ridiculous to deny, they still – somehow – are not taken to imply courage, and reckoning, and large-scale mobilization.  This is a long story, but the point is that, now that the Big Question is finally on the table, we’re going to have to answer it.  Which is to say that we’re going to have to face the many ways in which political timidity and small-bore realism have trained us to calibrate our sense of what must be done by our sense of what can be done, which these days is inadequate by definition.

And not just because of the denialists.

George Orwell once said that “To see what is in front of one’s nose needs a constant struggle.”[15]  As we hurtle forward, this struggle will rage as never before.  The Big Question, after all, changes everything.  Another way of saying this is that our futures will be shaped by the effort to avoid a full-on global climate catastrophe.  Despite all the rest of the geo-political and geo-economic commotion that will mark the 21st Century (and there’ll be plenty) it will be most fundamentally the Greenhouse Century.  We know this now, if we care to, though still only in preliminary outline.  The details, inevitably, will surprise us all.

The core problem, of course, will be “ambition” – action on the scale that’s actually necessary, rather than the scale that is or appears to be possible.  And here, the legacies of the denialist age – the long-ingrained habits of soft-pedaling and strained optimism – will weigh heavily.  Consider the quasi-official global goal (codified, for example, in the Copenhagen Accord) to hold total planetary warming to 2°C (Earth surface average) above pre-industrial levels.  This is the so-called “2°C target.”  What are we to do with it in the post-denialist age?  Let me count the complications: One, all sorts of Very Important People are now telling us it’s going to be all but impossible to avoid overshooting 2°C.[16]  Two, in so doing, they are making a political and not a scientific judgment, though they’re not always clear on this point.  (It’s probably still technically possible to hold the 2°C line – if we’re not too unlucky – though it wouldn’t be easy under the best of circumstances.)[17]  Three, the 2°C line, which was once taken to be reasonably safe, is now widely seen (at least among the scientists) to mark the approximate point of transition from “dangerous” to “extremely dangerous,” and possibly to altogether unmanageable levels of warming.[18]  Four, and finally, it’s now widely recognized that any future in which we approach the 2°C line (which we will do) is one in which we also have a real possibility of pushing the average global temperature up by 3°C, and if this were to come to pass we’d be playing a very high-stakes game indeed, one in which uncontrolled positive feedbacks and worst-case scenarios would surround us on every side.

The bottom line is today as it was decades ago.  Greenhouse-gas emissions were increasing then, and they are increasing now.  In late 2012, the authoritative Global Carbon Project reported that, since 1990, they had risen by an astonishing 58 percent.[19]  The climate system has unsurprisingly responded with storms, droughts, ice-melt, conflagrations and floods.  The weather has become “extreme,” and may finally be getting our attention.  In Australia, according to the acute Mark Thomson of the Institute for Backyard Studies in Adelaide, the crushing heatwave of early 2013 even pushed aside “the idiot commentariat” and cleared the path for a bit of 11th-hour optimism: “Another year of this trend will shift public opinion wholesale.  We’re used to this sort of temperature now and then and even take a perverse pride in dealing with it, but there seems to be a subtle shift in mood that ‘This Could Be Serious.’”  Let’s hope he’s right.  Let’s hope, too, that the mood shift that swept through America after Sandy also lasts, and leads us, too, to conclude that ‘This Could Be Serious.’  Not that this alone would be enough to support a real mobilization – the “moral equivalent of war” that we need – but it would be something.  It might even lead us to wonder about our future, and about the influence of money and power on our lives, and to ask how serious things will have to get before it becomes possible to imagine a meaningful change of direction.

The wrinkle is that, before we can advocate for a meaningful change of direction, we have to have one we believe in, one that we’re willing to explain in global terms that actually scale to the problem.  None of which is going to be easy, given that we’re fast approaching a point where only tales of existential danger ring true (cf. the zombie apocalypse).  The Arctic ice, as noted above, offers an excellent marker.  In fact, the first famous photos of Earth from space – the “blue marble” photos taken in 1972 by the crew of Apollo 17 – allow us to anchor our predicament in time and in memory.  For these are photos of an old Earth now passed away; they must be, because they show great expanses of ice that are nowhere to be found.  By August of 2012 the Arctic Sea’s ice cover had declined by 40%,[20] a melt that’s easily large enough to be visible from space.  Moreover, beneath the surface, ice volume is dropping even more precipitously.  The polar researchers who are now feverishly evaluating the great melting haven’t yet pushed the entire scientific community to the edge of despair, though they have managed to inspire a great deal of dark muttering about positive feedbacks and tipping points.  Soon, it seems, that muttering will become louder.  Perhaps as early as 2015, the Arctic Ocean will become virtually ice free for the first time in recorded history.[21]  When it does, the solar absorptivity of the Arctic waters will increase, and shift the planetary heat balance by a surprisingly large amount, and by so doing increase the rate of planetary warming.  And this, of course, will not be the end of it.  The feedbacks will continue.  The cycles will go on.
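The claimed shift in the heat balance can be made concrete with back-of-envelope arithmetic: the extra absorbed sunlight is roughly area times insolation times the drop in albedo. Every number below is a round, illustrative assumption rather than a measurement, so the result indicates only the order of magnitude involved.

```python
# Back-of-envelope ice-albedo feedback, using round illustrative numbers.
# All values are rough assumptions for scale, not measurements.

arctic_area = 1.0e13                 # m^2, rough area of seasonal Arctic sea-ice cover
insolation = 100.0                   # W/m^2, rough annual-mean sunlight at high latitude
albedo_ice, albedo_water = 0.6, 0.1  # typical textbook albedos for ice and open water
earth_area = 5.1e14                  # m^2, Earth's total surface area

# Extra solar power absorbed if that area switches from ice to open water:
extra_power = arctic_area * insolation * (albedo_ice - albedo_water)

# Expressed as a globally averaged radiative forcing:
forcing = extra_power / earth_area
print(f"{forcing:.2f} W/m^2")  # prints "0.98 W/m^2"
```

Even with these crude inputs the answer lands near one watt per square metre of globally averaged forcing, a sizable fraction of the roughly 3.7 W/m^2 attributed to a doubling of CO2, which is why the shift in the heat balance above is called surprisingly large.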

Should we remain silent about such matters, for fear of inflaming the “idiot commentariat”? It’s absurd to even ask. The suffering is already high, and if you know the science, you also know that the real surprise would be an absence of positive feedbacks. The ice melt, the methane plumes, the drying of the rainforests – they’re all real. Which is to say that there are obviously tipping points before us, though we do not and cannot know how much time will pass before they force themselves upon our attention. The real question is what we must do if we would talk of them in good earnest, while at the same time speaking, without despair and effectively, about the human future.


[1] Jorgen Randers, 2052: A Global Forecast for the Next Forty Years, Chelsea Green, 2012, page 99.

[2] Begin at the Carbon Tracker Initiative’s website: http://www.carbontracker.org/

[3] Two excellent examples: Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, Bloomsbury Press, 2011; and Chris Mooney, The Republican War on Science, Basic Books, 2006.

[4] See, for example, Suzanne Goldenberg, “Secret funding helped build vast network of climate denial thinktanks,” February 14, 2013, The Guardian.

[5] “Lord Monckton,” in particular, is fantastic.  See http://www.youtube.com/watch?v=w833cAs9EN0

[6] Randers, 2012. See also Randers’ essay and video at the University of Cambridge 2013 “State of Sustainability Leadership,” at http://www.cpsl.cam.ac.uk/About-Us/What-is-Sustainability-Leadership/The-State-of-Sustainability-Leadership.aspx

[7] Ugo Bardi, in The Limits to Growth Revisited (Springer Briefs, 2011) offers this summary:

“If, at the beginning, the debate on LTG had seemed to be balanced, gradually the general attitude on the study became more negative. It tilted decisively against the study when, in 1989, Ronald Bailey published a paper in “Forbes” where he accused the authors of having predicted that the world’s economy should have already run out of some vital mineral commodities whereas that had not, obviously, occurred.

Bailey’s statement was only the result of a flawed reading of the data in a single table of the 1972 edition of LTG. In reality, none of the several scenarios presented in the book showed that the world would be running out of any important commodity before the end of the twentieth century and not even of the twenty-first. However, the concept of the “mistakes of the Club of Rome” caught on. With the 1990s, it became commonplace to state that LTG had been a mistake if not a joke designed to tease the public, or even an attempt to force humankind into a planet-wide dictatorship, as it had been claimed in some earlier appraisals (Golub and Townsend 1977; Larouche 1983). By the end of the twentieth century, the victory of the critics of LTG seemed to be complete. But the debate was far from being settled.”

[8] See, for example, Graham Turner, “A Comparison of The Limits to Growth with Thirty Years of Reality,” Global Environmental Change, Volume 18, Issue 3, August 2008, Pages 397–411. An unprotected copy (without the graphics) can be downloaded at www.csiro.au/files/files/plje.pdf.

[9] In late 2012, Dennis Meadows said that “In the early 1970s, it was possible to believe that maybe we could make the necessary changes. But now it is too late. We are entering a period of many decades of uncontrolled climatic disruption and extremely difficult decline.” See Christian Parenti, “‘The Limits to Growth’: A Book That Launched a Movement,” The Nation, December 24, 2012.

[11] Eddie Yuen, “The Politics of Failure Have Failed: The Environmental Movement and Catastrophism,” in Catastrophism: The Apocalyptic Politics of Collapse and Rebirth, Sasha Lilley, David McNally, Eddie Yuen, James Davis, with a foreword by Doug Henwood. PM Press 2012.  Yuen’s whole line is “the main reasons that [it] has not led to more dynamic social movements; these include catastrophe fatigue, the paralyzing effects of fear; the pairing of overwhelmingly bleak analysis with inadequate solutions, and a misunderstanding of the process of politicization.” 

[12] See Glenn Scherer, “Special Report: IPCC, assessing climate risks, consistently underestimates,” The Daily Climate, December 6, 2012.   More formally (and more interestingly) see Brysse, Oreskes, O’Reilly, and Oppenheimer, “Climate change prediction: Erring on the side of least drama?,” Global Environmental Change 23 (2013), 327-337.

[13] KQED-FM, Forum, July 22, 2003.

[14] Michael Tobis, editor of Planet 3.0, is amusing on this point. He notes that “many data-driven climate skeptics are reassessing the issue,” and that “In 1996 I defined the turning point of the discussion about climate science (the point where we could actually start talking about policy) as the date when the Wall Street Journal would acknowledge the indisputable and apparent fact of anthropogenic climate change; the year in which it would simply be ridiculous to deny it. My prediction was that this would happen around 2015… I’m not sure the WSJ has actually accepted reality yet. It’s just starting to squint in its general direction. 2015 still looks like a good bet.” See http://planet3.org/2012/08/07/is-the-tide-turning/

[15] The Collected Essays, Journalism and Letters of George Orwell: In Front of Your Nose, 1945–1950, Sonia Orwell and Ian Angus, eds., Harcourt Brace Jovanovich, 1968, p. 125.

[16] See, for example, Fatih Birol and Nicholas Stern, “Urgent steps to stop the climate door closing,” The Financial Times, March 9, 2011. And see Sir Robert Watson’s Union Frontiers of Geophysics Lecture at the 2012 meeting of the American Geophysical Union, at http://fallmeeting.agu.org/2012/events/union-frontiers-of-geophysics-lecture-professor-sir-bob-watson-cmg-frs-chief-scientific-adviser-to-defra/

[17] I just wrote “probably still technically possible.” I could have written “Excluding the small probability of a very bad case, and the even smaller probability of a very good case, it’s probably still technically possible to hold the 2°C line, though it wouldn’t be easy.” This, however, is a pretty ugly sentence. I could also have written “Unless we’re unlucky, and the climate sensitivity turns out to be on the high side of the expected range, it’s still technically possible to hold the 2°C line, though it wouldn’t be easy, unless we’re very lucky, and the climate sensitivity turns out to be on the low side.” Saying something like this, though, kind of puts the cart before the horse, since I haven’t said anything about “climate sensitivity,” or about how the scientists think about probability – and of course it’s even uglier. The point, at least for now, is that climate projections are probabilistic by nature, which does not mean that they are merely “uncertain.” We know a lot about the probabilities.

[18] See Kevin Anderson, a former director of Britain’s Tyndall Center, who has been unusually frank on this point.  His views are clearly laid out in a (non-peer-reviewed) essay published by the Dag Hammarskjold Foundation in Sweden.  See “Climate change going beyond dangerous – Brutal numbers and tenuous hope” in Development Dialog #61, September 2012, available at http://www.dhf.uu.se/wordpress/wp-content/uploads/2012/10/dd61_art2.pdf.  For a peer-reviewed paper, see Anderson and Bows, “Beyond ‘dangerous’ climate change: emission scenarios for a new world.”  Philosophical Transactions of The Royal Society, (2011) 369, 20-44 and for a lecture, see “Are climate scientists the most dangerous climate skeptics?” a Tyndall Centre video lecture (September 2010) at http://www.tyndall.ac.uk/audio/are-climate-scientist-most-dangerous-climate-sceptics.

[19] Glen P. Peters et al., “The challenge to keep global warming below 2°C,” Nature Climate Change 3, 4–6 (2013), doi:10.1038/nclimate1783, December 2, 2012. This figure might actually be revised upward, as 2012 saw the second-largest annual concentration increase on record (http://climatedesk.org/2013/03/large-rise-in-co2-emissions-sounds-climate-change-alarm/).

[20] The story of the photos is on Wikipedia – see “blue marble.” For the latest on the Arctic ice, see the “Arctic Sea Ice News and Analysis” page of the National Snow and Ice Data Center: http://nsidc.org/arcticseaicenews/

[21] Climate Progress is covering the “Arctic Death Spiral” in detail.  See for example Joe Romm, “NOAA: Climate Change Driving Arctic Into A ‘New State’ With Rapid Ice Loss And Record Permafrost Warming,” Climate Progress, Dec 6, 2012.  Give yourself a few hours and follow the links.

You Don’t ‘Own’ Your Own Genes: Researchers Raise Alarm About Loss of Individual ‘Genomic Liberty’ Due to Gene Patents (Science Daily)

Mar. 25, 2013 — Humans don’t “own” their own genes, the cellular chemicals that define who they are and what diseases they might be at risk for. Through more than 40,000 patents on DNA molecules, companies have essentially claimed the entire human genome for profit, report two researchers who analyzed the patents on human DNA.

Their study, published March 25 in the journal Genome Medicine, raises an alarm about the loss of individual “genomic liberty.”

In their new analysis, the research team examined two types of patented DNA sequences: long and short fragments. They discovered that 41 percent of the human genome is covered by longer DNA patents that often cover whole genes. They also found that, because many genes share similar sequences within their genetic structure, if all of the “short sequence” patents were allowed in aggregate, they could account for 100 percent of the genome.

Furthermore, the study’s lead author, Dr. Christopher E. Mason of Weill Cornell Medical College, and the study’s co-author, Dr. Jeffrey Rosenfeld, an assistant professor of medicine at the University of Medicine & Dentistry of New Jersey and a member of the High Performance and Research Computing Group, found that short sequences from patents also cover virtually the entire genome — even outside of genes.

“If these patents are enforced, our genomic liberty is lost,” says Dr. Mason, an assistant professor of physiology and biophysics and computational genomics in computational biomedicine at the Institute for Computational Biomedicine at Weill Cornell. “Just as we enter the era of personalized medicine, we are ironically living in the most restrictive age of genomics. You have to ask, how is it possible that my doctor cannot look at my DNA without being concerned about patent infringement?”

The U.S. Supreme Court will review genomic patent rights in an upcoming hearing on April 15. At issue is the right of a molecular diagnostic company to claim patents not only on two key breast and ovarian cancer genes — BRCA1 and BRCA2 — but also on any small sequence of code within BRCA1, including a striking patent for only 15 nucleotides.

In its study, the research team matched small sequences within BRCA1 to other genes and found that just this one molecular diagnostic company’s patents also covered at least 689 other human genes — most of which have nothing to do with breast or ovarian cancer; rather, its patents cover 19 other cancers as well as genes involved in brain development and heart functioning.

“This means if the Supreme Court upholds the current scope of the patents, no physician or researcher can study the DNA of these genes from their patients, and no diagnostic test or drug can be developed based on any of these genes without infringing a patent,” says Dr. Mason.

One Patented Sequence Matched More Than 91 Percent of Human Genes

Dr. Mason undertook the study because he realized that his research into brain and cancer disorders inevitably involved studying genes that were protected by patents.

Under U.S. patent law, genes can be patented by those researchers, either at companies or institutions, who are first to find a gene that promises a useful application, such as for a diagnostic test. For example, the patents received by a company in the 1990s on BRCA1 and BRCA2 enable it to offer a diagnostic test to women who may have, or may be at risk for, breast or ovarian cancer due to mutations in one or both of these genes. Women and their doctors have no choice but to use the services of the patents’ owner, which costs $3,000 per test, “whereas any of the hundreds of clinical laboratories around the country could perform such a test for possibly much less,” says Dr. Mason.

The impact of these patents is equally onerous on research, Dr. Mason adds.

“Almost every day, I come across a gene that is patented — a situation that is common for every geneticist in every lab,” says Dr. Mason.

Dr. Mason and his research partner sought to determine how many other genes may be impacted by gene patents, as well as the overall landscape of intellectual property on the human genome.

To conduct the study, Dr. Mason and Dr. Rosenfeld examined the structure of the human genome in the context of two types of patented sequences: short and long fragments of DNA. They used matches to known genes that were confirmed to be present in patent claims, ranging from as few as 15 nucleotides (the building blocks of DNA) to the full length of all patented DNA fragments.

Before examining the patented sequences, the researchers first calculated how many genes had common segments of 15 nucleotides (a “15mer”), and found that every gene in the human genome matched at least one other gene in this respect, ranging from as few as five gene matches to as many as 7,688. They also discovered that 99.999 percent of 15mers in the human genome are repeated at least twice.

“This demonstrates that short patent sequences are extremely non-specific and that a 15mer claim from one gene will always cross-match and patent a portion of another gene as well,” says Dr. Mason. “This means it is actually impossible to have a 15mer patent for just one gene.”
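The cross-matching logic Dr. Mason describes can be sketched in a few lines. The sequences and function names below are invented for illustration – this is not the authors’ actual pipeline, just a minimal demonstration of why short k-mers are so non-specific:

```python
from collections import defaultdict

def kmer_index(genes, k=15):
    """Map each k-base window to the set of genes containing it."""
    index = defaultdict(set)
    for name, seq in genes.items():
        for i in range(len(seq) - k + 1):
            index[seq[i:i + k]].add(name)
    return index

def cross_matches(genes, k=15):
    """Return genes sharing at least one k-mer with another gene."""
    index = kmer_index(genes, k)
    return {name for name, seq in genes.items()
            if any(len(index[seq[i:i + k]]) > 1
                   for i in range(len(seq) - k + 1))}

# Toy "genes": A and B share a 15-base stretch, C is unrelated.
genes = {
    "geneA": "ACGTACGTACGTACGTTT",
    "geneB": "GGACGTACGTACGTACGT",
    "geneC": "TTTTTTTTTTTTTTTTTT",
}
print(sorted(cross_matches(genes)))  # → ['geneA', 'geneB']
```

A 15mer claim on geneA would thus also read on geneB – the same cross-matching that, scaled to the whole genome, lets short-sequence patents blanket every gene.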

Next, researchers examined the total sequence space in human genes covered by 15mers in current patent claims. They found 58 patents whose claims covered at least 10 percent of all bases of all human genes. The broadest patent claimed sequences that matched 91.5 percent of human genes. Then, when they took existing gene patents and matched patented 15mers to known genes, they discovered that 100 percent of known genes are patented.

“There is a real controversy regarding gene ownership due to the overlap of many competing patent claims. It is unclear who really owns the rights to any gene,” says Dr. Rosenfeld. “While the Supreme Court is hearing one case concerning just the BRCA1 patent, there are also many other patents whose claims would cover those same genes. Do we need to go through every gene to look at who made the first claim to that gene, even if only one small part? If we resort to this rule, then the first patents to be granted for any DNA will have a vast claim over portions of the human genome.”

A further issue of concern is that patents on DNA can readily cross species boundaries. A company can have a patent that they received for cow breeding and have that patent cover a large percentage of human genes. Indeed, the researchers found that one company owns the rights to 84 percent of all human genes for a patent they received for cow breeding. “It seems silly that a patent designed to study cow genetics also claims the majority of human genes,” says Dr. Rosenfeld.

Finally, they also examined the impact of longer claimed DNA sequences from existing gene patents, which ranged from a few dozen bases up to thousands of bases of DNA, and found that these long, claimed sequences matched 41 percent (9,361) of human genes. Their analysis concluded that almost all clinically relevant genes have already been patented, especially for short sequence patents, showing all human genes are patented many times over.

“This is, so to speak, patently ridiculous,” adds Dr. Mason. “If patent claims that use these small DNA sequences are upheld, it could potentially create a situation where a piece of every gene in the human genome is patented by a phalanx of competing patents.”

In their discussion, the researchers argue that the U.S. Supreme Court now has a chance to shape the balance between the medical good versus inventor protection, adding that, in their opinion, the court should limit the patenting of existing nucleotide sequences, due to their broad scope and non-specificity in the human genome.

“I am extremely pro-patent, but I simply believe that people should not be able to patent a product of nature,” Dr. Mason says. “Moreover, I believe that individuals have an innate right to their own genome, or to allow their doctor to look at that genome, just like the lungs or kidneys. Failure to resolve these ambiguities perpetuates a direct threat to genomic liberty, or the right to one’s own DNA.”

Journal Reference:

  1. Jeffrey Rosenfeld and Christopher E. Mason. Pervasive sequence patents cover the entire human genome. Genome Medicine, 2013 (in press). DOI: 10.1186/gm431

The Tar Sands Disaster (N.Y.Times)

OP-ED CONTRIBUTOR

By THOMAS HOMER-DIXON

Published: March 31, 2013

WATERLOO, Ontario

If President Obama blocks the Keystone XL pipeline once and for all, he’ll do Canada a favor.

Canada’s tar sands formations, landlocked in northern Alberta, are a giant reserve of carbon-saturated energy — a mixture of sand, clay and a viscous low-grade petroleum called bitumen. Pipelines are the best way to get this resource to market, but existing pipelines to the United States are almost full. So tar sands companies, and the Alberta and Canadian governments, are desperately searching for export routes via new pipelines.

Canadians don’t universally support construction of the pipeline. A poll by Nanos Research in February 2012 found that nearly 42 percent of Canadians were opposed. Many of us, in fact, want to see the tar sands industry wound down and eventually stopped, even though it pumps tens of billions of dollars annually into our economy.

The most obvious reason is that tar sands production is one of the world’s most environmentally damaging activities. It wrecks vast areas of boreal forest through surface mining and subsurface production. It sucks up huge quantities of water from local rivers, turns it into toxic waste and dumps the contaminated water into tailing ponds that now cover nearly 70 square miles.

Also, bitumen is junk energy. A joule, or unit of energy, invested in extracting and processing bitumen returns only four to six joules in the form of crude oil. In contrast, conventional oil production in North America returns about 15 joules. Because almost all of the input energy in tar sands production comes from fossil fuels, the process generates significantly more carbon dioxide than conventional oil production.
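As a rough back-of-the-envelope check on those figures (the 4-to-6 and roughly 15 joule returns come from the paragraph above; the helper names are mine), the energy “overhead” implied by each return ratio can be computed directly:

```python
def eroi(energy_out, energy_in):
    """Energy returned on energy invested (EROI)."""
    return energy_out / energy_in

# One joule invested yields 4-6 joules of bitumen-derived crude,
# versus about 15 joules for conventional North American oil.
bitumen = (eroi(4, 1), eroi(6, 1))
conventional = eroi(15, 1)

def overhead(ratio):
    """Fraction of gross output effectively consumed by extraction."""
    return 1 / ratio

# At an EROI of 5, a fifth of the energy is spent getting the energy;
# conventional oil spends only about a fifteenth.
print(f"bitumen: {overhead(5):.1%}, conventional: {overhead(15):.1%}")
# → bitumen: 20.0%, conventional: 6.7%
```

Since that input energy is itself mostly fossil-fueled, a lower EROI translates fairly directly into more carbon dioxide per delivered barrel – the point the paragraph makes.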

There is a less obvious but no less important reason many Canadians want the industry stopped: it is relentlessly twisting our society into something we don’t like. Canada is beginning to exhibit the economic and political characteristics of a petro-state.

Countries with huge reserves of valuable natural resources often suffer from economic imbalances and boom-bust cycles. They also tend to have low-innovation economies, because lucrative resource extraction makes them fat and happy, at least when resource prices are high.

Canada is true to type. When demand for tar sands energy was strong in recent years, investment in Alberta surged. But that demand also lifted the Canadian dollar, which hurt export-oriented manufacturing in Ontario, Canada’s industrial heartland. Then, as the export price of Canadian heavy crude softened in late 2012 and early 2013, the country’s economy stalled.

Canada’s record on technical innovation, except in resource extraction, is notoriously poor. Capital and talent flow to the tar sands, while investments in manufacturing productivity and high technology elsewhere languish.

But more alarming is the way the tar sands industry is undermining Canadian democracy. By suggesting that anyone who questions the industry is unpatriotic, tar sands interest groups have made the industry the third rail of Canadian politics.

The current Conservative government holds a large majority of seats in Parliament but was elected in 2011 with only 40 percent of the vote, because three other parties split the center and left vote. The Conservative base is Alberta, the province from which Prime Minister Stephen Harper and many of his allies hail. As a result, Alberta has extraordinary clout in federal politics, and tar sands influence reaches deep into the federal cabinet.

Both the cabinet and the Conservative parliamentary caucus are heavily populated by politicians who deny mainstream climate science. The Conservatives have slashed financing for climate science, closed facilities that do research on climate change, told federal government climate scientists not to speak publicly about their work without approval and tried, unsuccessfully, to portray the tar sands industry as environmentally benign.

The federal minister of natural resources, Joe Oliver, has attacked “environmental and other radical groups” working to stop tar sands exports. He has focused particular ire on groups getting money from outside Canada, implying that they’re acting as a fifth column for left-wing foreign interests. At a time of widespread federal budget cuts, the Conservatives have given Canada’s tax agency extra resources to audit registered charities. It’s widely assumed that environmental groups opposing the tar sands are a main target.

This coercive climate prevents Canadians from having an open conversation about the tar sands. Instead, our nation behaves like a gambler deep in the hole, repeatedly doubling down on our commitment to the industry.

President Obama rejected the pipeline last year but now must decide whether to approve a new proposal from TransCanada, the pipeline company. Saying no won’t stop tar sands development by itself, because producers are busy looking for other export routes — west across the Rockies to the Pacific Coast, east to Quebec, or south by rail to the United States. Each alternative faces political, technical or economic challenges as opponents fight to make the industry unviable.

Mr. Obama must do what’s best for America. But stopping Keystone XL would be a major step toward stopping large-scale environmental destruction, the distortion of Canada’s economy and the erosion of its democracy.

Thomas Homer-Dixon, who teaches global governance at the Balsillie School of International Affairs, is the author of “The Upside of Down: Catastrophe, Creativity and the Renewal of Civilization.”

New Models For Clean Energy Funding Offer Hope (Earth Techling)

by Institute For Local Self-Reliance

March 23rd, 2013

Three years ago, the prospects for Americans to own their energy future seemed relatively bleak. There were almost no replicable models for doing community-based energy projects or investment, despite falling costs and technology – solar and wind – that lend themselves to local development.

But thanks to recent opportunities in community solar and crowdfunding, we may see a renewable energy market in America where everyone wins.

Let’s start with solar. It’s the ultimate decentralized renewable energy – sunshine falls everywhere – and its cost is falling so fast that, within a decade, 300 gigawatts of unsubsidized solar will be competitive with local electricity prices in communities across the country. In 2010, just one model for developing community solar had proved readily replicable and there was no practical way to pool a community’s collective capital to invest in local energy (except perhaps a municipal utility, a story for another time). Since nearly three-quarters of residential rooftops are not suitable for solar, it was hard to see how most Americans could use the sun to brighten their energy future.

But in 2013, community solar is rising fast. Colorado’s community solar gardens program – selling out its 9 megawatt limit in half an hour – illustrates a powerful model for letting people pool their money to go solar, even if their own roof isn’t suitable or isn’t sunny. Some companies in Colorado have already brought their model to other states, like the Clean Energy Collective‘s community solar project with the Wright-Hennepin Electric Cooperative in Minnesota, and other states (like Minnesota) are considering legislation to expand the opportunity.

The year 2013 may also be remembered for opening the crowdfunding floodgates.

In late 2012, California-based (Solar) Mosaic launched its first community solar investment project, allowing 51 California investors to earn 6.38% returns for investing in a 47 kilowatt (kW) solar array on the roof of the Youth Employment Partnership in Oakland. Its subsequent 235 kW project ups the ante, and was open to regular folks in California and New York (and accredited investors in all 50 states). It sold out in just 24 hours to over 400 investors with an average stake of just $700. The investment uses a common securities law exemption (Rule 506 of Regulation D), and investors will earn a 4.5% annual return (net of fees) over 9 years, greening the economy and their pocketbooks.
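For a sense of scale, the quoted numbers can be compounded (a simplification made here purely for illustration – notes like these typically pay out principal and interest over the term rather than compounding, so this is only a rough upper bound):

```python
def compound(principal, rate, years):
    """Future value of principal growing at a fixed annual rate."""
    return principal * (1 + rate) ** years

stake = 700     # the average investor stake reported above, in dollars
rate = 0.045    # 4.5% annual return, net of fees
years = 9       # term of the investment

print(f"${compound(stake, rate, years):,.2f}")  # → $1,040.27
```

Under that simplification, a $700 stake grows to about $1,040 over the nine-year term – modest, but well ahead of the savings-account alternatives.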

The key advantage of Solar Mosaic is the investment. Previous community solar projects have relied on shared electricity savings for participants, sometimes called virtual net metering. This limits prospective investors to the same utility service territory, and the savings can’t be taken to a property outside that area. The Mosaic model turns community solar into a simple investment, letting prospective investors select a particular Mosaic project to invest in, with significantly higher returns than parking money in a U.S. Treasury or savings account. For now, it’s limited to broad participation in just two states, New York and California, but Mosaic is “working hard” to expand the opportunity.

Mosaic may be just the first salvo in a firestorm of community renewable energy investment. The federal JOBS Act of 2012 is intended to create a new class of investment securities, with much lower upfront and legal costs, that would let crowds pool up to $1 million for solar and other renewable energy projects. The only “drawback” in the Mosaic model is that it doesn’t explicitly connect geography with investment. A New York City resident, for example, can invest in a project in California, but not in Manhattan or the Bronx. If this model continues to be successful, however, it’s likely that will change.

Crowdfunding doesn’t have to be limited to renewable energy, either. People could pool their resources to invest in block-by-block residential energy efficiency retrofits, reducing their own and their neighbors’ energy bills and sharing the energy savings with other local investors. Crowdfunding for energy efficiency could be combined with commercial building energy ratings (just enacted in Minneapolis, MN, for example) to target the least efficient buildings with the most potential for savings. Local shared investment wouldn’t just tap and share more energy savings, but would boost the local economy by putting idled laborers to work making buildings more cost-effective and less climate-harming.

Both community solar and crowdfunding are in their infancy, but they represent two powerful tools for Americans to take charge of their energy future.

Germany Has Created An Accidental Empire (Social Europe)

25/03/2013 BY ULRICH BECK

Are we now living in a German Europe? In an interview with EUROPP editors Stuart A Brown and Chris Gilson, Ulrich Beck discusses German dominance of the European Union, the divisive effects of austerity policies, and the relevance of his concept of the ‘risk society’ to the current problems being experienced in the Eurozone.

How has Germany come to dominate the European Union?

Well it happened somehow by accident. Germany has actually created an ‘accidental empire’. There is no master plan; no intention to occupy Europe. It doesn’t have a military basis, so all the talk about a ‘Fourth Reich’ is misplaced. Rather it has an economic basis – it’s about economic power – and it’s interesting to see how in the anticipation of a European catastrophe, with fears that the Eurozone and maybe even the European Union might break down, the landscape of power in Europe has changed fundamentally.

First of all there’s a split between the Eurozone countries and the non-Eurozone countries. Suddenly for example the UK, which is only a member of the EU and not a member of the Eurozone, is losing its veto power. It’s a tragic comedy how the British Prime Minister is trying to tell us that he is still the one who is in charge of changing the European situation. The second split is that among the Eurozone countries there is an important division of power between the lender countries and the debtor countries. As a result Germany, the strongest economic country, has become the most powerful EU state.

Are austerity policies dividing Europe?

Indeed they are, in many ways. First of all we have a new line of division between northern European and southern European countries. Of course this is very evident, but the background from a sociological point of view is that we are experiencing the redistribution of risk from the banks, through the states, to the poor, the unemployed and the elderly. This is an amazing new inequality, but we are still thinking in national terms and trying to locate this redistribution of risk in terms of national categories.

At the same time there are two leading ideologies in relation to austerity policies. The first is pretty much based on what I call the ‘Merkiavelli’ model – by this I mean a combination of Niccolò Machiavelli and Angela Merkel. On a personal level, Merkel takes a long time to make decisions: she’s always waiting until some kind of consensus appears. But this kind of waiting makes the countries that depend on Germany’s decisions realise that Germany actually holds the power. This deliberate hesitation is quite an interesting strategy in terms of the way that Germany has taken over economically.

The second element is that Germany’s austerity policies are not based simply on pragmatism, but also on underlying values. The German objection to countries spending more money than they have is a moral issue which, from a sociological point of view, ties in with the ‘Protestant Ethic’. It’s a perspective which has Martin Luther and Max Weber in the background. But this is not seen as a moral issue in Germany, instead it’s viewed as economic rationality. They don’t see it as a German way of resolving the crisis; they see it as if they are the teachers instructing southern European countries on how to manage their economies.

This creates another ideological split because the strategy doesn’t seem to be working so far and we see many forms of protest, of which Cyprus is the latest example. But on the other hand there is still a very important and powerful neo-liberal faction in Europe which continues to believe that austerity policies are the answer to the crisis.

Is the Eurozone crisis proof that we live in a risk society?

Yes, this is the way I see it. My idea of the risk society could easily be misunderstood because the term ‘risk’ actually signifies that we are in a situation to cope with uncertainty, but to me the risk society is a situation in which we are not able to cope with the uncertainty and consequences that we produce in society.

I make a distinction between ‘first modernity’ and our current situation. First modernity, which lasted from around the 18th century until perhaps the 1960s or 1970s, was a period where there was a great deal of space for experimentation and we had a lot of answers for the uncertainties that we produced: probability models, insurance mechanisms, and so on. But then because of the success of modernity we are now producing consequences for which we don’t have any answers, such as climate change and the financial crisis. The financial crisis is an example of the victory of a specific interpretation of modernity: neo-liberal modernity after the breakdown of the Communist system, which dictates that the market is the solution and that the more we increase the role of the market, the better. But now we see that this model is failing and we don’t have any answers.

We have to make a distinction between a risk society and a catastrophe society. A catastrophe society would be one in which the motto is ‘too late’: where we give in to the panic of desperation. A risk society in contrast is about the anticipation of future catastrophes in order to prevent them from happening. But because these potential catastrophes are not supposed to happen – the financial system could collapse, or nuclear technology could be a threat to the whole world – we don’t have the basis for experimentation. The rationality of calculating risk doesn’t work anymore. We are trying to anticipate something that is not supposed to happen, which is an entirely new situation.

Take Germany as an example. If we look at Angela Merkel, a few years ago she didn’t believe that Greece posed a major problem, or that she needed to engage with it as an issue. Yet now we are in a completely different situation because she has learned that if you look into the eyes of a potential catastrophe, suddenly new things become possible. Suddenly you think about new institutions, or about the fiscal compact, or about a banking union, because you anticipate a catastrophe which is not supposed to happen. This is a huge mobilising force, but it’s highly ambivalent because it can be used in different ways. It could be used to develop a new vision for Europe, or it could be used to justify leaving the European Union.

How should Europe solve its problems?

I would say that the first thing we have to think about is what the purpose of the European Union actually is. Is there any purpose? Why Europe and not the whole world? Why not do it alone in Germany, or the UK, or France?

I think there are four answers in this respect. First, the European Union is about enemies becoming neighbours. In the context of European history this actually constitutes something of a miracle. The second purpose of the European Union is that it can prevent countries from being lost in world politics. A post-European Britain, or a post-European Germany, is a lost Britain, and a lost Germany. Europe is part of what makes these countries important from a global perspective.

The third point is that we should not only think about a new Europe, we also have to think about how the European nations have to change. They are part of the process and I would say that Europe is about redefining the national interest in a European way. Europe is not an obstacle to national sovereignty; it is the necessary means to improve national sovereignty. Nationalism is now the enemy of the nation because only through the European Union can these countries have genuine sovereignty.

The fourth point is that European modernity, which has been distributed all over the world, is a suicidal project. It’s producing all kinds of basic problems, such as climate change and the financial crisis. It’s a bit like if a car company created a car without any brakes and it started to cause accidents: the company would take these cars back to redesign them and that’s exactly what Europe should do with modernity. Reinventing modernity could be a specific purpose for Europe.

Taken together these four points form what you could say is a grand narrative of Europe, but one basic issue is missing in the whole design. So far we’ve thought about things like institutions, law, and economics, but we haven’t asked what the European Union means for individuals. What do individuals gain from the European project? First of all I would say that, particularly in terms of the younger generation, more Europe is producing more freedom. It’s not only about the free movement of people across Europe; it’s also about opening up your own perspective and living in a space which is essentially grounded on law.

Second, European workers, but also students as well, are now confronted with the kind of existential uncertainty which needs an answer. Half of the best educated generation in Spanish and Greek history lack any future prospects. So what we need is a vision for a social Europe in the sense that the individual can see that there is not necessarily social security, but that there is less uncertainty. Finally we need to redefine democracy from the bottom up. We need to ask how an individual can become engaged with the European project. In that respect I have made a manifesto, along with Daniel Cohn-Bendit, called “We Are Europe”, arguing that we need a free year for everyone to do a project in another country with other Europeans in order to start a European civil society.

A more detailed discussion of the topics covered in this article is available in Ulrich Beck’s latest book, German Europe (Polity 2013). This interview was first published on EUROPP@LSE

‘Networked Minds’ Require Fundamentally New Kind of Economics (Science Daily)

Mar. 20, 2013 — In their computer simulations of human evolution, scientists at ETH Zurich find the emergence of the “homo socialis” with “other-regarding” preferences. The results explain some intriguing findings in experimental economics and call for a new economic theory of “networked minds”.


Economics has a beautiful body of theory. But does it describe real markets? Doubts have grown, not least in the wake of the financial crisis, since financial crashes should not occur according to the then-established theories. For ages, economic theory has been based on concepts such as efficient markets and the “homo economicus”, i.e. the assumption of competitively optimizing individuals and firms. It was believed that any behavior deviating from this would create disadvantages and would hence be eliminated by natural selection. But experimental evidence from behavioral economics shows that, on average, people behave in a more fairness-oriented and other-regarding way than this assumes. A new theory by scientists from ETH Zurich now explains why.

“We have simulated interactions of individuals facing social dilemma situations, where it would be favorable for everyone to cooperate, but non-cooperative behavior is tempting,” explains Dr. Thomas Grund, one of the authors of the study. “Hence, cooperation tends to erode, which is bad for everyone.” This may create tragedies of the commons such as over-fishing, environmental pollution, or tax evasion.

Evolution of “friendliness”

Prof. Dirk Helbing of ETH Zurich, who coordinated the study, adds: “Compared to conventional models for the evolution of social cooperation, we have distinguished between the actual behavior – cooperation or not – and an inherited character trait describing the degree of other-regarding preferences, which we call friendliness.” An individual’s actual behavior takes into account not only its own advantage (“payoff”), but also gives a weight to the payoffs of its interaction partners, depending on that individual’s friendliness. For the “homo economicus”, the weight is zero. Friendliness is passed from one generation to the next according to natural selection, which acts solely on an individual’s own payoff; mutations, however, occur.

For most parameter combinations, the model predicts the evolution of a payoff-maximizing “homo economicus” with selfish preferences, as assumed by a great share of the economic literature. Very surprisingly, however, biological selection may create a “homo socialis” with other-regarding preferences – namely, if offspring tend to stay close to their parents. In such a case, clusters of friendly people, who are “conditionally cooperative”, may evolve over time.

If an unconditionally cooperative individual is born by chance, it may be exploited by everyone and not leave any offspring. However, if born in a favorable, conditionally cooperative environment, it may trigger cascade-like transitions to cooperative behavior, such that other-regarding behavior pays off. Consequently, a “homo socialis” spreads.
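The mechanism described in the last few paragraphs – a utility that weights partners’ payoffs by an inherited friendliness trait, selection acting only on an agent’s own payoff, and offspring staying close to their parents – can be sketched in a toy simulation. This is an illustrative reconstruction, not the authors’ actual model from the Scientific Reports paper; the payoff values, population size, pairing scheme, and mutation rate below are arbitrary assumptions.

```python
import random

# Prisoner's dilemma payoffs (illustrative values): T > R > P > S
T, R, P, S = 1.2, 1.0, 0.0, -1.0

def cooperates(friendliness, partner_cooperated):
    """Conditionally cooperative choice: pick the action maximizing
    utility = own payoff + friendliness * partner's payoff, assuming
    the partner repeats its last move. friendliness = 0 corresponds
    to the "homo economicus"."""
    if partner_cooperated:
        u_coop, u_defect = R + friendliness * R, T + friendliness * S
    else:
        u_coop, u_defect = S + friendliness * T, P + friendliness * P
    return u_coop > u_defect

def simulate(n=20, rounds=500, mutation=0.05, seed=1):
    """Agents in fixed pairs; each generation the fittest agent places
    a mutated copy of its friendliness trait on the neighboring site."""
    random.seed(seed)
    friendly = [0.0] * n      # everyone starts as "homo economicus"
    last = [False] * n        # last moves (everyone defected initially)
    payoff = [0.0] * n        # accumulated own payoff (drives selection)
    for _ in range(rounds):
        for i in range(0, n, 2):
            j = i + 1
            a = cooperates(friendly[i], last[j])
            b = cooperates(friendly[j], last[i])
            payoff[i] += (R if b else S) if a else (T if b else P)
            payoff[j] += (R if a else S) if b else (T if a else P)
            last[i], last[j] = a, b
        # selection on own payoff only; offspring stay close to the parent
        best = max(range(n), key=payoff.__getitem__)
        child = best + 1 if best % 2 == 0 else best - 1
        mutated = friendly[best] + random.uniform(-mutation, mutation)
        friendly[child] = min(1.0, max(0.0, mutated))
        payoff[child] = 0.0
    return sum(friendly) / n  # mean friendliness after evolution
```

With friendliness 0 everywhere, every agent defects (since T > R), reproducing the “homo economicus” outcome; with these payoffs an agent cooperates with a cooperating partner once its friendliness exceeds 0.1, but still defects against defectors, so a friendly pair can lock into mutual cooperation – a drastically simplified stand-in for the conditionally cooperative clusters described above.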

Networked minds create a cooperative human species

“This has fundamental implications for the way economic theories should look,” underlines Professor Helbing. Most of today’s economic knowledge concerns the “homo economicus”, but people wonder whether that theory really applies. A comparable body of work for the “homo socialis” still needs to be written.

“While the ‘homo economicus’ optimizes its utility independently, the ‘homo socialis’ puts himself or herself into the shoes of others to consider their interests as well,” explains Grund, and Helbing adds: “This establishes something like ‘networked minds’. Everyone’s decisions depend on the preferences of others.” This becomes even more important in our networked world.

A participatory kind of economy

How will this change our economy? Today, many customers doubt that they get the best service from people who are driven by their own profits and bonuses. “Our theory predicts that the level of other-regarding preferences is distributed broadly, from selfish to altruistic. Academic education in economics has largely promoted the selfish type. Perhaps our economic thinking needs to change fundamentally, and our economy should be run by different kinds of people,” suggests Grund. “The true capitalist has other-regarding preferences,” adds Helbing, “as the ‘homo socialis’ earns much more payoff.” This is because the “homo socialis” manages to overcome the downward spiral that tends to drive the “homo economicus” toward tragedies of the commons. The breakdown of trust and cooperation in the financial markets back in 2008 might be seen as a good example.

“Social media will promote a new kind of participatory economy, in which competition goes hand in hand with cooperation,” believes Helbing. Indeed, the digital economy’s paradigm of the “prosumer” states that the Internet, social platforms, 3D printers and other developments will enable the co-producing consumer. “It will be hard to tell who is consumer and who is producer”, says Christian Waloszek. “You might be both at the same time, and this creates a much more cooperative perspective.”

Journal Reference:

  1. Thomas Grund, Christian Waloszek, Dirk Helbing. How Natural Selection Can Create Both Self- and Other-Regarding Preferences, and Networked Minds. Scientific Reports, 2013; 3. DOI: 10.1038/srep01480

Experts say pre-salt oil should bring Brazil economic and scientific benefits (Jornal da Ciência)

JC e-mail 4665, February 15, 2013.

Viviane Monteiro

The country cannot miss the opportunity to use oil royalties to invest in education and scientific research

Despite the environmental risks, exploiting the oil of the pre-salt layer should, in the long run, secure new levels of economic, scientific, and technological development for the country. That is the prevailing opinion among petroleum specialists and researchers at the Instituto Alberto Luiz Coimbra de Pós-Graduação e Pesquisa de Engenharia of the Universidade Federal do Rio de Janeiro (Coppe-UFRJ), the Universidade Federal do Espírito Santo (UFES), and the Escola Politécnica of the Universidade de São Paulo (Poli-USP).

In their view, Brazil cannot pass up the opportunity to exploit the pre-salt, nor to use the royalties from the oil extracted from this deep layer to invest in education and in scientific and technological research. One goal of these investments should be to produce clean, renewable energy to replace fossil fuel in the “post-oil” period, expected to arrive within roughly the next five decades.

Given the exploitation of the pre-salt, Segen Estefen, director of technology and innovation at Coppe/UFRJ, says Brazil should become one of the world leaders in producing cutting-edge technologies both for oil exploration and for the development of clean, renewable energy. Pre-salt exploitation, he believes, represents a window of opportunity for Brazil to rank among the world’s largest oil producers, joining the front ranks of the Organization of the Petroleum Exporting Countries (OPEC).

In the last two years the country has risen from 18th to 13th place in the ranking of oil producers, according to the “Statistical Review of World Energy 2011” report by the British company British Petroleum (BP). With the discovery of the pre-salt deposits, estimates of national oil reserves have grown from 8 billion barrels, around 2006, to between 60 billion and 70 billion today. Running the numbers, Segen calculates that these figures would represent revenue of US$ 4 trillion for the country, taking into account the current price of a barrel of oil (US$ 100) – an amount similar to the current national Gross Domestic Product (GDP) of R$ 4.143 trillion in 2011.

On the subject of oil reserves, researcher and professor Eustáquio Vinícius de Castro, of the Petroleum Laboratory at UFES, agrees that the pre-salt will place Brazil among the five largest oil producers in the world, alongside Saudi Arabia, the United States, and Venezuela. “The technology to be developed for pre-salt exploitation should also be extended to other areas, especially the metal-mechanical and environmental chemistry industries,” he says.

As an example, Castro cites drilling equipment for ultra-deep areas capable of withstanding high pressures, which could also be used in civil construction, and chemical agents (additives) to be used in equipment for removing impurities and purifying pre-salt oil. “These additives can even be used to purify wastewater generated by paint manufacturers, and to clean up rivers or urban sewage,” he adds.

The Norwegian model – Also a proponent of pre-salt exploitation, professor Ricardo Cabral de Azevedo, of the Department of Mining and Petroleum Engineering at Poli/USP, advises Brazil to adopt Norway’s model for extracting pre-salt oil and to avoid the so-called “Dutch disease”. “Other countries that had large reserves to explore and produce are examples of what we should or should not do in Brazil,” he explains. “The Netherlands, for example, suffered what became known as ‘Dutch disease’ because its economy became excessively dependent on oil. Norway, by contrast, transformed itself radically and is today one of the countries with the highest HDI [Human Development Index] in the world,” he recalls.

Until then, Norway had been one of the poorest countries in Europe, its finances depending mainly on commodity exports such as ores and canned fish. The turnaround in the Norwegian economy began in 1969, when large oil reserves were discovered in the North Sea and the revenue was directed mainly to health and education. Today this European country has the third-highest per capita income in the world (US$ 59,300) and the highest HDI on the planet.

Royalties for education and ST&I – To meet the challenges of extracting oil from Brazil’s pre-salt layer, the specialists stress the need to direct a significant share of the revenue from this activity to education, science, technology, and innovation, following the Norwegian model. This is, in fact, a banner raised by the scientific community, represented by the Sociedade Brasileira para o Progresso da Ciência (SBPC).

The specialists are unanimous in stating that the country needs to take advantage of the pre-salt riches to reach new levels of development, make a leap in educational quality, and improve its human capital – remembering that one day the oil reserves will run out. “We should remember that these reserves are very large, but finite,” warns Azevedo. “It is up to us to turn them into a permanent legacy by investing in education and in the development of our country.”

Estefen, Coppe/UFRJ’s director of technology and innovation, adds that the country needs to prepare the ground, in scientific and technological research, for the post-oil period. Here he considers it essential to secure investments that considerably expand research and scientific studies for developing clean, renewable energy technologies, noting that several countries are working toward reducing emissions in the medium term. It is worth noting that oil is a fossil fuel that contributes significantly to the greenhouse effect.

Castro, the UFES researcher, who views favorably the proposed pre-salt fund (a sovereign fund) – to which half the revenue from the oil extracted from ultra-deep waters would be allocated for education – says pre-salt exploitation must be intelligent, with environmental responsibility and investment in education. “Oil brings great wealth, but it can also bring great poverty and great environmental damage,” he notes. “That is why exploitation must be done intelligently, with environmental responsibility and investment in education.” Today oil wealth is distributed to the states, municipalities, and the federal government through royalties. Under current law, the funds must be invested in the country’s social programs, “but city governments misuse the funds,” he says.

Exploiting the pre-salt requires scientific and technological effort, considering that the reservoirs lie nearly seven thousand meters below sea level, notably in the Santos (SP) and Campos (RJ) basins. To meet these challenges, Estefen says the country needs to mobilize the national scientific community and its available knowledge, create new laboratories, train human capital, and generate quality jobs. “Extracting pre-salt oil will demand a great technological effort, an effort that will help Brazil reach new levels of development in the future,” he says. “That is, if we use the pre-salt resources well, we will educate our children and develop industry, science, and technology. If it follows this prescription, Brazil should stand out internationally as one of the technological leaders, alongside the United States, Japan, and the European countries,” he concludes.

Researchers weigh the environmental costs of deep-water exploitation
Weighing the benefits that pre-salt riches can bring the country against the potential environmental costs, researchers conclude that Brazil cannot unilaterally renounce deep-water oil exploitation, even while recognizing that burning oil contributes to global warming. This does not mean the pre-salt exploitation process should disregard environmental damage.

Segen Estefen, Coppe/UFRJ’s director of technology and innovation, insists that all ongoing research has environmental protection in view, in an attempt to make operations safer. “It makes no sense for Brazil to benefit from oil for three or four decades but leave the country in a bad environmental situation,” he explains.

Today Coppe researchers, for example, work simultaneously on topics related both to present-day oil production and to other technologies that can be used in the “post-oil” era. Among other things, they study generating electricity from ocean waves – a clean, renewable energy – using the same infrastructure built and financed by the oil industry to develop knowledge for the post-oil period.

For the Coppe/UFRJ specialist, Brazil cannot renounce pre-salt oil because it “is an important source of wealth for Brazil”, being a competitive energy source. The extraction of the pre-salt, he adds, should therefore yield positive results for the country. “In Brazil, still marked by so much inequality, we cannot give up these riches,” he says. “If they are not exploited, perhaps in 50 years oil will not be worth half its current price.” For now, Estefen adds, no fuel can replace oil, and none is forecast for roughly the next 20 years. Moreover, demand for this energy tends to grow sharply with population growth and rising demand from other countries, especially in Asia.

Sharing this view, professor Ricardo Cabral de Azevedo, of the Department of Mining and Petroleum Engineering at the Escola Politécnica of the Universidade de São Paulo (Poli/USP), considers it ideal for the country to invest in the knowledge needed to gradually replace fossil fuel, in an attempt to minimize environmental impacts. “The fact is that there will always be risks, in this or any other activity, but human beings still need oil,” he notes. “So the essential thing is to try to reduce them as much as possible. Here, too, past experience is fundamental, so that we learn from mistakes already made.”

The potential socioeconomic returns from pre-salt oil exploitation outweigh the environmental risks, in the view of researcher and professor Eustáquio Vinícius de Castro, of the Petroleum Laboratory at the Universidade Federal do Espírito Santo (UFES). “They are worth it as long as things happen intelligently and sustainably, with rationality in the production process,” he says. “Today the oil companies, which in the past were more polluting, adopt more safety in the oil extraction process, even if some problems occur from time to time.”

Scale – The pre-salt oil layer covers an area roughly 800 km long by 200 km wide, following the coastline of southeastern Brazil. According to Petrobras data, more than 80 wells have been drilled since 2006 in the Santos and Campos basins, with an “exploratory success” rate above 80%. Nineteen new platforms are expected to enter operation by 2016, and another 19 by 2020. According to its press office, the oil company, the leader in pre-salt exploitation, has also ordered 21 production platforms and 28 offshore drilling rigs to be built in the country by 2020, plus 49 tankers and hundreds of offshore support and service vessels.

The Folly of Defunding Social Science (Huffington Post)

Scott Atran

Posted: 03/15/2013 10:55 pm

With the so-called sequester geared to cut billions of dollars from domestic programs, military funding, social services, and government-sponsored scientific research — including roughly a 6 percent reduction for the National Institutes of Health and the National Science Foundation — policymakers and professionals are scrambling to stave off the worst by resetting priorities. One increasingly popular proposal among congressional budget hawks is to link federal science funding directly to graduate employment data. For example, federal legislation introduced by Senators Marco Rubio (R-Fla.) and Ron Wyden (D-Ore.) would require states to match information from unemployment insurance databases with individual student data and publish the results, showing earnings by program at each institution of higher education. But educators and economists note that measuring return on investment by salary alone is too simplistic: liberal arts majors often start out at lower salaries but out-earn their peers in later decades. Even more worrisome, in the guise of practicality, such maneuvers offer a not-so-veiled justification for eliminating government funding for the social sciences, perilously underestimating their importance and impact to the economy and national welfare.

In a major speech last month, Eric Cantor, the U.S. House majority leader, proposed outright to defund political and social science: “Funds currently spent by the government on social science — including on politics of all things — would be better spent on curing diseases.” Cantor’s call to gut the federal research budget for social science echoes Florida governor Rick Scott’s push to eliminate state funding for disciplines like anthropology and psychology in favor of “degrees where people can get jobs,” especially in technology and medicine. Targeting the social sciences with little understanding of their content is an old story for legislators looking to score cheap political points. The late Sen. William Proxmire (D-Wis.) used to scour the titles of NSF-funded projects in psychology and anthropology, looking for recipients of his Golden Fleece Awards without bothering to examine the results of the research he myopically pilloried. Such shenanigans ignore the fact that social science research provides precise knowledge relevant to people’s practical needs and the nation’s economic and security priorities. Most government laws, programs, and outlays directly concern social issues, including the establishment and means of government itself, and the need for law enforcement, military capabilities, education, and commerce.

Gutting social science also undermines national security. For, despite hundreds of billions of taxpayer dollars poured into the global war on terrorism, radicalization against our country’s core interests continues to spread — and social science offers better ways than war to turn the tide. Moreover, social science is in fact moving the “hard” sciences forward. For example, recent research based on social science modeling of cancer cells as cooperative agents in competition with communities of healthy cells holds the promise of more effective cancer treatment. Those who would defund social science seriously misconstrue the relationship between the wide-ranging freedom of scientific research and its ability to unlock the deeper organizing principles linking seemingly unrelated phenomena.

The Founding Fathers envisioned a Republic with an enlightened citizenry educated in “all philosophical Experiments that Light into the Nature of Things … and multiply the Conveniences or Pleasures of Life” — not just technical training for jobs that pay well.

Social Warfare (Foreign Policy)

Budget hawks’ plans to cut funding for political and social science aren’t just short-sighted and simple-minded — they’ll actually hurt national security.

BY SCOTT ATRAN | MARCH 15, 2013

With the automatic sequestration cuts geared up to slash billions of dollars from domestic programs, military funding, social services, and government-sponsored scientific research — including about a 6 percent reduction for the National Institutes of Health (NIH) and the National Science Foundation (NSF) — policymakers and professionals are scrambling to stave off the worst by resetting priorities. In a major speech last month, House majority leader Eric Cantor (R-VA) proposed outright to defund political and social science: “Funds currently spent by the government on social science — including on politics of all things — would be better spent on curing diseases,” he said, echoing a similar proposal he made in 2009. Florida Governor Rick Scott has made a similar push, proposing to divert state funds from disciplines like anthropology and psychology “to degrees where people can get jobs,” especially in technology and medicine. Those are fighting words, but they’re also simple-minded.

Social science may sound like a frivolous expenditure to legislative budget hawks, but far from trimming fat, defunding these programs would fundamentally undercut core national interests. Like it or not, social science research informs everything from national security to technology development to healthcare and economic management. For example, we can’t decide which drugs to take, unless their risks and benefits are properly assessed, and we can’t know how much faith to have in a given science or engineering project, unless we know how much to trust expert judgment. Likewise, we can’t fully prepare to stop our adversaries, unless we understand the limits of our own ability to see why others see the world differently. Despite hundreds of billions of taxpayer dollars poured into the global war on terrorism, radicalization against our country’s core interests continues to spread — and social science offers better ways than war to turn the tide.

In support of Rep. Cantor’s push to defund political and social science, a recent article in the Atlantic notes that “money [that] could have gone toward life-saving cancer research” instead went to NSF-sponsored projects that “lack real-world impact,” such as “the $750,000 spent studying the ‘sacred values’ involved in cultural conflict.” Perhaps the use of words like “sacred” or “culture” incites such scorn, but as often occurs in denunciations of social science, scant attention is actually paid to what the science proposes or produces. In fact, the results of this particular project — which I direct — have figured in numerous briefings to the National Security Staff at the White House, Senate and House committees, the Department of State, Britain’s Parliament, and the Israeli Knesset (including the prime minister and defense minister). In addition, the research offices of the Department of Defense have supported my team’s work, which figures prominently in recent strategy assessments focused on al Qaeda and broader problems of radicalization and political violence.

Let me try to explain just what it is that we do. My research team conducts laboratory experiments, including brain imaging studies — supported by field work with political leaders, revolutionaries, terrorists, and others — that show sacred values to be core determinants of personal and social identity (“who I am” and “who we are”). Humans process these identities as moral rules, duties, and obligations that defy the utilitarian and instrumental calculations of realpolitik or the marketplace. Simply put, people defending a sacred value will not trade its incarnation (Israel’s settlements, Iran’s nuclear fuel rods, America’s guns) for any number of iPads, or even for peace.

The sacred values of “devoted actors,” it turns out, generate actions independent of calculated risks, costs, and consequences — a direct contradiction of prevailing “rational actor” models of politics and economics, which focus on material interests. Devoted actors, in contrast, act because they sincerely and deeply believe “it’s the right thing to do,” regardless of risks or rewards. Practically, this means that such actors often harness deep and abiding social and political commitments to confront much stronger foes. Think of the American revolutionaries, who were willing to sacrifice “our lives, our fortunes and our sacred honor” in the fight for liberty against the greatest military power of the age — or modern suicide bombers willing to sacrifice everything for their cause.

Sacred values — as when land becomes “Holy Land” — sustain the commitment of revolutionaries and some terrorist groups to resist, and often overcome, more numerous and better-equipped militaries and police that function with measured rewards like better pay or promotion. Our research with political leaders and general populations also shows that sacred values — not political games or economics — underscore intractable conflicts like those between the Israelis and the Palestinians that defy the rational give-and-take of business-like negotiation. Field experiments in Israel, Palestine, Nigeria, and the United States indicate that commitment to such values can motivate and sustain wars beyond reasonable costs and casualties.

So what are the practical implications of these findings? Perhaps most importantly, our research explains why efforts to broker peace that rely on money or other material incentives are doomed when core values clash. In our studies with colleagues in Afghanistan, India, Indonesia, Iran, the Levant, and North Africa, we found that offers of material incentives to compromise on sacred values often backfire, actually increasing anger and violence toward a deal. For example, a 2010 study of attitudes toward Iran’s nuclear program found that most Iranians do not view the country’s nuclear program as sacred. But for about 13 percent of the population, the program has been made sacred through religious rhetoric. This group, which tends to be close to the regime, now believes a nuclear program is bound up with the national identity and with Islam itself. As a result, offering these people material rewards or punishments to abandon the program only increases their anger and support for it. Predictably, new sanctions, or heightened perception of sanctions, generate even more belligerent statements and actions by the regime to increase the pace, industrial capacity, and level of uranium enrichment. Of course, majority discontent with sanctions may yet force the regime to change course, or to double down on repression.

Understanding how this process plays out over time is a key to helping friends, thwarting enemies, and managing conflict. The ultimate goal of such research is to help save lives, resources, and national treasure. And by generating psychological knowledge about how culturally diverse individuals and groups advance values and interests that are potentially compatible or fundamentally antagonistic to our own, it can help keep the nation’s citizens, soldiers, and potential allies out of harm’s way. Our related research on the spiritual and material aspects of environmental disputes between Native American and majority-culture populations in North America and Central America has also revealed surprising but practical ways to reduce conflict and sustainably manage forest commons and wildlife.

The would-be defunders of social science denounce an ivory tower that seems to exist only in their imagination — willfully ignoring evidence-based reasoning and results in order to advance a political agenda. Only $11 million of the NSF’s $7 billion-plus budget goes to political science research. It is exceedingly doubtful that getting rid of the entire NSF political science budget, which is equal to 0.5 percent of the cost of a single B-2 bomber, would really help to produce life-saving cancer research, where developing and testing even a single drug can cost more than a B-2. Not that we must choose between either, mind you.

Social science is in fact moving the “hard” sciences forward. Consider the irony: a close collaborator on the “sacred values” project, Robert Axelrod, former president of the American Political Science Association, recently produced a potentially groundbreaking cancer study based on social science modeling of cancer cells as cooperative agents in competition with communities of healthy cells. Independent work by cancer researchers in the United States and abroad has established that the cooperation among tumor cells that Axelrod and colleagues proposed does in fact take place in cell lines derived from human cancers, which has significant implications for the development of effective treatments.

Research from other fields of social science, including social and cognitive psychology and anthropology, continues to have deep implications for an enormous range of human problems: how to better design and navigate transportation and communication networks; how to manage airline crews and cockpits; how to program robots for industry and defense; how to model computer systems and cybersecurity; how to reconfigure emergency medical care and diagnosis; how to build effective responses to economic uncertainty; and how to enhance industrial competitiveness and innovation. For example, perhaps the greatest long-term menace to the security of U.S. industry and defense is cyberwarfare, where the most insidious and hard-to-manage threat may stem not from hardware or software vulnerabilities but from “wetware,” the inclinations and biases of socially interacting human brains — as in just doing a friend a favor (like “click this link” or “can I borrow your flash drive?”). In recognition of that fact, Axelrod has suggested to the White House and Defense Department an “honor code” encouraging individuals not only to maintain cybersecurity themselves, but also not to lapse into doing favors for friends and to report such lapses in others.

Elected officials have the mandate to set priorities for research funding in the national interest. Ever since Abraham Lincoln established the National Academy of Sciences, however, a clear priority has been to allow scientific inquiry fairly free rein — to doubt, challenge, and ultimately change received wisdom if based on solid logic and evidence. What Rep. Cantor and like-minded colleagues seem to be saying is that this is fine, but only in the fields they consider expedient: in technology, medicine, and business. (Though possibly they mean to make an exception for the lucrative social science of polling, which can help to sell almost anything — even terrible ideas like defunding the rest of social science.)

It’s stunning to think that these influential politicians and the people who support them don’t want evidence-based reasoning and research to inform decisions concerning the nature and needs of our society — despite the fact that the vast majority of federal and state legislation deals with social issues, rather than technology or defense. To be sure, there is significant waste and wrongheadedness in the social sciences, as there is in any science (in fact, in any evolutionary process that progresses by trial and error), including, most recently, billions spent on possibly misleading use of mice in cancer research.

But those who would defund social science seriously underestimate the relationship between the wide-ranging freedom of scientific research and its pointed impact, and between theory and practice: Where disciplined imagination sweeps broadly to discover, say, that devoted actors do not respond to material incentives or disincentives (e.g., sanctions) in the same way that rational actors do, or that communities of people and body cells may share deep underlying organizational principles and responses to threats from outside aggressors, such knowledge can have a profound influence on our lives and wellbeing.

Even before they revolted in 1776, the American colonists may have already enjoyed the world’s highest standard of living. But they wanted something different: a free and progressive society, which money couldn’t buy. “Money has never made man happy, nor will it,” gibed Ben Franklin, but “if a man empties his purse into his head no one can take it away from him; an investment in knowledge always pays the best interest.” He founded America’s first learned society “to improve the common stock of knowledge,” which called for inquiry into many practical matters as well as “all philosophical Experiments that Light into the Nature of Things … and multiply the Conveniences or Pleasures of Life.” George Washington, Thomas Jefferson, John Adams, Alexander Hamilton, Thomas Paine, James Madison, and John Marshall all joined Franklin’s society and took part in the political, social, and economic revolution it helped spawn. Like the Founding Fathers, we want our descendants to be able to envision great futures for our country and a better world for all. For that, our children need the broad understanding of how the world works that the social sciences can provide — not just a technical education for well-paying jobs.

How to Predict the Future of Technology (Science Daily)

Jan. 25, 2013 — The bread and butter of investing for Silicon Valley tech companies is stale. Instead, a new method of predicting the evolution of technology could save tech giants millions in research and development or new product development — and help analysts and venture capitalists determine which companies are on the right track.

The high-tech industry has long used Moore’s Law as a method to predict the growth of PC memory. Moore’s Law states that the number of transistors on a chip doubles every 18 months (initially every year). A paper by Gareth James and Gerard Tellis, professors at the USC Marshall School of Business, and their co-authors Ashish Sood at Emory and Ji Zhu at the University of Michigan concludes that Moore’s Law does not apply for most industries, including the PC industry.
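Moore’s Law as a forecasting heuristic is just a fixed-period doubling rule. A minimal sketch of that arithmetic (the starting count of 1,000 transistors is an illustrative value, not from the paper):

```python
# Moore's Law as a simple doubling model:
# capacity(t) = c0 * 2**(t / doubling_period), with t in years.
def moore_capacity(c0, years, doubling_years=1.5):
    """Projected transistor count after `years`, starting from c0."""
    return c0 * 2 ** (years / doubling_years)

# Starting from 1,000 transistors, 3 years is two 18-month doublings:
print(moore_capacity(1000, 3))  # 4000.0
```

This is the smooth exponential curve that, per the paper, fits real technology trajectories poorly.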

High-tech companies traditionally use Moore’s Law and other similar heuristics to predict the path of evolution of competing technologies and to decide where to funnel millions into research and development or new product development. The paper’s researchers claim that these models are outdated and inaccurate.

The paper offers a new model, Step and Wait (SAW), which more accurately tracks the path of technological evolution in six markets that the authors tested. According to the researchers, Moore’s Law and other models such as Kryder’s Law and Gompertz Law predict a smooth increasing exponential curve for the improvement in performance of various technologies. In contrast, the authors found that the performance of most technologies proceeds in steps (or jumps) of big improvements interspersed with waits (or periods of no growth in performance).
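The contrast the authors draw — a smooth exponential versus a trajectory of jumps and plateaus — can be sketched numerically. All step sizes and wait times below are made-up illustration values, not the paper’s estimates:

```python
# Toy comparison of a smooth exponential trajectory with a
# step-and-wait (SAW) trajectory. Numbers are illustrative only.

def smooth_exponential(p0, rate, years):
    """Performance grows by a constant fraction `rate` every year."""
    return [p0 * (1 + rate) ** t for t in range(years + 1)]

def step_and_wait(p0, step_factor, wait_years, years):
    """Performance is flat during each 'wait', then jumps by `step_factor`."""
    path, level = [], p0
    for t in range(years + 1):
        if t > 0 and t % wait_years == 0:  # a step arrives
            level *= step_factor
        path.append(level)
    return path

print(smooth_exponential(1.0, 0.26, 6))  # gradual gains every year
print(step_and_wait(1.0, 2.0, 3, 6))     # [1, 1, 1, 2, 2, 2, 4]: flat, jump, flat, jump
```

Both paths can end up at a similar level over a long horizon; the difference that matters to an investor is *when* the improvement arrives.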

The sweet spot is in knowing which technology to back based on predicting when a new technology is going to have a jump in performance.

“We looked at the forest rather than the trees and see ‘steps’ and ‘waits’ across a variety of technologies,” Tellis said. While no one law applies to every market, Tellis and his co-authors looked at 26 technologies in six markets from lighting to automobile batteries, and found that the SAW model worked in all six, in contrast to several other competing models.

What Tellis and his colleagues did come up with are average performance improvements for the industry in terms of “steps” and wait times (see table to the right). The challenge for strategists is to invest in various technologies to beat these averages.

Tellis said that tablet and mobile phone manufacturers can leverage this data. “Any manager has first to break down his or her products into components, find the technology for each component, and then predict the future path of those technologies. For example, the mobile phone consists of three important technological components: memory, display, and CPU, the first two of which the authors analyzed. Similarly, for tablets, manufacturers could rely on the figures for display and memory technologies.”
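The decomposition Tellis describes — split a product into components, attach a forecast to each — can be sketched as below. The per-component step factors and wait times are invented placeholders, not the paper’s published averages:

```python
# Hedged sketch of component-level forecasting: each component gets a
# (hypothetical) step factor and wait time, and we project how many
# completed steps fall within a planning horizon.

phone_components = {
    # component: (current_performance, step_factor, wait_years) -- illustrative
    "memory":  (1.0, 2.0, 2),
    "display": (1.0, 1.5, 3),
}

def project(perf, step_factor, wait_years, horizon):
    """Compound performance over the steps completed within `horizon` years."""
    steps = horizon // wait_years
    return perf * step_factor ** steps

for name, (perf, step, wait) in phone_components.items():
    print(name, project(perf, step, wait, horizon=6))
# memory:  2.0**3 = 8.0   (three 2-year steps in 6 years)
# display: 1.5**2 = 2.25  (two 3-year steps in 6 years)
```

A manager would replace the placeholder figures with the paper’s estimated industry averages for each component technology, then look for technologies likely to beat them.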

An example of how the SAW model could have saved a company from decline is Sony’s investment in TVs. Sony kept investing in cathode ray tube technology (CRT) even after liquid crystal display technology (LCD) first crossed CRT in performance in 1996. Instead of considering LCD, Sony introduced the FD Trinitron/WEGA series, a flat version of the CRT. CRT out-performed LCD for a few years, but ultimately lost decisively to LCD in 2001. In contrast, by backing LCD, Samsung grew to be the world’s largest manufacturer of the better performing LCD. The former market leader, Sony, had to seek a joint venture with Samsung in 2006 to manufacture LCDs.

Having the SAW model at the ready might have changed Sony’s course. “Prediction of the next step size and wait time using SAW could have helped Sony’s managers make a timely investment in LCD technology,” according to the study.

Journal Reference:

  1. Sood, Ashish; James, Gareth; Tellis, Gerard J.; and Zhu, Ji. “Predicting the Path of Technological Innovation: SAW Versus Moore, Bass, Gompertz, and Kryder.” Marketing Science, July 22, 2012. [link]