Arquivo da tag: Incerteza

Mainstream green is still too white (Color Lines)

By Brentin Mock; Cross-posted from ColorLines

Last year was the hottest on record for the continental United States, and it wasn’t an outlier. The last 12 years have been the warmest years since 1880, the year the National Oceanic and Atmospheric Administration began tracking this information. And climate scientists predict that the devastating blizzards, droughts, hurricanes, and wildfires we’ve been experiencing lately will worsen due to climate change.

In many ways these punishing weather events feel like Mother Nature seeking revenge for our failure to reduce greenhouse gas emissions, the primary cause of global warming. Despite abundant evidence, the U.S. government has yet to pass a law that would force a reduction in these emissions.

During his first term, President Obama did make climate change a priority, both in his campaign and in office. The American Clean Energy and Security Act that Congress produced passed through the House in June 2009 by a narrow margin. Yet the bill never reached a vote in the Senate, and it died quietly.

Environmentalists have been flummoxed ever since. One prominent cause-of-death theory says that large mainstream (and predominantly white) environmental groups failed to mobilize grassroots support and ignored those who bear a disproportionate burden of climate change, namely poor people of color.

With Obama in for a second term and reaffirmed in his environmental commitments, climate legislation has another chance at life. Now, observers are wondering if mainstream environmentalists learned the right lessons from the first climate bill failure and how they’ll work with people of color this time around.

Anatomy of a conflict

To hear some environmental leaders tell it, their defeat wasn’t due to a lack of investment in black and brown people living in poor and working class communities, but to an over-investment in Obama. For example, Dan Lashof, climate and clean air director for Natural Resources Defense Council (NRDC), has blamed the president for having the audacity to push healthcare reform and he’s pointed the finger at green groups for being too patient with Obama.

Asked what environmental advocates who led the first climate bill effort could have done differently in 2009, Bill McKibben, founder of the online grassroots organizing campaign 350.org, says their game plan was too insular. “There was no chance last time because all the action was in the closed rooms, not in the streets,” he tells Colorlines.com.

Yet that “action” took place behind closed doors for a reason: Major mainstream green groups including the Environmental Defense Fund and The Nature Conservancy teamed up with oil companies and some of the biggest polluters and emitters in the nation to form the United States Climate Action Partnership (USCAP). This ad hoc alliance was the driving force behind the failed 2009 bill and there were no environmental justice, civil rights, or people-of-color groups at the USCAP table.

Obama can’t be blamed for the blind spots of major groups. As recent Washington Post and Politico articles have pointed out, their leadership and membership simply don’t reflect the race or socioeconomic class of people most vulnerable to climate change’s wrath.

Sarah Hansen, former executive director of the Environmental Grantmakers Association, argued recently that the mainstream has been stingy with funding and resources and inept at engaging environmental justice communities. In a National Committee for Responsive Philanthropy (NCRP) study, “Cultivating the Grassroots: A Winning Approach for Environmental and Climate Funders,” Hansen reported that philanthropies awarded most of their environmental dollars to large, predominantly white groups but received little return in terms of law and policy. Meanwhile, wrote Hansen, too few dollars have been invested in community- and environmental justice-based organizations.

According to the NCRP report, environmental organizations with $5 million-plus budgets made up only 2 percent of green groups in general but in 2009 received half of all grants in the field. The NCRP also found that 15 percent of all green dollars benefited marginalized populations between 2007 and 2009. Only 11 percent went to social justice causes.

In January, Harvard professor Theda Skocpol released a study of the first climate bill campaign’s failure and faulted green groups involved for choosing direct congressional lobbying over grassroots organizing. Some of the major organizations did spend money on field organizers, wrote Skocpol, but only to push public messaging like billboards and advertisements.

“The messaging campaigns would not make it their business to actually shape legislation — or even talk about details with ordinary citizens or grassroots groups,” Skocpol wrote in the report. The public “is seen as a kind of background chorus that, hopefully, will sing on key.”

Take one for the team?

Hansen and Skocpol share the view that the environmental movement’s biggest flaw was believing billboards and ads could replace educating and organizing actual people. By comparison, health reform advocates combined lobbying with grassroots organizing while the climate-change bill made the rounds, and they got a law passed.

“If you want to gain the trust of the emerging non-white majority, it’s not just a messaging thing,” explains Ryan Young, legal counsel for the California-based Greenlining Institute, a policy research nonprofit focused on economic, environmental, and racial justice. “It’s a values thing. You must understand the values of these communities and craft policy around that.”

Why does this matter?

Consider how the website of the National Wildlife Federation (NWF) recently featured an article on city bird sanctuaries from the group’s print magazine titled “Urban Renewal.”

Having people of color on staff might have helped NWF understand that for some, “urban renewal” signifies a historical legacy of black and Latino neighborhoods being effectively erased by development projects such as sports stadiums. Cultural snafus like this have led to white environmental groups being clowned in influential outlets including The Daily Show.

In an interview about the unintended message of “Urban Renewal,” Jim Lyon, NWF’s vice president for conservation policy, told Colorlines.com that the group doesn’t “always get everything right” and that “he’d take it back to his staff.” (Ironically, one of the harshest critiques of urban renewal came from Jane Jacobs, a white conservationist.) On the topic of staff diversity, Lyon said the organization isn’t where they want it to be, but that they’ve made “good progress.” He would not release staff demographics, but said NWF achieves diversity through partnerships with other groups and programs like Eco-Schools USA, which he says “engages more than 1 million children of color” daily.

Beverly Wright, who heads the New Orleans-based Deep South Center for Environmental Justice, says racial oversights of traditionally white groups are the main reason black and Latino environmentalists have formed their own organizations. The culturally divided camps sometimes use the same words, but they’re often speaking different languages.

Take “cap-and-trade,” a scheme that would commodify greenhouse gas emissions for market-trading as a way to reduce those emissions. The first climate bill centered on cap-and-trade because most major environmental groups supported it. But cap-and-trade was anathema to environmental justice groups because it did nothing to curb local co-pollutants such as smog and soot, direct threats to communities of color. That’s not to mention that cap-and-trade was the brainchild of C. Boyden Gray, a conservative member of the Federalist Society and leader of FreedomWorks, today a major Tea Party funder.

Wright says major green groups tried to coax environmental justice organizations into supporting cap-and-trade by claiming it was for the “greater good.”

“But that meant white people get all the greater goods and we get the rest,” says Wright. “Until they want to have real discussions around racism, they won’t have our support. That’s what happened last time with the climate bill. It did not move, because they did not have diversity in their voices.”

“Diversity” doesn’t just mean hiring more people of color. As the 30-year-old Center for Health, Environment and Justice stated in March, the diversity conversation “really needs to be about resources and assistance to the front line communities rather than head counting.”

What’s next?

So in the new round of climate bill talks, will large environmental groups meaningfully engage community-based environmental justice groups?

The prognosis is mixed. Look at MomentUs, a mammoth collaborative started in January to ramp up support for new climate legislation. While MomentUs claims to be a game-changer, the strategy behind it seems very similar to USCAP’s — the one that failed to deliver a climate-change law the first time around. On its website, MomentUs describes its board of directors as “cultural, environmental, business, and marketing leaders who offer the diversity of viewpoints and keen insight vital to advancing MomentUs’s mission.” At press time, all of the directors are white. So is the staff, except for one office administrator.

Looking at MomentUs partners, it appears that the same traditionally white environmental organizations who teamed up for USCAP are now working with corporations including ALEC funder Duke Energy, predatory subprime mortgage king Wells Fargo, perennial labor union target Sodexho, and Disney. At press time there are no environmental justice or civil rights groups involved.

At the other end of the spectrum, The Sierra Club — one of the nation’s largest and whitest green groups — has had an expansive role in environmental justice and advocacy, particularly in the Gulf Coast. In January it joined the NAACP and labor unions in launching the Democracy Initiative, which will tackle voting rights, environmental justice, and other civil rights concerns.

To be sure, it’s far too early to draw conclusions about MomentUs or the Democracy Initiative, but the latter appears to be a step in the right direction in terms of highlighting the intersection between poor environmental outcomes and racism.

McKibben, the 350.org founder, has helped cultivate a multicultural fight against the Keystone XL pipeline project, but he admits that the overall environmental movement has “tons of work to do” on racial equity and inclusion.

“The sooner [mainstream environmentalists] absorb the message and are led by members of the environmental justice movement, the better,” he says.

In that case, the question is a matter of timing and power, of who decides when and which environmental justice activists get to lead.

Stay tuned.

Brentin Mock is a New Orleans-based journalist who serves as ColorLines’s reporting fellow on voting rights.     

Indigenous rights are the best defence against Canada’s resource rush (Guardian)

First Nations people – and the decision of Canadians to stand alongside them – will determine the fate of the planet

By Martin Lukacs

Friday 26 April 2013 16.12 BST – guardian.co.uk


First Nations protesters are silhouetted against a flag as they take part in an Idle No More demonstration in Toronto, January 16, 2013. Photograph: Mark Blinch/Reuters

In a boardroom in a soaring high-rise on Wall Street, Indigenous activist Arthur Manuel is sitting across from one of the most powerful financial agents in North America.

It’s 2004, and Manuel is on a typical mission. Part of a line of distinguished Indigenous leaders from western Canada, Manuel is what you might call an economic hit-man for the right cause. A brilliant thinker trained in law, he has devoted himself to fighting Canada’s policies toward Indigenous peoples by assailing the government where it hurts most – in its pocketbook.

Which is why he secured a meeting in New York with a top-ranking official at Standard & Poor’s, the influential credit agency that issues Canada’s top-notch AAA rating. That’s what assures investors that the country has its debts covered, that it is a safe and profitable place to do business.

This coveted credit rating is Manuel’s target. His line of attack is to try to lift the veil on Canada’s dirty business secret: that contrary to the myth that Indigenous peoples leech off the state, resources taken from their lands have in fact been subsidizing the Canadian economy. In its haste to get at that wealth, the government has been flouting its own laws, ignoring Supreme Court decisions calling for the respect of Indigenous and treaty rights over large territories. Canada has become very rich, and Indigenous peoples very poor.

In other words, Canada owes big. Some have even begun calculating how much. According to economist Fred Lazar, First Nations in northern Ontario alone are owed $32 billion for the last century of unfulfilled treaty promises to share revenue from resources. Manuel’s argument is that this unpaid debt – a massive liability of trillions of dollars carried by the Canadian state, which it has deliberately failed to report – should be recognized as a risk to the country’s credit rating.

How did the official who could pull the rug out from under Canada’s economy respond? Unlike Canadian politicians and media who regularly dismiss the significance of Indigenous rights, he took Manuel seriously. It was evident he knew all the jurisprudence. He followed the political developments. He didn’t contradict any of Manuel’s facts.

He no doubt understood what Manuel was ultimately driving at: under threat of a dented credit rating, Canada might finally feel pressure to deal fairly with Indigenous peoples. But here was the hitch: Standard & Poor’s wouldn’t acknowledge the debt, because the official didn’t think Manuel and First Nations could ever collect it. Why? As author Naomi Klein, who accompanied Manuel at the meeting, remembers, his answer amounted to a realpolitik shoulder shrug.

“Who will be able to enforce the debt? You and what army?”

This was his brutal but illuminating admission: Indigenous peoples may have the law on their side, but they don’t have the power. Indeed, while Indigenous peoples’ protests have achieved important environmental victories – mining operations stopped here, forest conservation areas set up there – these have remained sporadic and isolated. Canada’s country-wide policies of ignoring Indigenous land rights have rarely been challenged, and never fundamentally.

Until now. If it’s only a social movement that can change the power equation upholding the official’s stance, then the Idle No More uprising may be it. Triggered initially in late 2012 by opposition to the Conservative government’s roll-back of decades of environmental protection, this Indigenous movement quickly tapped into long-simmering indignation. Through the chilly winter months, Canada witnessed unprecedented mobilizations, with blockades and round-dances springing up in every corner of the country, demanding a basic resetting of the relationship between Canada and Indigenous peoples.

Money is not the main form this justice will take. First Nations desperately need more funding to close the gap that exists between them and Canadians. But if Indigenous peoples hold a key to the Canadian economy, the point is to use this leverage to steer the country in a different direction. “Draw that power back to the people on the land, the grassroots people fighting pipelines and industrial projects,” Manuel says. “That will determine what governments can or cannot do on the land.”

The stakes could not be greater. The movement confronts a Conservative Canadian government aggressively pursuing $600 billion of resource development on or near Indigenous lands. That means the unbridled exploitation of huge hydrocarbon reserves, including the three-fold expansion of one of the world’s most carbon-intensive projects, the Alberta tar sands. Living closest to these lands, Indigenous peoples are the best and last defence against this fossil fuel scramble. In its place, they may yet host the energy alternatives – of wind, water, or solar.

No surprise, then, about the government’s basic approach toward First Nations: “removing obstacles to major economic development.” Hence the movement’s next stage – a call for defiance branded Sovereignty Summer – is to put more obstacles up. The assertion of constitutionally-protected Indigenous and treaty rights – backed up by direct action, legal challenges and massive support from Canadians – is exactly what can create chronic uncertainty for this corporate and government agenda. For those betting on more than a half-trillion in resource investments, that’s a very big warning sign.

Industry has taken notice. A recent report on mining dropped Canada out of the top spot for miners: “while Canadian jurisdictions remain competitive globally, uncertainties with Indigenous consultation and disputed land claims are growing concerns for some.” And if the uncertainty is eventually tagged with a monetary sum, then Canada will, as Manuel warned Standard & Poor’s, face a large and serious credit risk. Trying to ward off such a threat, the government is hoping to lock mainstream Indigenous leaders into endless negotiations, or sway them with promises of a bigger piece of the resource action.

But this bleak outlook intent on a final ransacking of the earth doesn’t stand up to the vision the movement offers Canadians. Implementing Indigenous rights on the ground, starting with the United Nations Declaration on the Rights of Indigenous Peoples, could tilt the balance of stewardship over a vast geography: giving Indigenous peoples much more control, and corporations much less. Which means that finally honouring Indigenous rights is not simply about paying off Canada’s enormous legal debt to First Nations: it is also our best chance to save entire territories from endless extraction and destruction. In no small way, the actions of Indigenous peoples – and the decision of Canadians to stand alongside them – will determine the fate of the planet.

This new understanding is dawning on more Canadians. Thousands are signing onto educational campaigns to become allies to First Nations. Direct action trainings for young people are in full swing. As Chief Allan Adam from the First Nation in the heart of the Alberta oil patch has suggested, it might be “a long, hot summer.”

Sustained action that puts real clout behind Indigenous claims is what will force a reckoning with the true nature of Canada’s economy – and the possibility of a transformed country. That is the promise of a growing mass protest movement, an army of untold power and numbers.

What BP Doesn’t Want You to Know About the 2010 Gulf Spill (The Daily Beast)

The 2010 Gulf of Mexico oil spill was even worse than BP wanted us to know.

by  | April 22, 2013 4:45 AM EDT

“It’s as safe as Dawn dishwashing liquid.” That’s what Jamie Griffin says the BP man told her about the smelly, rainbow-streaked gunk coating the floor of the “floating hotel” where Griffin was feeding hundreds of cleanup workers during the BP oil disaster in the Gulf of Mexico. Apparently, the workers were tracking the gunk inside on their boots. Griffin, as chief cook and maid, was trying to clean it. But even boiling water didn’t work.

BP Oil Spill
An agonizing 87 days passed before the BP oil spill was finally sealed off. According to US government estimates, 210 million gallons of Louisiana sweet crude had escaped into the Gulf, making this disaster the largest unintentional oil leak in world history. (Benjamin Lowy/Getty)

“The BP representative said, ‘Jamie, just mop it like you’d mop any other dirty floor,’” Griffin recalls in her Louisiana drawl.

It was the opening weeks of what everyone, echoing President Barack Obama, was calling “the worst environmental disaster in American history.” At 9:45 p.m. local time on April 20, 2010, a fiery explosion on the Deepwater Horizon oil rig had killed 11 workers and injured 17. One mile underwater, the Macondo well had blown apart, unleashing a gusher of oil into the gulf. At risk were fishing areas that supplied one third of the seafood consumed in the U.S., beaches from Texas to Florida that drew billions of dollars’ worth of tourism to local economies, and Obama’s chances of reelection. Republicans were blaming him for mishandling the disaster, his poll numbers were falling, even his 11-year-old daughter was demanding, “Daddy, did you plug the hole yet?”

Griffin did as she was told: “I tried Pine-Sol, bleach, I even tried Dawn on those floors.” As she scrubbed, the mix of cleanser and gunk occasionally splashed onto her arms and face.

Within days, the 32-year-old single mother was coughing up blood and suffering constant headaches. She lost her voice. “My throat felt like I’d swallowed razor blades,” she says.

Then things got much worse.

Like hundreds, possibly thousands, of workers on the cleanup, Griffin soon fell ill with a cluster of excruciating, bizarre, grotesque ailments. By July, unstoppable muscle spasms were twisting her hands into immovable claws. In August, she began losing her short-term memory. After cooking professionally for 10 years, she couldn’t remember the recipe for vegetable soup; one morning, she got in the car to go to work, only to discover she hadn’t put on pants. The right side, but only the right side, of her body “started acting crazy. It felt like the nerves were coming out of my skin. It was so painful. My right leg swelled—my ankle would get as wide as my calf—and my skin got incredibly itchy.”

“These are the same symptoms experienced by soldiers who returned from the Persian Gulf War with Gulf War syndrome,” says Dr. Michael Robichaux, a Louisiana physician and former state senator, who treated Griffin and 113 other patients with similar complaints. As a general practitioner, Robichaux says he had “never seen this grouping of symptoms together: skin problems, neurological impairments, plus pulmonary problems.” Only months later, after Kaye H. Kilburn, a former professor of medicine at the University of Southern California and one of the nation’s leading environmental health experts, came to Louisiana and tested 14 of Robichaux’s patients did the two physicians make the connection with Gulf War syndrome, the malady that afflicted an estimated 250,000 veterans of that war with a mysterious combination of fatigue, skin inflammation, and cognitive problems.

Meanwhile, the well kept hemorrhaging oil. The world watched with bated breath as BP failed in one attempt after another to stop the leak. An agonizing 87 days passed before the well was finally plugged on July 15. By then, 210 million gallons of Louisiana sweet crude had escaped into the Gulf of Mexico, according to government estimates, making the BP disaster the largest accidental oil leak in world history.


Yet three years later, the BP disaster has been largely forgotten, both overseas and in the U.S. Popular anger has cooled. The media have moved on. Today, only the business press offers serious coverage of what the Financial Times calls “the trial of the century”—the trial now under way in New Orleans, where BP faces tens of billions of dollars in potential penalties for the disaster. As for Obama, the same president who, early in the BP crisis, blasted the “scandalously close relationship” between oil companies and government regulators ran for reelection two years later boasting about how much new oil and gas development his administration had approved.

Such collective amnesia may seem surprising, but there may be a good explanation for it: BP mounted a cover-up that concealed the full extent of its crimes from public view. This cover-up prevented the media and therefore the public from knowing—and above all, seeing—just how much oil was gushing into the gulf. The disaster appeared much less extensive and destructive than it actually was. BP declined to comment for this article.

That BP lied about the amount of oil it discharged into the gulf is already established. Lying to Congress about that was one of 14 felonies to which BP pleaded guilty last year in a legal settlement with the Justice Department that included a $4.5 billion fine, the largest fine ever levied against a corporation in the U.S.

What has not been revealed until now is how BP hid that massive amount of oil from TV cameras and the price that this “disappearing act” imposed on cleanup workers, coastal residents, and the ecosystem of the gulf. That story can now be told because an anonymous whistleblower has provided evidence that BP was warned in advance about the safety risks of attempting to cover up its leaking oil. Nevertheless, BP proceeded. Furthermore, BP appears to have withheld these safety warnings, as well as protective measures, both from the thousands of workers hired for the cleanup and from the millions of Gulf Coast residents who stood to be affected.

The financial implications are enormous. The trial now under way in New Orleans is wrestling with whether BP was guilty of “negligence” or “gross negligence” for the Deepwater Horizon disaster. If found guilty of “negligence,” BP would be fined, under the Clean Water Act, $1,100 for each barrel of oil that leaked. But if found guilty of “gross negligence”—which a cover-up would seem to imply—BP would be fined $4,300 per barrel, almost four times as much, for a total of $17.5 billion. That large a fine, combined with an additional $34 billion that the states of Louisiana, Alabama, Mississippi, and Florida are seeking, could have a powerful effect on BP’s economic health.
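The per-barrel fine arithmetic above can be checked with a quick back-of-envelope sketch. Note that the ~4.07 million figure below is inferred from the article’s own $17.5 billion total, not an official court-determined barrel count:

```python
# Back-of-envelope check of the Clean Water Act fine math,
# using only figures quoted in the article.

GALLONS_PER_BARREL = 42  # standard U.S. oil barrel

spilled_gallons = 210_000_000  # government estimate cited above
spilled_barrels = spilled_gallons / GALLONS_PER_BARREL  # 5,000,000

FINE_NEGLIGENCE = 1_100  # dollars per barrel, "negligence"
FINE_GROSS = 4_300       # dollars per barrel, "gross negligence"

# The article's $17.5 billion gross-negligence total implies the fine
# would apply to fewer barrels than the full 5 million spilled
# (oil recovered during the cleanup is not subject to the fine).
implied_fined_barrels = 17.5e9 / FINE_GROSS

print(f"{spilled_barrels:,.0f} barrels spilled")
print(f"{implied_fined_barrels:,.0f} barrels implied by the $17.5B figure")
```

At the negligence rate, those same implied barrels would yield a fine roughly a quarter the size, which is why the negligence-versus-gross-negligence finding matters so much to BP.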

Yet the most astonishing thing about BP’s cover-up? It was carried out in plain sight, right in front of the world’s uncomprehending news media (including, I regret to say, this reporter).

More than half of the Corexit was dispersed by C-130 airplanes, often hitting workers. (Benjamin Lowy/Getty)

The chief instrument of BP’s cover-up was the same substance that apparently sickened Jamie Griffin and countless other cleanup workers and local residents. Its brand name is Corexit, but most news reports at the time referred to it simply as a “dispersant.” Its function was to attach itself to leaked oil, break it into droplets, and disperse them into the vast reaches of the gulf, thereby keeping the oil from reaching Gulf Coast shorelines. And the Corexit did largely achieve this goal.

But the 1.84 million gallons of Corexit that BP applied during the cleanup also served a public-relations purpose: they made the oil spill all but disappear, at least from TV screens. By late July 2010, the Associated Press and The New York Times were questioning whether the spill had been such a big deal after all. Time went so far as to assert that right-wing talk-radio host Rush Limbaugh “has a point” when he accused journalists and environmentalists of exaggerating the crisis.

But BP had a problem: it had lied about how safe Corexit is, and proof of its dishonesty would eventually fall into the hands of the Government Accountability Project, the premier whistleblower-protection group in the U.S. The proof? A technical manual BP had received from NALCO, the firm that supplied the Corexit that BP used in the gulf.

An electronic copy of that manual is included in a new report GAP has issued, “Deadly Dispersants in the Gulf.” On the basis of interviews with dozens of cleanup workers, scientists, and Gulf Coast residents, GAP concludes that the health impacts endured by Griffin were visited upon many other locals as well. What’s more, the combination of Corexit and crude oil also caused terrible damage to gulf wildlife and ecosystems, including an unprecedented number of seafood mutations; declines of up to 80 percent in seafood catch; and massive die-offs of the microscopic life-forms at the base of the marine food chain. GAP warns that BP and the U.S. government nevertheless appear poised to repeat the exercise after the next major oil spill: “As a result of Corexit’s perceived success, Corexit … has become the dispersant of choice in the U.S. to ‘clean up’ oil spills.”

Numerous fishermen on BP’s payroll helped with the cleanup by dispersing Corexit. (Benjamin Lowy/Getty)

BP’s cover-up was not planned in advance but devised in the heat of the moment as the oil giant scrambled to limit the PR and other damages of the disaster. Indeed, one of the chief scandals of the disaster is just how unprepared both BP and federal and state authorities were for an oil leak of this magnitude. U.S. law required that a response plan be in place before drilling began, but the plan was embarrassingly flawed.

“We weren’t managing for actual risk; we were checking a box,” says Mark Davis, director of the Institute on Water Resources Law and Policy at Tulane University. “That’s how we ended up with a response plan that included provisions for dealing with the impacts to walruses: because [BP] copied word for word the response plans that had been developed after the Exxon Valdez oil spill [in Alaska, in 1989] instead of a plan tailored to the conditions in the gulf.”

As days turned into weeks and it became obvious that no one knew how to plug the gushing well, BP began insisting that Corexit be used to disperse the leaking oil. This triggered alarms from scientists and from a leading environmental NGO in Louisiana, the Louisiana Environmental Action Network (LEAN).

The group’s scientific adviser, Wilma Subra, a chemist whose work on environmental pollution had won her a “genius grant” from the MacArthur Foundation, told state and federal authorities that she was especially concerned about how dangerous the mixture of crude and Corexit was: “The short-term health symptoms include acute respiratory problems, skin rashes, cardiovascular impacts, gastrointestinal impacts, and short-term loss of memory,” she told GAP investigators. “Long-term impacts include cancer, decreased lung function, liver damage, and kidney damage.”

(Nineteen months after the Deepwater Horizon explosion, a scientific study published in the peer-reviewed journal Environmental Pollution found that crude oil becomes 52 times more toxic when combined with Corexit.)

BP even rebuffed a direct request from the administrator of the Environmental Protection Agency, Lisa Jackson, who wrote BP a letter on May 19, asking the company to deploy a less toxic dispersant in the cleanup. Jackson could only ask BP to do this; she could not legally require it. Why? Because use of Corexit had been authorized years before under the federal Oil Pollution Act.

In a recent interview, Jackson explains that she and other officials “had to determine, with less-than-perfect scientific testing and data, whether use of dispersants would, despite potential side effects, improve the overall situation in the gulf and coastal ecosystems. The tradeoff, as I have said many times, was potential damage in the deep water versus the potential for larger amounts of undispersed oil in the ecologically rich coastal shallows and estuaries.” She adds that the presidential commission that later studied the BP oil disaster did not fault the decision to use dispersants.

Knowing that EPA lacked the authority to stop it, BP wrote back to Jackson on May 20, declaring that Corexit was safe. What’s more, BP wrote, there was a ready supply of Corexit, which was not the case with alternative dispersants. (A NALCO plant was located just 30 miles west of New Orleans.)

But Corexit was decidedly not safe without taking proper precautions, as the manual BP got from NALCO spelled out in black and white. The “Vessel Captains Hazard Communication” resource manual, which GAP shared with me, looks innocuous enough. A three-ring binder with a black plastic cover, the manual contained 61 sheets, each wrapped in plastic, that detailed the scientific properties of the two types of Corexit that BP was buying, as well as their health hazards and recommended measures against those hazards.

BP applied two types of Corexit in the gulf. The first, Corexit 9527, was considerably more toxic. According to the NALCO manual, Corexit 9527 is an “eye and skin irritant. Repeated or excessive exposure … may cause injury to red blood cells (hemolysis), kidney or the liver.” The manual adds: “Excessive exposure may cause central nervous system effects, nausea, vomiting, anesthetic or narcotic effects.” It advises, “Do not get in eyes, on skin, on clothing,” and “Wear suitable protective clothing.”

When available supplies of Corexit 9527 were exhausted early in the cleanup, BP switched to the second type of dispersant, Corexit 9500. In its recommendations for dealing with Corexit 9500, the NALCO manual advised, “Do not get in eyes, on skin, on clothing,” “Avoid breathing vapor,” and “Wear suitable protective clothing.”

It’s standard procedure—and required by U.S. law—for companies to distribute this kind of information to any work site where hazardous materials are present so workers can know about the dangers they face and how to protect themselves. But interviews with numerous cleanup workers suggest that this legally required precaution was rarely if ever followed during the BP cleanup. Instead, it appears that BP told NALCO to stop including the manuals with the Corexit that NALCO was delivering to cleanup work sites.

“It’s my understanding that some manuals were sent out with the shipments of Corexit in the beginning [of the cleanup],” the anonymous source tells me. “Then, BP told NALCO to stop sending them. So NALCO was left with a roomful of unused binders.”

Roman Blahoski, NALCO’s director of global communications, says: “NALCO responded to requests for its pre-approved dispersants from those charged with protecting the gulf and mitigating the environmental, health, and economic impact of this event. NALCO was never involved in decisions relating to the use, volume, and application of its dispersant.”

BP Oil Spill
The gulf’s vital tourism industry lost billions as oil poured into the water. (Benjamin Lowy/Getty)

Misrepresenting the safety of Corexit went hand in hand with BP’s previously noted lie about how much oil was leaking from the Macondo well. As reported by John Rudolf in The Huffington Post, internal BP emails show that BP privately estimated that “the runaway well could be leaking from 62,000 barrels a day to 146,000 barrels a day.” Meanwhile, BP officials were telling the government and the media that only 5,000 barrels a day were leaking.

In short, applying Corexit enabled BP to mask the fact that a much larger amount of oil was actually leaking into the gulf. “Like any good magician, the oil industry has learned that if you can’t see something that was there, it must have ‘disappeared,’” Scott Porter, a scientist and deep-sea diver who consults for oil companies and oystermen, says in the GAP report. “Oil companies have also learned that, in the public mind, ‘out of sight equals out of mind.’ Therefore, they have chosen crude oil dispersants as the primary tool for handling large marine oil spills.”

BP also had a more direct financial interest in using Corexit, argues Clint Guidry, president of the Louisiana Shrimp Association, whose members include not only shrimpers but fishermen of all sorts. As it happens, local fishermen constituted a significant portion of BP’s cleanup force (which numbered as many as 47,000 workers at the height of the cleanup). Because the spill caused the closure of their fishing grounds, BP and state and federal authorities established the Vessels of Opportunity (VoO) program, in which BP paid fishermen to take their boats out and skim, burn, and otherwise get rid of leaked oil. Applying dispersants, Guidry points out, reduced the total volume of oil that could be traced back to BP.

“The next phase of this trial [against BP] is going to turn on how much oil was leaked,” Guidry tells me. [If found guilty, BP will be fined a certain amount for each barrel of oil judged to have leaked.] “So hiding the oil with Corexit worked not only to hide the size of the spill but also to lower the amount of oil that BP may get charged for releasing.”

BP Oil Spill
“You could smell oil and stuff in the air, but on the news they were saying it’s fine.” (Benjamin Lowy/Getty)

Not only did BP fail to inform workers of the potential hazards of Corexit and to provide them with safety training and protective gear, according to interviews with dozens of cleanup workers, the company also allegedly threatened to fire workers who complained about the lack of respirators and protective clothing.

“I worked with probably a couple hundred different fishermen on the [cleanup],” Acy Cooper, Guidry’s second in command, tells me in Venice, the coastal town from which many VoO vessels departed. “Not one of them got any safety information or training concerning the toxic materials they encountered.” Cooper says that BP did provide workers with body suits and gloves designed for handling hazardous materials. “But when I’d talk with [the BP representative] about getting my guys respirators and air monitors, I’d never get any response.”

Roughly 58 percent of the 1.84 million gallons of Corexit used in the cleanup was sprayed onto the gulf from C-130 airplanes. The spray sometimes ended up hitting cleanup workers in the face.

“Our boat was sprayed four times,” says Jorey Danos, a 32-year-old father of three who suffered racking coughing fits, severe fatigue, and memory loss after working on the BP cleanup. “I could see the stuff coming out of the plane—like a shower of mist, a smoky color. I could see [it] coming at me, but there was nothing I could do.”

“The next day,” Danos continues, “when the BP rep came around on his speed boat, I asked, ‘Hey, what’s the deal with that stuff that was coming out of those planes yesterday?’ He told me, ‘Don’t worry about it.’ I said, ‘Man, that s–t was burning my face—it ain’t right.’ He said, ‘Don’t worry about it.’ I said, ‘Well, could we get some respirators or something, because that s–t is bad.’ He said, ‘No, that wouldn’t look good to the media. You got two choices: you can either be relieved of your duties or you can deal with it.’”

Perhaps the single most hazardous chemical compound found in Corexit 9527 is 2-Butoxyethanol, a substance that had been linked to cancers and other health impacts among cleanup workers on the 1989 Exxon Valdez oil spill in Alaska. According to BP’s own data, 20 percent of offshore workers in the gulf had levels of 2-Butoxyethanol two times higher than the level certified as safe by the Occupational Safety and Health Administration.

Cleanup workers were not the only victims; coastal residents also suffered. “My 2-year-old grandson and I would play out in the yard,” says Shirley Tillman of the Mississippi coastal town Pass Christian. “You could smell oil and stuff in the air, but on the news they were saying it’s fine, don’t worry. Well, by October, he was one sick little fellow. All of a sudden, this very active little 2-year-old was constantly sick. He was having headaches, upper respiratory infections, earaches. The night of his birthday party, his parents had to rush him to the emergency room. He went to nine different doctors, but they treated just the symptoms; they’re not toxicologists.”

BP Oil Spill
Doctors misdiagnosed Danos, a BP clean-up worker who was exposed to Corexit, with schizophrenia and bipolar disorder. (Benjamin Lowy/Getty)

“It’s not the crime, it’s the cover-up.” Ever since the Watergate scandal of the 1970s, that’s been the mantra. Cover-ups don’t work, goes the argument. They only dig a deeper hole, because the truth eventually comes out.

But does it?

GAP investigators were hopeful that obtaining the NALCO manual might persuade BP to meet with them, and it did. On July 10, 2012, BP hosted a private meeting at its Houston offices. Presiding over the meeting, which is described here publicly for the first time, was BP’s public ombudsman, Stanley Sporkin, joining by telephone from Washington. Ironically, Sporkin had made his professional reputation during the Watergate scandal. As a lawyer with the Securities and Exchange Commission, Sporkin investigated illegal corporate payments to the slush fund that President Nixon used to buy the silence of the Watergate burglars.

Also attending the meeting were two senior BP attorneys; BP Vice President Luke Keller; other BP officials; Thomas Devine, GAP’s senior attorney on the BP case; Shanna Devine, GAP’s investigator on the case; Dr. Michael Robichaux; Dr. Wilma Subra; and Marylee Orr, the executive director of LEAN. The following account is based on my interviews with Thomas Devine, Robichaux, Subra, and Orr. BP declined to comment.

BP officials had previously confirmed the authenticity of the NALCO manual, says Thomas Devine, but now they refused to discuss it, even though this had been one of the stated purposes for the meeting. Nor would BP address the allegation, made by the whistleblower who had given the manual to GAP, that BP had ordered the manual withheld from cleanup work sites, perhaps to maintain the fiction that Corexit was safe.

“They opened the meeting with this upbeat presentation about how seriously they took their responsibilities for the spill and all the wonderful things they were doing to make things right,” says Devine. “When it was my turn to speak, I said that the manual our whistleblower had provided contradicted what they just said. I asked whether they had ordered the manual withdrawn from work sites. Their attorneys said that was a matter they would not discuss because of the pending litigation on the spill.” [Disclosure: Thomas Devine is a friend of this reporter.]

The visitors’ top priority was to get BP to agree not to use Corexit in the future. Keller said that Corexit was still authorized for use by the U.S. government and BP would indeed feel free to use it against any future oil spills.

BP Oil Spill
(Benjamin Lowy/Getty)

A second priority was to get BP to provide medical treatment for Jamie Griffin and the many other apparent victims of Corexit-and-crude poisoning. This request too was refused by BP.

Robichaux doubts his patients will receive proper compensation from the $7.8 billion settlement BP reached in 2012 with the Plaintiffs’ Steering Committee, 19 court-appointed attorneys who represent the hundreds of individuals and entities that have sued BP for damages related to the gulf disaster. “Nine of the most common symptoms of my patients do not appear on the list of illnesses that settlement says can be compensated, including memory loss, fatigue, and joint and muscular pain,” says Robichaux. “So how are the attorneys going to file suits on behalf of those victims?”

At one level, BP’s cover-up of the gulf oil disaster speaks to the enormous power that giant corporations exercise in modern society, and how unable, or unwilling, governments are to limit that power. To be sure, BP has not entirely escaped censure for its actions; depending on the outcome of the trial now under way in New Orleans, the company could end up paying tens of billions of dollars in fines and damages over and above the $4.5 billion imposed by the Justice Department in the settlement last year. But BP’s reputation appears to have survived: its market value as this article went to press was a tidy $132 billion, and few, if any, BP officials appear likely to face any legal repercussions. “If I would have killed 11 people, I’d be hanging from a noose,” says Jorey Danos. “Not BP. It’s the golden rule: the man with the gold makes the rules.”

As unchastened as anyone at BP is Bob Dudley, the American who was catapulted into the CEO job a few weeks into the gulf disaster to replace Tony Hayward, whose propensity for imprudent comments—“I want my life back,” the multimillionaire had pouted while thousands of gulf workers and residents were suffering—had made him a globally derided figure. Dudley told the annual BP shareholders meeting in London last week that Corexit “is effectively … dishwashing soap,” no more toxic than that, as all scientific studies supposedly showed. What’s more, Dudley added, he himself had grown up in Mississippi and knows that the Gulf of Mexico is “an ecosystem that is used to oil.”

Nor has the BP oil disaster triggered the kind of changes in law and public priorities one might have expected. “Not much has actually changed,” says Mark Davis of Tulane. “It reflects just how wedded our country is to keeping the Gulf of Mexico producing oil and bringing it to our shores as cheaply as possible. Going forward, no one should assume that just because something really bad happened we’re going to manage oil and gas production with greater sensitivity and wisdom. That will only happen if people get involved and compel both the industry and the government to be more diligent.”

And so the worst environmental disaster in U.S. history has been whitewashed—its true dimensions obscured, its victims forgotten, its lessons ignored. Who says cover-ups never work?

Mark Hertsgaard is a fellow at the New American Foundation and the author, most recently, of HOT: Living Through the Next Fifty Years on Earth. This article was reported in partnership with the Investigative Fund at the Nation Institute.

Earth’s Current Warmth Not Seen in the Last 1,400 Years or More, Says Study (Science Daily)

Apr. 21, 2013 — Fueled by industrial greenhouse gas emissions, Earth’s climate warmed more between 1971 and 2000 than during any other three-decade interval in the last 1,400 years, according to new regional temperature reconstructions covering all seven continents. This period of human-made global warming, which continues today, reversed a natural cooling trend that lasted several hundred years, according to results published in the journal Nature Geoscience by more than 80 scientists from 24 nations analyzing climate data from tree rings, pollen, cave formations, ice cores, lake and ocean sediments, and historical records from around the world.

During Europe’s 2003 heat wave, July temperatures in France were as much as 18 degrees F hotter than in 2001. (Credit: NASA)

“This paper tells us what we already knew, except in a better, more comprehensive fashion,” said study co-author Edward Cook, a tree-ring scientist at Lamont-Doherty Earth Observatory who led the Asia reconstruction.

The study also found that Europe’s 2003 heat wave and drought, which killed an estimated 70,000 people, happened during Europe’s hottest summer of the last 2,000 years. “Summer temperatures were intense that year and accompanied by a lack of rain and very dry soil conditions over much of Europe,” said study co-author Jason Smerdon, a climate scientist at Lamont-Doherty and one of the lead contributors to the Europe reconstruction. Though summer 2003 set a record for Europe, global warming was only one of the factors that contributed to the temperature conditions that summer, he said.

The study is the latest to show that the Medieval Warm Period, from about 950 to 1250, may not have been global, and may not have happened at the same time in places that did grow warmer. While parts of Europe and North America were fairly warm between 950 and 1250, South America stayed relatively cold, the study says. Some people have argued that the natural warming that occurred during the medieval ages is happening today, and that humans are not responsible for modern-day global warming. Scientists are nearly unanimous in their disagreement. “If we went into another Medieval Warm Period again, that extra warmth would be added on top of warming from greenhouse gases,” said Cook.

Temperatures varied less between continents in the same hemisphere than between hemispheres. “Distinctive periods, such as the Medieval Warm Period or the Little Ice Age stand out, but do not show a globally uniform pattern,” said co-author Heinz Wanner, a scientist at the University of Bern. By 1500, temperatures dropped below the long-term average everywhere, though colder temperatures emerged several decades earlier in the Arctic, Europe and Asia.

The most consistent trend across all regions in the last 2,000 years was a long-term cooling, likely caused by a rise in volcanic activity, decrease in solar irradiance, changes in land-surface vegetation, and slow variations in Earth’s orbit. With the exception of Antarctica, cooling tapered off at the end of the 19th century, with the onset of industrialization. Cooler 30-year periods between 830 and 1910 were particularly pronounced during weak solar activity and strong tropical volcanic eruptions. Both phenomena often occurred simultaneously and led to a drop in the average temperature during five distinct 30- to 90-year intervals between 1251 and 1820. Warming in the 20th century was on average twice as large in the northern continents as it was in the Southern Hemisphere. During the past 2000 years, some regions experienced warmer 30-year intervals than during the late 20th century. For example, in Europe the years between 21 and 80 AD were likely warmer than the period 1971-2000.

Mathematical Models Out-Perform Doctors in Predicting Cancer Patients’ Responses to Treatment (Science Daily)

Apr. 19, 2013 — Mathematical prediction models are better than doctors at predicting the outcomes and responses of lung cancer patients to treatment, according to new research presented today (Saturday) at the 2nd Forum of the European Society for Radiotherapy and Oncology (ESTRO).

These differences apply even after the doctor has seen the patient, which can provide extra information, and knows what the treatment plan and radiation dose will be.

“The number of treatment options available for lung cancer patients are increasing, as well as the amount of information available to the individual patient. It is evident that this will complicate the task of the doctor in the future,” said the presenter, Dr Cary Oberije, a postdoctoral researcher at the MAASTRO Clinic, Maastricht University Medical Center, Maastricht, The Netherlands. “If models based on patient, tumour and treatment characteristics already out-perform the doctors, then it is unethical to make treatment decisions based solely on the doctors’ opinions. We believe models should be implemented in clinical practice to guide decisions.”

Dr Oberije and her colleagues in The Netherlands used mathematical prediction models that had already been tested and published. The models use information from previous patients to create a statistical formula that can be used to predict the probability of outcome and responses to treatment using radiotherapy with or without chemotherapy for future patients.

Having obtained predictions from the mathematical models, the researchers asked experienced radiation oncologists to predict the likelihood of lung cancer patients surviving for two years, or suffering from shortness of breath (dyspnea) and difficulty swallowing (dysphagia), at two points in time:

1) after they had seen the patient for the first time, and

2) after the treatment plan was made.

At the first time point, the doctors predicted two-year survival for 121 patients, dyspnea for 139 patients and dysphagia for 146 patients. At the second time point, predictions were only available for 35, 39 and 41 patients respectively.

For all three predictions and at both time points, the mathematical models substantially outperformed the doctors’ predictions, with the doctors’ predictions being little better than those expected by chance.

The researchers plotted the results on a special graph [1] on which the area below the plotted line is used for measuring the accuracy of predictions; 1 represents a perfect prediction, while 0.5 represents predictions that were right in 50% of cases, i.e. the same as chance. They found that the model predictions at the first time point were 0.71 for two-year survival, 0.76 for dyspnea and 0.72 for dysphagia. In contrast, the doctors’ predictions were 0.56, 0.59 and 0.52 respectively.
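As a concrete illustration of the scoring described above, here is a minimal sketch with invented numbers (not the study’s data). The AUC is the probability that a randomly chosen patient who had the outcome is ranked above a randomly chosen patient who did not:

```python
# Illustrative sketch with hypothetical numbers (not the study's data):
# the AUC measures how often a patient who had the outcome is ranked
# above one who did not, so 1.0 is perfect and 0.5 is chance.

def auc(labels, scores):
    """Pairwise (Mann-Whitney) estimate of the ROC AUC."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical outcomes: 1 = died within two years, 0 = survived.
outcomes = [1, 0, 1, 1, 0, 0, 1, 0]
model_scores  = [0.9, 0.2, 0.7, 0.8, 0.3, 0.4, 0.6, 0.1]
doctor_scores = [0.6, 0.5, 0.4, 0.6, 0.5, 0.5, 0.4, 0.5]

print(auc(outcomes, model_scores))   # 1.0 -- every at-risk patient ranked above every survivor
print(auc(outcomes, doctor_scores))  # 0.5 -- no better than chance
```

With these invented scores the gap is exaggerated for clarity; the study’s reported AUCs (about 0.7 for the models versus about 0.5–0.6 for the doctors) sit between the two extremes.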

The models had a better positive predictive value (PPV) — a measure of the proportion of patients who were correctly assessed as being at risk of dying within two years or suffering from dyspnea and dysphagia — than the doctors. The negative predictive value (NPV) — a measure of the proportion of patients that would not die within two years or suffer from dyspnea and dysphagia — was comparable between the models and the doctors.
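The PPV and NPV definitions above can be sketched with an invented confusion matrix (the counts below are hypothetical, not the study’s):

```python
# Hypothetical counts, for illustration only: patients a model flagged
# as high risk (positive) or low risk (negative) versus what happened.
tp = 30   # flagged high risk, and did die within two years
fp = 10   # flagged high risk, but survived
tn = 50   # flagged low risk, and survived
fn = 10   # flagged low risk, but died within two years

ppv = tp / (tp + fp)   # positive predictive value: 30/40 = 0.75
npv = tn / (tn + fn)   # negative predictive value: 50/60, about 0.83

print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # PPV = 0.75, NPV = 0.83
```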

“This indicates that the models were better at identifying high risk patients that have a very low chance of surviving or a very high chance of developing severe dyspnea or dysphagia,” said Dr Oberije.

The researchers say that it is important that further research is carried out into how prediction models can be integrated into standard clinical care. In addition, it is important to improve the models further by incorporating the latest advances in areas such as genetics, imaging and other factors. This will make it possible to tailor treatment to the individual patient’s biological make-up and tumour type.

“In our opinion, individualised treatment can only succeed if prediction models are used in clinical practice. We have shown that current models already outperform doctors. Therefore, this study can be used as a strong argument in favour of using prediction models and changing current clinical practice,” said Dr Oberije.

“Correct prediction of outcomes is important for several reasons,” she continued. “First, it offers the possibility to discuss treatment options with patients. If survival chances are very low, some patients might opt for a less aggressive treatment with fewer side-effects and better quality of life. Second, it could be used to assess which patients are eligible for a specific clinical trial. Third, correct predictions make it possible to improve and optimise the treatment. Currently, treatment guidelines are applied to the whole lung cancer population, but we know that some patients are cured while others are not and some patients suffer from severe side-effects while others don’t. We know that there are many factors that play a role in the prognosis of patients and prediction models can combine them all.”

At present, prediction models are not used as widely as they could be by doctors. Dr Oberije says there are a number of reasons: some models lack clinical credibility; others have not yet been tested; the models need to be available and easy to use by doctors; and many doctors still think that seeing a patient gives them information that cannot be captured in a model. “Our study shows that it is very unlikely that a doctor can outperform a model,” she concluded.

President of ESTRO, Professor Vincenzo Valentini, a radiation oncologist at the Policlinico Universitario A. Gemelli, Rome, Italy, commented: “The booming growth of biological, imaging and clinical information will challenge the decision capacity of every oncologist. The understanding of the knowledge management sciences is becoming a priority for radiation oncologists in order for them to tailor their choices to cure and care for individual patients.”

[1] For the mathematicians among you, the graph is known as an Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC).

[2] This work was partially funded by grants from the Dutch Cancer Society (KWF), the European Fund for Regional Development (INTERREG/EFRO), and the Center for Translational Molecular Medicine (CTMM).

Carbon bubble will plunge the world into another financial crisis – report (The Guardian)

Trillions of dollars at risk as stock markets inflate value of fossil fuels that may have to remain buried forever, experts warn

Damian Carrington – The Guardian, Friday 19 April 2013

Carbon bubble : carbon dioxide polluting power plant : coal-fired Bruce Mansfield Power Plant

Global stock markets are betting on countries failing to adhere to legally binding carbon emission targets. Photograph: Robert Nickelsberg/Getty Images

The world could be heading for a major economic crisis as stock markets inflate an investment bubble in fossil fuels to the tune of trillions of dollars, according to leading economists.

“The financial crisis has shown what happens when risks accumulate unnoticed,” said Lord (Nicholas) Stern, a professor at the London School of Economics. He said the risk was “very big indeed” and that almost all investors and regulators were failing to address it.

The so-called “carbon bubble” is the result of an over-valuation of oil, coal and gas reserves held by fossil fuel companies. According to a report published on Friday, at least two-thirds of these reserves will have to remain underground if the world is to meet existing internationally agreed targets to avoid the threshold for “dangerous” climate change. If the agreements hold, these reserves will be in effect unburnable and so worthless – leading to massive market losses. But the stock markets are betting on countries’ inaction on climate change.

The stark report is by Stern and the thinktank Carbon Tracker. Their warning is supported by organisations including HSBC, Citi, Standard and Poor’s and the International Energy Agency. The Bank of England has also recognised that a collapse in the value of oil, gas and coal assets as nations tackle global warming is a potential systemic risk to the economy, with London being particularly at risk owing to its huge listings of coal.

Stern said that far from reducing efforts to develop fossil fuels, the top 200 companies spent $674bn (£441bn) in 2012 to find and exploit even more new resources, a sum equivalent to 1% of global GDP, which could end up as “stranded” or valueless assets. Stern’s landmark 2006 report on the economic impact of climate change – commissioned by the then chancellor, Gordon Brown – concluded that spending 1% of GDP would pay for a transition to a clean and sustainable economy.

The world’s governments have agreed to restrict the global temperature rise to 2C, beyond which the impacts become severe and unpredictable. But Stern said the investors clearly did not believe action to curb climate change was going to be taken. “They can’t believe that and also believe that the markets are sensibly valued now.”

“They only believe environmental regulation when they see it,” said James Leaton, from Carbon Tracker and a former PwC consultant. He said short-termism in financial markets was the other major reason for the carbon bubble. “Analysts say you should ride the train until just before it goes off the cliff. Each thinks they are smart enough to get off in time, but not everyone can get out of the door at the same time. That is why you get bubbles and crashes.”

Paul Spedding, an oil and gas analyst at HSBC, said: “The scale of ‘listed’ unburnable carbon revealed in this report is astonishing. This report makes it clear that ‘business as usual’ is not a viable option for the fossil fuel industry in the long term. [The market] is assuming it will get early warning, but my worry is that things often happen suddenly in the oil and gas sector.”

HSBC warned that 40-60% of the market capitalisation of oil and gas companies was at risk from the carbon bubble, with the top 200 fossil fuel companies alone having a current value of $4tn, along with $1.5tn debt.

Lord McFall, who chaired the Commons Treasury select committee for a decade, said: “Despite its devastating scale, the banking crisis was at its heart an avoidable crisis: the threat of significant carbon writedown has the unmistakable characteristics of the same endemic problems.”

The report calculates that the world’s currently indicated fossil fuel reserves equate to 2,860bn tonnes of carbon dioxide, but that just 31% could be burned for an 80% chance of keeping below a 2C temperature rise. For a 50% chance of 2C or less, just 38% could be burned.
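The report’s headline figures, as quoted above, reduce to one line of arithmetic (the 2,860bn-tonne reserve figure and the 31%/38% budgets are the article’s; the rounding is mine):

```python
# Back-of-the-envelope check of the figures quoted above. All inputs
# come from the article; nothing here is independently sourced.
reserves_gt_co2 = 2860  # indicated fossil fuel reserves, in Gt of CO2

burnable_80pct = 0.31 * reserves_gt_co2  # budget for an 80% chance of staying under 2C
burnable_50pct = 0.38 * reserves_gt_co2  # budget for a 50% chance

print(round(burnable_80pct))  # 887 Gt CO2 may be burned in the 80% case
print(round(burnable_50pct))  # 1087 Gt CO2 in the 50% case
print(f"{1 - 0.31:.0%} must stay in the ground for the 80% case")  # 69%
```

The 69% left in the ground is what the article summarizes as “at least two-thirds of these reserves will have to remain underground.”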

Carbon capture and storage technology, which buries emissions underground, can play a role in the future, but even an optimistic scenario which sees 3,800 commercial projects worldwide would allow only an extra 4% of fossil fuel reserves to be burned. There are currently no commercial projects up and running. The normally conservative International Energy Agency has also concluded that a major part of fossil fuel reserves is unburnable.

Citi bank warned investors in Australia’s vast coal industry that little could be done to avoid the future loss of value in the face of action on climate change. “If the unburnable carbon scenario does occur, it is difficult to see how the value of fossil fuel reserves can be maintained, so we see few options for risk mitigation.”

Ratings agencies have expressed concerns, with Standard and Poor’s concluding that the risk could lead to the downgrading of the credit ratings of oil companies within a few years.

Steven Oman, senior vice-president at Moody’s, said: “It behoves us as investors and as a society to know the true cost of something so that intelligent and constructive policy and investment decisions can be made. Too often the true costs are treated as unquantifiable or even ignored.”

Jens Peers, who manages €4bn (£3bn) for Mirova, part of €300bn asset managers Natixis, said: “It is shocking to see the report’s numbers, as they are worse than people realise. The risk is massive, but a lot of asset managers think they have a lot of time. I think they are wrong.” He said a key moment will come in 2015, the date when the world’s governments have pledged to strike a global deal to limit carbon emissions. But he said that fund managers need to move now. If they wait till 2015, “it will be too late for them to take action.”

Pension funds are also concerned. “Every pension fund manager needs to ask themselves have we incorporated climate change and carbon risk into our investment strategy? If the answer is no, they need to start to now,” said Howard Pearce, head of pension fund management at the Environment Agency, which holds £2bn in assets.

Stern and Leaton both point to China as evidence that carbon cuts are likely to be delivered. China’s leaders have said its coal use will peak in the next five years, said Leaton, but this has not been priced in. “I don’t know why the market does not believe China,” he said. “When it says it is going to do something, it usually does.” He said the US and Australia were banking on selling coal to China but that this “doesn’t add up”.

Jeremy Grantham, a billionaire fund manager who oversees $106bn of assets, said his company was on the verge of pulling out of all coal and unconventional fossil fuels, such as oil from tar sands. “The probability of them running into trouble is too high for me to take that risk as an investor.” He said: “If we mean to burn all the coal and any appreciable percentage of the tar sands, or other unconventional oil and gas then we’re cooked. [There are] terrible consequences that we will lay at the door of our grandchildren.”

Politicians Found to Be More Risk-Tolerant Than the General Population (Science Daily)

Apr. 16, 2013 — According to a recent study, the popularly elected members of the German Bundestag are substantially more risk-tolerant than the broader population of Germany. Researchers in the Cluster of Excellence “Languages of Emotion” at Freie Universität Berlin and at DIW Berlin (German Institute for Economic Research) conducted a survey of Bundestag representatives and analyzed data on the general population from the German Socio-Economic Panel Study (SOEP). Results show that risk tolerance is even higher among Bundestag representatives than among self-employed people, who are themselves more risk-tolerant than salaried employees or civil servants. This was true for all areas of risk that were surveyed in the study: automobile driving, financial investments, sports and leisure activities, career, and health. The authors interpret this finding as positive.

The full results of the study were published in German in the SOEPpapers series of the German Institute for Economic Research (DIW Berlin).

The authors of the study, Moritz Hess (University of Mannheim), Prof. Dr. Christian von Scheve (Freie Universität Berlin and DIW Berlin), Prof. Dr. Jürgen Schupp (DIW Berlin and Freie Universität Berlin), and Prof. Dr. Gert G. Wagner (DIW Berlin and Technische Universität Berlin) view the above-average risk tolerance found among Bundestag representatives as positive. According to sociologist and lead author of the study Moritz Hess: “Otherwise, important societal decisions often wouldn’t be made due to the almost incalculable risks involved. This would lead to stagnation and social standstill.” The authors do not interpret the higher risk-tolerance found among politicians as a threat to democracy. “The results show a successful and sensible division of labor among citizens, voters, and politicians,” says economist Gert G. Wagner. Democratic structures and parliamentary processes, he argues, act as a brake on the individual risk propensity of elected representatives and politicians.

For their study, the research team distributed written questionnaires to all 620 members of the 17th German Bundestag in late 2011. Twenty-eight percent of Bundestag members responded. Comparisons with the statistical characteristics of all current Bundestag representatives showed that the respondents comprise a representative sample of Bundestag members. SOEP data were used to obtain a figure for the risk tolerance of the general population for comparison with the figures for Bundestag members.

The questions posed to Bundestag members were formulated analogously to the questions in the standard SOEP questionnaire. Politicians were asked to rate their own risk tolerance on a scale from zero (= not at all risk-tolerant) to ten (= very risk-tolerant). They rated both their general risk tolerance and their specific risk tolerance in the areas of driving, making financial investments, sports and leisure activities, career, health, and trust towards strangers. They also rated their risk tolerance in regard to political decisions. No questions on party affiliation were asked, in order to exclude the possibility that the results could be used for partisan political purposes.

References:

Hess, M., von Scheve, C., Schupp, J., Wagner, G. G. (2013): Members of German Federal Parliament More Risk-Loving Than General Population, in: DIW Economic Bulletin, Vol. 3, No. 4, 2013, pp. 20-24.

Hess, M., von Scheve, C., Schupp, J., Wagner, G. G. (2013): Sind Politiker risikofreudiger als das Volk? Eine empirische Studie zu Mitgliedern des Deutschen Bundestags, SOEPpaper No. 545, DIW Berlin.

Ocean’s Future Not So Bleak? Resilience Found in Shelled Plants Exposed to Ocean Acidification (Science Daily)

Apr. 12, 2013 — Marine scientists have long understood the detrimental effect of fossil fuel emissions on marine ecosystems. But a group led by a UC Santa Barbara professor has found a point of resilience in a microscopic shelled plant with a massive environmental impact, which suggests the future of ocean life may not be so bleak.

This shows cells of the coccolithophore species Emiliania huxleyi strain NZEH under present-day, left, and future high, right, carbon dioxide conditions. (Credit: UCSB)

As fossil fuel emissions increase, so does the amount of carbon dioxide oceans absorb and dissolve, lowering their pH levels. “As pH declines, there is this concern that marine species that have shells may start dissolving or may have more difficulty making calcium carbonate, the chalky substance that they use to build shells,” said Debora Iglesias-Rodriguez, a professor in UCSB’s Department of Ecology, Evolution and Marine Biology.

Iglesias-Rodriguez and postdoctoral researcher Bethan Jones, who is now at Rutgers University, led a large-scale study on the effects of ocean acidification on these tiny plants that can only be seen under the microscope. Their research, funded by the European Project on Ocean Acidification, is published in the journal PLoS ONE and breaks with traditional notions about the vitality of calcifiers, or creatures that make shells, in future ocean conditions.

“The story years ago was that ocean acidification was going to be bad, really bad for calcifiers,” said Iglesias-Rodriguez, whose team discovered that one species of the tiny single-celled marine coccolithophore, Emiliania huxleyi, actually had bigger shells in high-carbon-dioxide seawater conditions. While the team acknowledges that calcification tends to decline with acidification, “we now know that there are variable responses in sea corals, in sea urchins, in all shelled organisms that we find in the sea.”

These E. huxleyi are a large army of ocean-regulating shell producers that create oxygen as they process carbon by photosynthesis and fortify the ocean food chain. As one of Earth’s main vaults for environmentally harmful carbon emissions, their survival affects organisms inside and outside the marine system. However, as increasing levels of atmospheric carbon dioxide cause seawater to slide down the pH scale toward acidic levels, this environment could become less hospitable.

The UCSB study incorporated an approach known as shotgun proteomics to uncover how E. huxleyi’s biochemistry could change in future high carbon dioxide conditions, which were set at four times the current levels for the study. This approach casts a wider investigative net that looks at all changes and influences in the environment as opposed to looking at individual processes like photosynthesis.

Shotgun proteomics examines the type, abundance, and alterations in proteins to understand how a cell’s machinery is conditioned by ocean acidification. “There is no perfect approach,” said Iglesias-Rodriguez. “They all have their caveats, but we think that this is a way of extracting a lot of information from this system.”

To mirror natural ocean conditions, the team used over half a ton of seawater to grow the E. huxleyi and bubbled in carbon dioxide to recreate both present day and high future carbon levels. It took more than six months for the team to grow enough plants to accumulate and analyze sufficient proteins.

The team found that E. huxleyi cells exposed to higher carbon dioxide conditions were larger and contained more shell than those grown in current conditions. However, they also found that these larger cells grow slower than those under current carbon dioxide conditions. Aside from slower growth, the higher carbon dioxide levels did not seem to affect the cells even at the biochemical level, as measured by the shotgun proteomic approach.

“The E. huxleyi increased the amount of calcite they had because they kept calcifying but slowed down division rates,” said Iglesias-Rodriguez. “You get fewer cells but they look as healthy as those under current ocean conditions, so the shells are not simply dissolving away.”

The team stresses that while representatives of this species seem to have biochemical mechanisms to tolerate even very high levels of carbon dioxide, slower growth could become problematic. If other species grow faster, E. huxleyi could be outnumbered in some areas.

“The cells in this experiment seemed to tolerate future ocean conditions,” said Jones. “However, what will happen to this species in the future is still an open question. Perhaps the grow-slow outcome may end up being their downfall as other species could simply outgrow and replace them.”

Journal Reference:

  1. Bethan M. Jones, M. Debora Iglesias-Rodriguez, Paul J. Skipp, Richard J. Edwards, Mervyn J. Greaves, Jeremy R. Young, Henry Elderfield, C. David O’Connor. Responses of the Emiliania huxleyi Proteome to Ocean Acidification. PLoS ONE, 2013; 8 (4): e61868. DOI: 10.1371/journal.pone.0061868

Carbon Dioxide Removal Can Lower Costs of Climate Protection (Science Daily)

Apr. 12, 2013 — Directly removing CO2 from the air has the potential to alter the costs of climate change mitigation. It could allow continued greenhouse-gas emissions from sectors like transport that are difficult, and thus expensive, to wean off fossil fuels. And it may help to constrain the financial burden on future generations, a study now published by the Potsdam Institute for Climate Impact Research (PIK) shows. It focuses on the use of biomass for energy generation, combined with carbon capture and storage (CCS). According to the analysis, carbon dioxide removal could be used under certain requirements to alleviate the most costly components of mitigation, but it would not replace the bulk of actual emissions reductions.

“Carbon dioxide removal from the atmosphere makes it possible to separate emissions control from the time and location of the actual emissions. This flexibility can be important for climate protection,” says lead author Elmar Kriegler. “You don’t have to prevent emissions in every factory or truck, but could for instance plant grasses that suck CO2 out of the air to grow — and later get processed in bioenergy plants where the CO2 gets stored underground.”

In economic terms, this flexibility lowers costs by compensating for the emissions that would be most costly to eliminate. “This means that a phase-out of global emissions by the end of the century — which we would need to hold the 2-degree line adopted by the international community — does not necessarily require eliminating each and every source of emissions,” says Kriegler. “Decisions whether and how to protect future generations from the risks of climate change have to be made today, but the burden of achieving these targets will increase over time. The costs for future generations can be substantially reduced if carbon dioxide removal technologies become available in the long run.”

Balancing the financial burden across generations

The study now published is the first to quantify this. If bioenergy plus CCS is available, aggregate mitigation costs over the 21st century might be halved. In the absence of such a carbon dioxide removal strategy, costs for future generations rise significantly, up to a quadrupling of mitigation costs in the period 2070 to 2090. The calculation was carried out using a computer simulation of the economic system, energy markets, and climate, covering a range of scenarios.

Options for carbon dioxide removal from the atmosphere include afforestation and chemical approaches like direct air capture of CO2 from the atmosphere or reactions of CO2 with minerals to form carbonates. But the use of biomass for energy generation combined with carbon capture and storage is less costly than chemical options, as long as sufficient biomass feedstock is available, the scientists point out.

Serious concerns about large-scale biomass use combined with CCS

“Of course, there are serious concerns about the sustainability of large-scale biomass use for energy,” says co-author Ottmar Edenhofer, chief economist of PIK. “We therefore considered the bioenergy with CCS option only as an example of the role that carbon dioxide removal could play for climate change mitigation.” The exploitation of bioenergy can conflict with land use for food production or ecosystem protection. To account for sustainability concerns, the study restricts bioenergy production to a medium level that could be realized mostly on abandoned agricultural land.

Still, there are important uncertainties here: global population growth and changing dietary habits increase the demand for land, while improvements in agricultural productivity decrease it. Furthermore, CCS technology is not yet available for industrial-scale use and, due to environmental concerns, is controversial in countries like Germany. Yet the study assumes it will become available in the near future.

“CO2 removal from the atmosphere could enable humankind to keep the window of opportunity open for low-stabilization targets despite a likely delay in international cooperation, but only under certain requirements,” says Edenhofer. “The risks of scaling up bioenergy use need to be better understood, and safety concerns about CCS have to be thoroughly investigated. Still, carbon dioxide removal technologies are no science fiction and need to be further explored.” In no way should they be seen as a pretext to neglect emissions reductions now, notes Edenhofer. “By far the biggest share of climate change mitigation has to come from a large effort to reduce greenhouse-gas emissions globally.”

Journal Reference:

  1. Elmar Kriegler, Ottmar Edenhofer, Lena Reuster, Gunnar Luderer, David Klein. Is atmospheric carbon dioxide removal a game changer for climate change mitigation? Climatic Change, 2013; DOI: 10.1007/s10584-012-0681-4

Maya Long Count Calendar Calibrated to Modern European Calendar Using Carbon-14 Dating (Science Daily)

Apr. 11, 2013 — The Maya are famous for their complex, intertwined calendric systems, and now one calendar, the Maya Long Count, is empirically calibrated to the modern European calendar, according to an international team of researchers.

Elaborately carved wooden lintel or ceiling from a temple in the ancient Maya city of Tikal, Guatemala, that carries a carving and dedication date in the Maya calendar. (Credit: Courtesy of the Museum der Kulturen)

“The Long Count calendar fell into disuse before European contact in the Maya area,” said Douglas J. Kennett, professor of environmental archaeology, Penn State.

“Methods of tying the Long Count to the modern European calendar used known historical and astronomical events, but when looking at how climate affects the rise and fall of the Maya, I began to question how accurately the two calendars correlated using those methods.”

The researchers found that the new measurements mirrored the most popular method in use, the Goodman-Martinez-Thompson (GMT) correlation, initially put forth by Joseph Goodman in 1905 and subsequently modified by others. In the 1950s scientists tested this correlation using early radiocarbon dating, but the large error range left the validity of the GMT correlation open to question.

“With only a few dissenting voices, the GMT correlation is widely accepted and used, but it must remain provisional without some form of independent corroboration,” the researchers report in today’s (April 11) issue of Scientific Reports.

A combination of high-resolution accelerator mass spectrometry carbon-14 dates and a calibration using tree growth rates showed the GMT correlation is correct.

The Long Count counts days from a mythological starting point. A date is written as five components separated, in standard notation, by dots: the Bak’tun (a multiplier of 144,000 days), K’atun (7,200 days), Tun (360 days), Winal (20 days), and K’in (1 day).
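
As a minimal illustration of these place values, a short script can turn a Long Count date into a total day count. The example date and the 584,283-day offset of the widely used GMT correlation are standard reference values, not figures taken from this article:

```python
# Place values of the five Long Count components described above:
# Bak'tun, K'atun, Tun, Winal, K'in.
PLACE_VALUES = (144_000, 7_200, 360, 20, 1)

def long_count_to_days(baktun, katun, tun, winal, kin):
    """Total days elapsed since the Long Count's mythological zero date."""
    components = (baktun, katun, tun, winal, kin)
    return sum(c * v for c, v in zip(components, PLACE_VALUES))

# Example: the Classic-period date 9.15.10.0.0 in standard dot notation.
days = long_count_to_days(9, 15, 10, 0, 0)
print(days)  # 1407600

# Under the GMT correlation, adding the constant 584,283 gives the
# Julian Day Number, which ties the date to the European calendar.
print(days + 584_283)  # 1991883
```

Because each place value divides the next one up (20 × 360 = 7,200; 20 × 7,200 = 144,000), the conversion is the same positional arithmetic used for any mixed-radix number.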

Archaeologists want to place the Long Count dates into the European calendar so there is an understanding of when things happened in the Maya world relative to historic events elsewhere. Correlation also allows the rich historical record of the Maya to be compared with other sources of environmental, climate and archaeological data calibrated using the European calendar.

The samples came from an elaborately carved wooden lintel or ceiling from a temple in the ancient Maya city of Tikal, Guatemala, that carries a carving and dedication date in the Maya calendar. This same lintel was one of three analyzed in the previous carbon-14 study.

Researchers measured tree growth by tracking annual changes in calcium uptake by the trees, which is greater during the rainy season.

The amount of carbon-14 in the atmosphere is incorporated into a tree’s incremental growth. Atmospheric carbon-14 changes through time, and during the Classic Maya period oscillated up and down.

The researchers took four samples from the lintel and used annually fluctuating calcium concentrations evident in the incremental growth of the tree to determine the true time distance between each by counting the number of elapsed rainy seasons. The researchers used this information to fit the four radiocarbon dates to the wiggles in the calibration curve. Wiggle-matching the carbon-14 dates provided a more accurate age for linking the Maya and Long Count dates to the European calendars.
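
The wiggle-matching step can be sketched in a few lines: because counting rainy seasons fixes the spacing between the four samples, only a single calendar-age offset has to be fitted against the calibration curve. The curve and numbers below are illustrative toy data, not the study’s values; real analyses use measured calibration curves and Bayesian fitting:

```python
def wiggle_match(offsets, c14_ages, calib):
    """Return the calendar start year minimizing the squared misfit
    between measured radiocarbon ages and the calibration curve.

    offsets:  years of growth between sample i and the first sample
              (known from counting annual growth increments).
    c14_ages: measured radiocarbon ages for each sample.
    calib:    dict mapping calendar year -> radiocarbon age on the curve.
    """
    best_year, best_err = None, float("inf")
    for start in calib:
        try:
            err = sum((c14_ages[i] - calib[start + offsets[i]]) ** 2
                      for i in range(len(offsets)))
        except KeyError:
            continue  # sequence would run off the end of the curve
        if err < best_err:
            best_year, best_err = start, err
    return best_year

# Toy calibration curve and four samples whose true start year is 720.
calib = {year: 2000 - year for year in range(700, 800)}
offsets = [0, 10, 25, 40]
ages = [calib[720 + o] for o in offsets]
print(wiggle_match(offsets, ages, calib))  # 720
```

The fixed spacings are what make the fit sharp: four dates locked together slide along the curve as a rigid unit, which constrains the calendar age far better than four independent dates could.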

These calculations were further complicated by known differences in atmospheric radiocarbon content between the northern and southern hemispheres.

“The complication is that radiocarbon concentrations differ between the southern and northern hemisphere,” said Kennett. “The Maya area lies on the boundary, and the atmosphere is a mixture of the southern and northern hemispheres that changes seasonally. We had to factor that into the analysis.”

The researchers’ results mirror the GMT European date correlations, indicating that the GMT was on the right track for linking the Long Count and European calendars.

Events recorded in various Maya locations “can now be harmonized with greater assurance to other environmental, climatic and archaeological datasets from this and adjacent regions and suggest that climate change played an important role in the development and demise of this complex civilization,” the researchers wrote.

Journal Reference:

  1. Douglas J. Kennett, Irka Hajdas, Brendan J. Culleton, Soumaya Belmecheri, Simon Martin, Hector Neff, Jaime Awe, Heather V. Graham, Katherine H. Freeman, Lee Newsom, David L. Lentz, Flavio S. Anselmetti, Mark Robinson, Norbert Marwan, John Southon, David A. Hodell, Gerald H. Haug. Correlating the Ancient Maya and Modern European Calendars with High-Precision AMS 14C Dating. Scientific Reports, 2013; 3. DOI: 10.1038/srep01597

Segue o Seco (Rolling Stone)

Edição 77 – Fevereiro de 2013

Enquanto a Bahia sofre com “a pior seca dos últimos 50 anos”, os habitantes do sertão se desdobram para superar os percalços. A esperança persiste, mas é minguada como a água da chuva

Segue o SecoFoto: Flavio Forner

Por MAÍRA KUBÍK MANO

“Para o carro! para o carro! olha ali, em cima das pedras! Tá vendo?” Não, eu não via nada. A paisagem parecia exatamente a mesma da última meia hora. Toda cor de terra, com uma ou outra catingueira no horizonte e os mandacarus, sempre em maior número, acompanhando o traçado da estrada de chão. “Lembra da cena em que o Fabiano vai tentar pegar um preá? Olha ali!”, o interlocutor insiste, apontando. Vidro abaixado, olhos a postos. Dois bichos pequenos, amarronzados e amendoados, de focinho pontudo, se mexem e se fazem notar. Pronto, lá estão os preás. Júlio César Santos fica satisfeito. Afinal, ele fora parar no sertão justamente depois de ler Vidas Secas.

“Eu sou da Zona da Mata, mas quando li Graciliano Ramos quis vir para cá”, conta Santos, um engenheiro agrônomo que se encantou pela caatinga quando ainda era estudante da Universidade Federal do Recôncavo Baiano (UFRB). Hoje, é chefe do escritório da EBDA (Empresa Baiana de Desenvolvimento Agrícola) em Ipirá, um dos 258 municípios da Bahia em situação de emergência por causa da seca. Junto com outros 17 órgãos e secretarias do governo de Jaques Wagner (PT), a EDBA faz parte do Comitê Estadual de Ações de Convivência com a Seca.

Estamos a caminho da cidade vizinha, Pintadas, onde a estiagem é ainda mais crítica. No percurso, cruzamos quatro rios. Três deles, secos. O céu nublado ao longe parece o prenúncio da mudança. Um chuvisco havia caído naquela madrugada, algo que não acontecia há muito tempo. As marcas ainda estavam na terra, em alguns sulcos rasos que provavelmente abrigaram fios de água corrente. Santos parece aliviado. “Agora precisa chover mais”, diz.

Em uma curva à esquerda surge a casa de Messias e Ginalva Jesus Pereira. A plantação de palmas logo se destaca da monocromia – é verde-escura, com nenhum tom de marrom. Na seca, o vegetal tem sido fonte de alimento imprescindível para garantir a sobrevivência dos animais, que já não têm mais pasto. “O povo vem, visita, admira. Outros ficam com usura”, fala Ginalva, sobrancelhas levantadas, há cerca de 20 anos vivendo naquele roçado.

Como era de se esperar, a conversa envereda para o clima e as gotas que caíram à noite. “Choveu em Ipirá, foi? Ah, aqui foi só uma neblina”, rebate o pequeno Matheus, filho do meio de Ginalva. “Aqui não chove mesmo há três anos. Perdemos dois bezerros e dois umbuzeiros para a seca. Painho está pedindo a Deus para esse resto de palma pegar”, diz, referindo-se a uma área mais distante da casa, plantada há pouco, onde o verde já está quase desbotando.

O cálculo de Matheus não é exagerado. Geralmente, chove na caatinga entre janeiro e maio, justamente a época do plantio. Em 2012, porém, a água não caiu e um período de estiagem emendou no outro, fazendo desta a maior seca dos últimos 50 anos, segundo a Coordenação de Defesa Civil da Bahia (Cordec). A previsão é que ela se estenda por mais um ou dois anos. “Agora, com a chuva, vai ser outra coisa. Vai mudar tudo”, avalia uma experiente Ginalva. Assim como o protagonista Fabiano da obra de Graciliano Ramos, ela sabe que a caatinga ressuscita.

Na casa dela, canos estrategicamente posicionados aguardam a próxima precipitação para recolher a água em cisternas. Enquanto isso não ocorre, Ginalva mantém, por meio de irrigação artificial, a produção – que inclui também feijão de corda, cebolinha, coentro, mamão, batata-doce e quiabo, além da criação de ovinos, caprinos e bovinos. O poço, recém-construído, foi financiado via Pronaf (Programa Nacional de Fortalecimento da Agricultura Familiar) Emergencial.

Assim como Ginalva, outros 6 mil agricultores da região apresentaram projetos para acessar o Programa. Segundo o Banco do Nordeste do Brasil (BNB), foram liberados R$ 10 milhões do Pronaf Emergencial até janeiro de 2013 para os 17 municípios do entorno de Feira de Santana, entre eles Pintadas e Ipirá. “São pequenos agricultores que você vê aqui, solicitando financiamento para plantar palmas ou fazer aguada para recuperar o pasto”, diz José Wilson Junqueira Queiroz, gerente de negócios do BNB. Em todo o Brasil, entre maio e dezembro de 2012, o governo federal autorizou R$ 656,2 milhões em linhas de crédito emergenciais para atender os atingidos pela seca.

“São essas políticas públicas que estão segurando as famílias no campo”, avalia Jeane de Almeida Santiago. Agrônoma que trabalha em uma ONG chamada Fundação Apaeba, ela presta assistência técnica para os produtores de Pintadas, Ipirá, Riachão do Jacuípe, Pé de Serra, Baixa Grande e Nova Fátima, todas na Bahia. “Antes, tinha muito mais gente que ia para São Paulo e outros estados para fazer migração.”

O relato é de alguém que conhece de perto a situação. Jeane nasceu em Pintadas. Estudou na escola agrícola e saiu para fazer curso técnico em Juazeiro e faculdade no Recôncavo Baiano. Voltou quando se formou, querendo transmitir os conhecimentos aprendidos. Olhos vivos e atentos, ela muda o tom e reavalia sua afirmação: “É, mas este ano muitos jovens estão indo. Com a seca, a rentabilidade das propriedades está zero. E as pessoas não vão ficar aqui sem ter dinheiro. Infelizmente, são obrigadas a sair, de coração partido, para São Paulo em busca de trabalho, ver se conseguem mandar dinheiro para a família que ficou aqui manter o rebanho vivo”.

De fato, o ponto de ônibus de Pintadas estava cheio naquela manhã. A cidade ainda não tem rodoviária e o asfalto que a conecta com o resto do mundo foi inaugurado há apenas um ano, como avisam as placas do governo do estado logo na entrada. Todos aguardavam na calçada o próximo transporte para a capital paulista, malas e parentes em pé, sol a pino. Há cerca de três semanas, Ginalva se despedia ali mesmo do filho mais velho, de 18 anos, que decidiu tentar a vida fora dali. “Me ligou ontem dizendo que já arrumou um emprego numa fábrica. É temporário, mas é um emprego”, ela conta. É a famosa ponte aérea Pintadas-São Paulo.

“O pior é que não temos previsão boa para este ano”, lamenta Jeane. Ela conta que até a palma e o mandacaru, também usados para alimentar o rebanho, começaram a desaparecer, e que a maioria das terras da região está na mão de pequenos agricultores de subsistência ou pecuaristas. “Já faz mais de um ano que o município está dando ração aos animais porque não tem mais pasto. Mas agora a ração esgotou. Você procura e não acha. Quando acha, é um valor que não dá para colocar no orçamento.”

Jeane preocupa-se: “Tem produtores que estão pagando três ou quatro projetos. Vai chegar uma hora que ninguém vai conseguir pegar mais [crédito], de tanto que devem. E aí, não sei como vai ser. Porque a propriedade não está tendo rentabilidade para pagar os empréstimos que já deve. Sem crédito, eu acredito que na zona rural fica impossível.”

“A causa desta seca é a destruição do meio ambiente”, ela sentencia, citando uma pesquisa recente que constata que 90% da mata nativa da região havia desaparecido. “A natureza está respondendo. O território está descoberto. E a partir daí vêm as queimadas. Muitos solos já se perderam ou estão enfraquecidos. O pessoal não tem a cultura de adubar e vão explorando e explorando. Os rios que tínhamos morreram. As nascentes estão desmatadas.”

Em Ipirá, logo ao lado, a realidade é semelhante. No lugar da caatinga, estão os bois. A cena mais comum é ver o gado ou os cavalos amontoados embaixo das poucas árvores que restam para escapar do sol escaldante – cabeça na sombra, lombo de fora. “Ipirá era um município cheio de minifúndios”, explica Orlando Cintra, gerente de Agricultura e Cooperativismo da Prefeitura. “Os grandes criadores começaram a chegar nos anos 1960. Este pessoal comprou a terra barata e empurrou o homem que produzia a batata, a mandioca e a mamona para a periferia daqui ou para São Paulo, Mato Grosso e Paraná.” Outros tantos foram trabalhar no corte da cana-de-açúcar. “Aqui não tinha boi e os pequenos produtores não desmatavam”, continua. “O que criávamos mais era o bode. Foi com a chegada dos grandes fazendeiros que o clima em Ipirá começou a mudar mais rapidamente. Desmataram para plantar capim.”

“A caatinga não é uma área para agropecuária. É para criação de caprinos, ovinos, animais de médio porte. Trouxeram a cultura do Sul, de pecuarista, e todo mundo quis ter fazenda de boi aqui”, completa Meire Oliveira, assessora da Secretaria de Agricultura e Meio Ambiente de Ipirá.

Meire passou a infância na zona rural do município e ainda se lembra do cheiro dessa mata. Conta que, quando criança, fazia burros a partir de umbus: enfiava quatro pedaços de galhinhos na fruta, representando as quatro patas. “Pena que, muitas vezes, quando eu digo para não desmatar, nem meu pai me ouve”, lamenta. Ela parece conhecer todas as plantas da caatinga. Quando encontra um cacto coroa-de-frade, mostra que é possível comer seu fruto, pequenino e vermelho. Caminhando pelas propriedades da região, cruza as cercas de arame farpado com desenvoltura. Pega um punhado de maxixe ainda verde e explica como cozinhá-lo. “Igualzinho a quiabo, sabe?” No sertão, tudo pode ser aproveitado. “A caatinga tem um poder de regeneração incrível”, explica. “A solução seria deixá-la descansar. Algumas áreas no entorno do Rio do Peixe já estão em processo de desertificação.”

Um exemplo de preservação ambiental é o assentamento D. Mathias, que completou sete anos de existência. Ali, a caatinga aos poucos renasce entre bodes, cabras e ovelhas. As árvores são podadas apenas o suficiente para não machucarem os animais, que circulam livremente pelas aroeiras, xique-xiques e umbuzeiros. Organizado pelo Movimento Luta Camponesa (MLC), o símbolo do assentamento é uma família de retirantes desenhada em preto e vermelho. A fila é puxada por uma mulher com uma foice nas mãos. Em seguida vem um homem, com uma enxada nos ombros. Dois filhos, um menino e uma menina seguem-nos de mãos dadas. Por último, um cachorro que, quiçá, se chama Baleia.

Júlio César Santos, dirigente da EBDA, presta assistência aos assentados e explica que os camponeses estão muito atentos às políticas públicas e linhas de crédito oferecidas pelos governos estadual e federal. Com isso, já conseguiram construir casas, comprar uma resfriadeira de leite e ampliar a criação de ovelhas. Entre as últimas iniciativas no local está a plantação adensada de palmas, mais rentável do que a tradicional. Em um primeiro momento, os agricultores não confiaram na técnica e continuaram plantando os cactos distantes uns dos outros, como sempre fizeram. Para contornar as dificuldades, Santos utilizou o “método de Paulo Freire”. Plantou dois roçados: de um lado, as palmas, adensadas; de outro, as tradicionais. Agora, as duas estão crescendo e ele espera, em breve, provar sua teoria. “Tomara que a falta de chuva não queime elas”, diz.

O sucesso do assentamento motivou, há 11 meses, um acampamento no latifúndio vizinho. Leidinaura Souza Santana, ou simplesmente Leila, é uma das moradoras do acampamento Elenaldo Teixeira. “O problema maior aqui é a água para beber e cozinhar. Ficamos quase 15 dias sem água. O caminhão-pipa chegou só ontem”, reclama. “A Embasa [Empresa Baiana de Águas e Saneamento] suspendeu o pipa por causa do rio, que já estava muito baixo, e também porque deu um problema na bomba”, explica Meire, que acompanha a visita. “Tivemos que tomar uma água que não é boa para beber”, murmura Leila.

Leila nasceu em Coração de Maria, ao norte de Feira de Santana. O marido trabalhava como vaqueiro em Malhador, povoado no município de Ipirá, quando souberam dos boatos da ocupação. Vieram logo participar. “Estamos esperando chegar a hora para entrar dentro da fazenda e acabar com o sofrimento. A área já foi atestada como improdutiva. O assentamento aqui do lado é uma maravilha. Me animei de ver que esse pessoal era acampado como a gente. Não desisto, não”, afirma. Meire aproveita para dar uma injeção de ânimo: “Eu acompanhei o outro acampamento desde o começo e era igualzinho. Acho que era até mais quente que este. Este é mais fresco. E olha como estão hoje”.

A conversa acontece na escola do acampamento, onde jovens e adultos são alfabetizados. A pequena construção de palha e madeira da escola fica no início daquela que foi batizada de “Avenida Brasil”, uma sequência bem aprumada de cerca de 15 barracos de lona. Leila acabou de passar para a 4a série do ensino fundamental e soletra o nome para mim. “L-E-I-D-I-N-A-U-R-A.” “Não é com ‘l’, não?”, pergunta Meire. “Não, é com ‘u’ mesmo”, Leila responde.

Em Tamanduá, povoado do entorno de Ipirá, motos e jegues passam com gente e baldes na garupa. Tudo lembra a estiagem. Egecivaldo Oliveira Nunes está à beira da estrada, ao volante do caminhão-pipa estacionado em frente à casa azul e branca. “Só trabalho particular, não trabalho com Exército nem Prefeitura. Pegamos água das barragens porque os açudes estavam secos”, ele conta, afirmando que nos piores dias da seca não “acha tempo” para as entregas solicitadas. O pagamento é por distância, e a cada quilômetro rodado muda o valor: 5 quilômetros são equivalentes a 9 mil litros e custam R$ 80. Quem não puder pagar (como os acampados) pode esperar pela Defesa Civil estadual – que afirma ter investido R$ 4 milhões em caminhões-pipa – ou pelo Exército, que mensalmente abastece de água 137 municípios.

“A cada ano, a seca vem mais intensa e a tendência é sempre durar mais”, lamenta Orlando Cintra, gerente de Agricultura e Cooperativismo de Ipirá. “A perspectiva é a de que em cinco ou seis anos ninguém vá produzir mais nada aqui, na área da agricultura. O clima vem se transformando. A cada ano piora.”

“We’ve had so many forecasts, and nothing,” says Jeane Santiago. “A rain forecast runs on the news and people say: ‘I have no faith left; I’ll only believe it when I see it.’ People in the countryside have folk omens, like ‘if the mandacaru cactus blooms, it’s a sign that rain is coming.’ But every one of them has failed so far. Faith is running out.” The mandacarus have already bloomed. Their vivid red catches the eye. Now all there is to do is wait.

In Big Data, We Hope and Distrust (Huffington Post)

By Robert Hall

Posted: 04/03/2013 6:57 pm

“In God we trust. All others must bring data.” — W. Edwards Deming, statistician, quality guru

Big data helped reelect a president, helped find Osama bin Laden, and contributed to the meltdown of our financial system. We are in the midst of a data revolution in which social media introduces new terms like Arab Spring, Facebook depression, and Twitter anxiety that reflect a new reality: Big data is changing the social and relationship fabric of our culture.

We spend hours installing and learning to use the latest versions of our ever-expanding technology while enduring a never-ending battle to protect our information. Then we labor to develop practices for ridding ourselves of technology — rules for turning devices off during meetings or movies, legislation to outlaw texting while driving, restrictions in classrooms to prevent cheating, and meals or family time scheduled with devices turned off. Information and technology: We love it, hate it, can’t live with it, can’t live without it, use it voraciously, and distrust it immensely. I am schizophrenic, and so am I.

Big data is not only big but growing rapidly. According to IBM, we create 2.5 quintillion bytes of data a day, and “ninety percent of the data in the world has been created in the last two years.” Vast new computing capacity can analyze Web-browsing trails that track our every click, sensor signals from every conceivable device, GPS tracking, and social-network traffic. It is now possible to measure and monitor people and machines to an astonishing degree. How exciting, how promising. And how scary.

This is not our first data rodeo. The early stages of the customer relationship management movement were filled with hope and with hype. Large data warehouses were going to provide the kind of information that would make companies masters of customer relationships. There were just two problems. First, getting the data out of the warehouse wasn’t nearly as hard as getting it into the hands of the person or device interacting with customers in a way that added value, built trust, and expanded relationships. We seem to always underestimate the speed of technology and overestimate the speed at which we can absorb it and socialize around it.

Second, unfortunately, the customers didn’t get the memo and mostly decided, in their own rich wisdom, that they did not need or want “masters.” In fact, as providers became masters of knowing all the details of our lives, consumers became more concerned. So while many organizations were trying to learn more about customer histories, behaviors, and future needs, customers and even their governments were busy trying to protect privacy, security, and access. Anyone attempting to help an adult friend or family member with mental health issues has probably run into well-intentioned HIPAA rules (regulations that ensure the privacy of medical records) that unfortunately also restrict the ways you can assist them. Big data gives, and the fear of big data takes away.

Big data does not big relationships make. Over the last 20 years, as our data has kept getting stronger, our customer relationships have kept getting weaker. Eighty-six percent of consumers trust corporations less than they did five years ago. Customer retention across industries has fallen about 30 percent in recent years. Is it actually possible that we have unwittingly contributed to the undermining of our own customer relationships? How could that be? For one thing, as companies keep getting better at targeting messages to specific groups, those groups keep getting better at blocking the messages. As usual, the power to resist trumps the power to exert.

No matter how powerful big data becomes, if it is to realize its potential, it must build trust on three levels. First, customers must trust our intentions. Data that can be used for us can also be used against us. There is growing fear that institutions will become part of a “surveillance state.” While organizations have gone to great lengths to promote the protection of our data, the numbers reflect a fair amount of doubt. For example, according to MainStreet, “87 percent of Americans do not feel large banks are transparent and 68 percent do not feel their bank is on their side.”

Second, customers must trust our actions. Even if they trust our intentions, they might still fear that our actions put them at risk. Our private information can be hacked, then misused and disclosed in damaging and embarrassing ways. After the Sandy Hook tragedy, a New York newspaper published the names and addresses of over 33,000 licensed gun owners along with an interactive map that showed exactly where they lived. In response, names and addresses of the newspaper’s editor and writers were published online, along with information about their children. No one, including retired judges, law enforcement officers, and FBI agents, expected their private information to be published in the midst of a very high-decibel controversy.

Third, customers must trust the outcome — that sharing data will benefit them. Even with positive intentions and constructive actions, the results may range from disappointing to damaging. Most of us have provided email addresses or other contact data, around a customer service issue or the like, and then started receiving email, phone, or online solicitations. I know a retired executive who helps hard-to-hire people. She spent one evening surfing the Internet to research expunging criminal records for released felons. Years later, Amazon still greets her with books targeted to the felon it believes she is. Even with opt-out options, we feel used. Or we provide specific information, only to have to repeat it in the next transaction or interaction, never getting the hoped-for benefit of saving our time.

It will be challenging to grow trust at anywhere near the rate we grow the data. Information develops rapidly; competence and trust develop slowly. Investing heavily in big data while scrimping on trust will have the opposite of the desired effect. To quote Dolly Parton, who knows a thing or two about big: “It costs a lot of money to look this cheap.”

How Big Could a Man-Made Earthquake Get? (Popular Mechanics)

Scientists have found evidence that wastewater injection induced a record-setting quake in Oklahoma two years ago. How big can a man-made earthquake get, and will we see more of them in the future?

By Sarah Fecht – April 2, 2013 5:00 PM


Hydraulic fracking drilling illustration. Brandon Laufenberg/Getty Images

In November 2011, a magnitude-5.7 earthquake rattled Prague, Okla., and was felt in 16 other states. It flattened 14 homes and many other buildings, injured two people, and set the record as the state’s largest recorded earthquake. And according to a new study in the journal Geology, the event can also claim the title of largest earthquake ever induced by fluid injection.

In the paper, a team of geologists pinpoints the quake’s starting point less than 200 meters (about 650 feet) from an injection well where wastewater from oil drilling was being pumped into the ground at high pressure. At magnitude 5.7, the Prague earthquake was about 10 times stronger than the previous record holder: the magnitude-4.8 Rocky Mountain Arsenal earthquake in Colorado in 1967, caused by the U.S. Army injecting 148,000 gallons per day of fluid waste from chemical-weapons testing into a deep well. So how big can these man-made earthquakes get?
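The “about 10 times stronger” comparison can be checked against the standard seismological conversions: each whole unit of magnitude corresponds to roughly a tenfold increase in ground-motion amplitude and about a 32-fold increase in radiated energy. A minimal sketch of the arithmetic (the function names are mine; the formulas are the conventional magnitude relations):

```python
def amplitude_ratio(m1, m2):
    """Ratio of ground-motion amplitudes: 10x per magnitude unit."""
    return 10 ** (m1 - m2)

def energy_ratio(m1, m2):
    """Ratio of radiated seismic energy: 10^1.5 (~32x) per magnitude unit."""
    return 10 ** (1.5 * (m1 - m2))

# Prague, Okla. (M5.7) vs. Rocky Mountain Arsenal (M4.8)
print(round(amplitude_ratio(5.7, 4.8), 1))  # ~7.9x the shaking amplitude
print(round(energy_ratio(5.7, 4.8), 1))     # ~22.4x the energy released
```

By the amplitude measure the Prague quake was roughly 8 times stronger, and in energy terms roughly 22 times, so the article’s “about 10 times” is a reasonable round figure.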

The short answer is that scientists don’t really know yet, but it’s possible that fluid injection could cause some big ones on very rare occasions. “We don’t see any reason that there should be any upper limit for an earthquake that is induced,” says Bill Ellsworth, a geophysicist with the U.S. Geological Survey, who wasn’t involved in the new study.

As with natural earthquakes, most man-made earthquakes have been small to moderate in size, and most are detected only by seismometers. Larger quakes are orders of magnitude rarer than small ones: for every 1,000 magnitude-1.0 earthquakes that occur, expect about 100 magnitude-2.0s, 10 magnitude-3.0s, just one magnitude-4.0, and so on. And just as with natural earthquakes, the strength of an induced earthquake depends on the size of the nearby fault and the amount of stress acting on it. Some faults simply don’t have the capacity to produce big earthquakes, whether natural or induced.
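That tenfold drop in frequency per unit of magnitude is the Gutenberg-Richter law: the number of quakes of magnitude M scales as 10^(a − bM), with b typically near 1. A small sketch of the scaling described above (the function name and the b = 1 value are illustrative assumptions):

```python
def expected_count(m, n_ref=1000, m_ref=1.0, b=1.0):
    """Expected quake count at magnitude m, given n_ref quakes at
    reference magnitude m_ref (Gutenberg-Richter scaling, b ~ 1)."""
    return n_ref * 10 ** (-b * (m - m_ref))

for m in (1.0, 2.0, 3.0, 4.0):
    # Counts fall roughly tenfold per magnitude unit: ~1000, 100, 10, 1
    print(m, expected_count(m))
```

The b-value varies by region and by catalog, which is one reason seismologists track it when looking for changes in induced seismicity.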

How do Humans Trigger Earthquakes?

Faults are subject to two major kinds of stress: shear stress, which makes the two sides slide past each other along the fault line, and normal stress, which pushes the two sides together. Usually the normal stress keeps the fault from slipping sideways. But when a fluid is injected into the ground, as in Prague, it can reduce the normal stress and make it easier for the fault to slip. It’s as if you have a tall stack of books on a table, Ellsworth says: If you take half the books away, it’s easier to slide the stack across the table.

“Water increases the fluid pressure in pores of rocks, which acts against the pressure across the fault,” says Geoffrey Abers, a Columbia University geologist and one of the new study’s authors. “By increasing the fluid pressure, you’re decreasing the strength of the fault.”

A similar mechanism may be behind earthquakes induced by large water reservoirs. In those cases, the artificial lake behind a dam causes water to seep into pore spaces in the ground. In 1967, India’s Koyna Dam triggered a magnitude-6.5 earthquake that killed 177 people, injured more than 2,000, and left 50,000 homeless. Unprecedented seasonal fluctuations in the water level behind a dam in Oroville, Calif., are believed to be behind the magnitude-6.1 earthquake that occurred there in 1975.

Extracting a fluid from the ground can also contribute to triggering a quake. “Think about filling a balloon with water and burying it at the beach,” Ellsworth says. “If you let the water out, the sand will collapse inward.” Similarly, when humans remove large amounts of oil and natural gas from the ground, it can put additional stress on a fault line. “In this case it may be the shear stresses that are being increased, rather than normal stresses,” Ellsworth says.

Take the example of the Gazli gas field in Uzbekistan, thought to be located in a seismically inactive area when drilling began in 1962. As drillers removed the natural gas, the pressure in the gas field dropped from 1,030 psi in 1962 to 515 psi in 1976, then down to 218 psi in 1985. Meanwhile, three large magnitude-7.0 earthquakes struck: two in 1976 and one in 1984. Each quake had an epicenter within 12 miles of Gazli and caused a surface uplift of some 31 inches. Because the quakes occurred in Soviet-era Uzbekistan, information about their exact locations, magnitudes, and causes is not available. However, a report by the National Research Council concludes that “observations of crustal uplift and the proximity of these large earthquakes to the Gazli gas field in a previously seismically quiet region strongly suggest that they were induced by hydrocarbon extraction.” Extraction of oil is believed to have caused at least three big earthquakes in California, with magnitudes of 5.9, 6.1, and 6.5.

Some people worry that hydraulic fracturing, or fracking (wherein high-pressure fluids are used to crack through rock layers to extract oil and natural gas), will lead to an increased risk of earthquakes. However, the National Research Council report points out that there are tens of thousands of hydrofracking wells in operation today, and there has been only one case in which a “felt” tremor was linked to fracking: a magnitude-2.3 earthquake in Blackpool, England, in 2011, which didn’t cause any significant damage. Although scientists have known since the 1920s that humans can trigger earthquakes, experts caution that it’s not always easy to determine whether a specific event was induced.

Are Human Activities Making Quakes More Common?

Human activities have been linked to increased earthquake frequencies in certain areas. For instance, researchers have shown a strong correlation between the volume of fluid injected into the Rocky Mountain Arsenal well and the frequency of earthquakes in that area.

Geothermal-energy sites can also induce many earthquakes, possibly due to pressure, heat, and volume changes. The Geysers in California is the largest geothermal field in the U.S., generating 725 megawatts of electricity using steam from deep within the earth. Before The Geysers began operating in 1960, seismic activity was low in the area. Now the area experiences hundreds of earthquakes per year. Researchers have found correlations between the volume of steam production and the number of earthquakes in the region. In addition, as the area of the steam wells increased over the years, so did the spatial distribution of earthquakes.

Whether or not human activity is increasing the magnitude of earthquakes, however, is more of a gray area. When it comes to injection wells, evidence suggests that earthquake magnitudes rise along with the volume of injected wastewater, and possibly injection pressure and rate of injection as well, according to a statement from the Department of Interior.

The vast majority of earthquakes caused by The Geysers are considered to be microseismic events—too small for humans to feel. However, researchers from Lawrence Berkeley National Laboratory note that magnitude-4.0 earthquakes, which can cause minor damage, seem to be increasing in frequency.

The new study says that though earthquakes with a magnitude of 5.0 or greater are rare east of the Rockies, scientists have observed an 11-fold increase between 2008 and 2011, compared with 1976 through 2007. But the increase hasn’t been tied to human activity. “We do not really know what is causing this increase, but it is remarkable,” Abers says. “It is reasonable that at least some may be natural.”

Futuristic predictions from 1988 LA Times Magazine come true… mostly (Singularity Hub)

Written By: 

Posted: 03/28/13 8:52 AM


In 2013, a day in the life of a Los Angeles family of four is an amazing testament to technological progress and the idealistic society that can be achieved… or at least that’s what the Los Angeles Times Magazine was hoping for 25 years ago. Back in April 1988, the magazine ran a special cover story called “L.A. 2013” and presented what a typical day would be like for a family living in the city.

The author of the story, Nicole Yorkin, spoke with more than 30 experts and futurists to forecast daily life in 2013, then wove their predictions into a story akin to the “World of Tomorrow” MGM cartoons of the mid-20th century. But unlike the cartoons, which often included far-fetched technologies for humor, what’s most remarkable about the 1988 article is just how many of its predictions have actually come to pass, allowing some leeway in how accurately the future can be imagined.

For anyone considering what will happen in the next 25 years, the article is worth a read as it serves as an amazing window into how well the future can be predicted in addition to what technology is able to achieve in a short period of time.


Just consider the section on ‘smart cars,’ speculated to be “smaller, more efficient, more automated and more personalized” than the cars of 25 years ago. While experts envisioned that cars would have Transformer-like abilities to change from a sports car into a beach buggy, they also predicted that the key development in automobile technology would be “a central computer in the car that will control a number of devices.” Furthermore, cars were expected to be equipped with “electronic navigation or map systems,” that is, GPS. And although modern cars don’t have the predicted ‘sonar shield’ that would slow a car as it approached another, parking sensors are becoming common and rearview cameras may soon be required by law.

Though the article doesn’t explicitly predict the Internet and all its consequences per se, computers were implicit to some of the predictions, such as telecommuting, virtual shopping, smart cards for health monitoring, a personalized ‘home newspaper,’ and video chatting. Integrated computers were also expected in the form of smart appliances, wall-to-ceiling computer displays in classrooms, and 3D video conferencing. These technologies exist today thanks to the networked computer revolution that was amazingly only in its infancy in 1988.


‘The Ultimate Appliance’ is the mobile robot expected to be a ‘fixture’ in today’s homes.

But of all the technologies expected to be part of daily life in 2013, the biggest miss by the article comes with robots.

In fact, the mobile robot “Billy Rae” is depicted as an integral component to the household, much like Rosie The Robot was in The Jetsons. In the story, the family communicates with Billy Rae naturally as the mother reads a list of chores for cleaning the house and preparing meals. There’s even a pet canine robot named Max that helps the son learn to read and do math. The robots aren’t necessarily depicted as being super intelligent, but they were still expected to be vital, even being referred to as the “ultimate appliance.”

In recent years, great strides have been made in robotics and artificial intelligence, but we are still years away from the maid-like robot the article hoped for. We’re all familiar with cleaning robots like the Roomba, and hospitals are starting to utilize healthcare robots. Personal assistants like Siri show that we’re getting closer to the day when people and computers can communicate verbally. But bringing all these technologies together remains one of the most challenging problems to be solved, even given the high expectations and huge market potential awaiting these bots.

In light of this, it’s interesting to compare the predictions in this article to those in French illustrations drawn around 1900, which also include a fair share of robotic automation.

The piece is peppered with utopian speculation, but already on the radar were concerns about the shifting job market, increasing pollution, and the need for quality schooling, public transportation, and affordable housing, issues that have since reached or are nearing crisis levels. It’s comforting to know that many of the problems modern cities face were understood fairly well a quarter of a century ago, but it is sobering to recognize how slow technology has been, in some cases, to address them.

Perhaps the greatest lesson from reading the article is that few of the predictions are completely wrong, but the timescale was ambitious. Almost all of the technologies described will get here sooner or later. The real issue then is, what is preventing rapid innovation or broad-scale adoption of technologies?

Not surprisingly, the answers today are the same as they were 25 years ago: time and money.


[images: kla4067/Flickr, LA Times]

Everybody Knows. Climate Denialism has peaked. Now what are we going to do? (EcoEquity)

– Tom Athanasiou (toma@ecoequity.org).  April 2, 2013.

It was never going to be easy to face the ecological crisis.  Even back in the 1970s, before climate took center stage, it was clear that we the prosperous were walking far too heavily.  And that “environmentalism,” as it was called, was only going to be a small beginning.  But it was only when the climate crisis pushed fossil energy into the spotlight that the real stakes were widely recognized.  Fossil fuels are the meat and potatoes of industrial civilization, and the need to rapidly and radically reduce their emissions cut right through to the heart of the great American dream.  And the European dream.  And, inevitably, the Chinese dream as well.

Decades later, 81% of global energy is still supplied by the fossil fuels: coal, gas, and oil.[1]  And though the solar revolution is finally beginning, the day is late.  The Arctic is melting, and, soon, as each year the northern ocean lies bare beneath the summer sun, the warming will accelerate.  Moreover, our plight is becoming visible.  We have discovered, to our considerable astonishment, that most of the fossil fuel on the books of our largest corporations is “unburnable” – in the precise sense that, if we burn it, we are doomed.[2]  Not that we know what to do with this rather strange knowledge.  Also, even as China rises, it’s obvious that it’s not the last in line for the promised land.  Billions of people, all around the world, watch the wealthy on TV, and most all of them want a drink from the well of modern prosperity.  Why wouldn’t they?  Life belongs to us all, as does the Earth.

The challenge, in short, is rather daunting.

The denial of the challenge, on the other hand, always came ready-made.  As Francis Bacon said so long ago, “what a man would rather were true, he more readily believes.”  And we really did want to believe that ours was still a boundless world.  The alternative – an honest reckoning – was just too challenging.  For one thing, there was no obvious way to reconcile the Earth’s finitude with the relentless expansion of the capitalist market.  And as long as we believed in a world without limits, there was no need to see that economic stratification would again become a fatal issue.  Sure, our world was bitterly riven between haves and have-nots, but this problem, too, would fade in time.  With enough growth – the universal balm – redistribution would never be necessary.  In time, every man would be a king.

The denial had many cheerleaders.  The chemical-company flacks who derided Rachel Carson as a “hysterical woman” couldn’t have known that they were pioneering a massive trend.  Also, and of course, big money always has plenty of mouthpieces.  But it’s no secret that, during the 20th Century, the “engineering of consent” reached new levels of sophistication.  The composed image of benign scientific competence became one of its favorite tools, and somewhere along the way tobacco-industry science became a founding prototype of anti-environmental denialism.  On this front, I’m happy to say that the long and instructive history of today’s denialist pseudo-science has already been expertly deconstructed.[3]  Given this, I can safely focus on the new world, the post-Sandy world of manifest climatic disruption in which the denialists have lost any residual aura of scientific legitimacy, and have ceased to be a decisive political force.  A world in which climate denialism is increasingly seen, and increasingly ridiculed, as the gibbering of trolls.

To be clear, I’m not claiming that the denialists are going to shut up anytime soon.  Or that they’ll call off their suicidal, demoralizing campaigns.  Or that their fogs and poisons are not useful to the fossil-fuel cartel.  But the battle of the science is over, at least as far as the scientists are concerned.  And even on the street, hard denialism is looking pretty ridiculous.  To be sure, the core partisans of the right will fight on, for the win and, of course, for the money.[4]  And they’ll continue to have real weight too, for just as long as people do not believe that life beyond carbon is possible.  But for all this, their influence has peaked, and their position is vulnerable.  They are – and visibly now – agents of a mad and dangerous ideology.  They are knaves, and often they are fools.[5]

As for the rest of us, we can at least draw conclusions, and make plans.

As bad as the human prospect may be – and it is quite bad – this is not “game over.”  We have the technology we need to save ourselves, or most of it in any case; and much of it is ready to go.  Moreover, the “clean tech” revolution is going to be disruptive indeed.  There will be cascades of innovation, delivering opportunities of all kinds, all around the world.  Also, our powers of research and development are strong.  Also, and contrary to today’s vogue for austerity and “we’re broke” political posturing, we have the money to rebuild, quickly and on a global scale.  Also, we know how to cooperate, at least when we have to.  All of which is to say that we still have options.  We are not doomed.

But we are in extremely serious danger, and it is too late to pretend otherwise.  So allow me to tip my hand by noting Jorgen Randers’ new book, 2052: A Global Forecast for the Next Forty Years.[6]  Randers is a Norwegian modeler, futurist, professor, executive, and consultant who made his name as co-author of 1972’s landmark The Limits to Growth.  Limits, of course, was a global blockbuster; it remains the best-selling environmental title of all time.  Also, Limits has been relentlessly ridiculed (the early denialists cut their teeth by distorting it[7]), so it must be said that – very much contrary to the mass-produced opinions of the denialist age – its central, climate-related projections are holding up depressingly well.[8]

By 2012 (when he published 2052) Randers had decided to step away from the detached exploration of multiple scenarios that was the methodological core of Limits, and to make actual predictions.  After a lifetime of frustrated efforts, these predictions are vivid, pessimistic and bitter.  In a nutshell, Randers doesn’t expect anything beyond what he calls “progress as usual,” and while he expects it to yield a “light green” buildout (e.g., solar on a large scale) he doesn’t think it will suffice to stabilize the climate system.  Such stabilization, he grants, is still possible, but it would require concerted global action on a scale that neither he nor Dennis Meadows, the leader of the old Limits team, see on today’s horizon.  Let’s call that kind of action global emergency mobilization.  Meadows, when he peers forwards, sees instead “many decades of uncontrolled climatic disruption and extremely difficult decline.”[9]  Randers is more precise, and predicts that we will by 2052 wake to find ourselves on a dark and frightening shore, knowing full well that our planet is irrevocably “on its way towards runaway climate change in the last third of the twenty-first century.”

This is an extraordinary claim, and it requires extraordinary evidence.[10]  Such evidence, unfortunately, is readily available, but for the moment let me simply state the public secret of this whole discussion.  To wit: we (and I use this pronoun advisedly) can still avoid a global catastrophe, but it’s not at all obvious that we will do so.  What is obvious is that stabilizing the global climate is going to be very, very hard.  Which is a real problem, because we don’t do hard anymore.  Rather, when confronted with a serious problem, we just do what we can, hoping that it will be enough and trying our best not to offend the rich.  In truth, and particularly in America, we count ourselves lucky if we can manage governance at all.

This essay is about climate politics after legitimate skepticism.  Climate politics in a world where, as Leonard Cohen put it, “everybody knows.”  What does this mean?  In the first place, it means that we’ve reached the end of what might be called “environmentalism-as-usual.”  This point is widely understood and routinely granted, as when people say something like “climate is not a merely environmental problem,” but my concern is a more particular one.  As left-green writer Eddie Yuen astutely noted in a recent book on “catastrophism,” the problems of the environmental movement are to a very large degree rooted in “the pairing of overwhelmingly bleak analysis with inadequate solutions.”[11]  This is exactly right.

The climate crisis demands a “new environmentalism,” and such a thing does seem to be emerging.  Its final shape is unknowable, but one thing is certain – the environmentalism that we need will only exist when its solutions and strategies stand up to its own analyses.  The problem is that this requires us to take our “overwhelmingly bleak” analyses straight, rather than soft-pedaling them so that our “inadequate solutions” might look good.  Pessimism, after all, is closely related to realism.  It cannot just be wished away.

Soft-pedaling, alas, has long been standard practice, on both the scientific and the political sides of the climate movement.  Examples abound, but the best would have to be the IPCC itself, the U.N.’s Intergovernmental Panel on Climate Change.  The world’s premier climate-science clearinghouse, the IPCC is often attacked from the right, and has developed a shy and reticent culture.  Even more importantly, though, and far more rarely noted, is that the IPCC is conservative by definition and by design.[12]  It almost has to be conservative to do its job, which is to herd the planet’s decision makers towards scientific realism.  The wrinkle is that, at this point, this isn’t even close to being good enough, at least not in the larger scheme.  At this point, we need strategic realism as well as baseline scientific realism, and it demands a brutal honesty in which underlying scientific and political truths are clearly drawn and publicly expressed.

Yet when it comes to strategic realism, we balk.  The first impulse of the “messaging” experts is always to repeat their perennial caution that sharp portraits of the danger can be frightening, and disempowering, and thus lead to despair and passivity.  This is an excellent point, but it’s only the beginning of the truth, not the end.  The deeper problem is that the physical impacts of climate disruption – the destruction and the suffering – will continue to escalate.  “Superstorm Sandy” was bad, but the future will be much worse.  Moreover, the most severe suffering will be far away, and easy for the good citizens of the wealthy world to ignore.  Imagine, for example, a major failure of the Indian Monsoon, and a subsequent South Asian famine.  Imagine it against a drumbeat background in which food is becoming progressively more expensive.  Imagine the permanence of such droughts, and increasing evidence of tipping points on the horizon, and a world in which ever more scientists take it upon themselves to deliver desperate warnings.  The bottom line will not be the importance of communications strategies, but rather the manifest reality, no longer distant and abstract, and the certain knowledge that we are in deep trouble.  And this is where the dangers of soft-pedaling lie.  For as people come to see the scale of the danger, and then look about for commensurate strategies and responses, the question will be whether such strategies are available, and whether they are known, and whether they are plausible.  If they’re not, then we’ll all go, together, down the road “from aware to despair.”

Absent the public sense of a future in which human resourcefulness and cooperation can make a decisive difference, we assuredly face an even more difficult future in which denial fades into a sense of pervasive hopelessness.  The last third of the century (when Randers is predicting “runaway climate change”) is not so very far away.  Which is to say that, as denialism collapses – and it will – the challenge of working out a large and plausible response to the climate crisis will become overwhelmingly important.  If we cannot imagine such a response, and explain how it would actually work, then people will draw their own conclusions.  And, so far, it seems that we cannot.  Even those of us who are now climate full-timers don’t have a shared vision, not in any meaningful detail, nor do we have a common sense of the strategic initiatives that could make such a vision cohere.

The larger landscape is even worse.  For though many scientists are steeling themselves to speak, the elites themselves are still stiff and timid, and show few signs of rising to the occasion.  Each month, it seems, there’s another major report on the approaching crisis – the World Bank, the National Intelligence Council, and the International Energy Agency have all recently made hair-raising contributions – but they never quite get around to the really important questions.  How should we contrive the necessary global mobilization?  What conditions are needed to absolutely maximize the speed of the clean-tech revolution?  By what strategy will we actually manage to keep the fossil fuels in the ground?  What kind of international treaties are necessary, and how shall we establish them?  What would a fast-enough global transition cost, and how shall we pay for it?  What about all those who are forced to retreat from rising waters and drying lands?  How shall they live, and where?  How shall we talk about rights and responsibilities in the Greenhouse Century?  And what about the poor?  How shall they find futures in a climate-constrained world?  Can we even imagine a world in which they do?

In the face of such questions, you have a choice.  You can conclude that we’ll just have to do the best we can, and then you can have a drink.  Or maybe two.  Or you can conclude that, despite all evidence to the contrary, enough of us will soon awaken to reality.  What’s certain is that, all around us, there is a vast potentiality – for reinvention, for resistance, for redistribution, and for renewal of all kinds – and that it could at any time snap into solidity.  And into action.

Forget about “hope.”  What we need now is intention.

***

About a decade ago, in San Francisco, I was on a PBS talk show with, among others, Myron Ebell, chief of climate propaganda at the Competitive Enterprise Institute.  Ebell is an aggressive professional, and given the host’s commitment to phony balance he was easily able to frame the conversation.[13]  The result was a travesty, but not an entirely wasted hour, at least not for me.  It was instructive to speak, tentatively, of the need for global climate justice, and to hear, in response, that I was a non-governmental fraud who was only in it for the money.  Moreover, as the hour wore on, I came to appreciate the brutal simplicity of the denialist strategy.  The whole point is to suck the oxygen out of the room, to weave such a tangle of confusionism and pseudo-debate that the Really Big Question – What is to be done? – becomes impossible to even ask, let alone discuss.

When Superstorm Sandy slammed into the New York City region, Ebell’s style of hard denialism took a body blow, though obviously it has not dropped finally to the mat.  Had it done so, the Big Question, in all its many forms, would be buzzing constantly around us.  Clearly, that great day has not yet come.  Still, back in November of 2012, when Bloomberg Businessweek blared “It’s Global Warming, Stupid” from its front cover, this was widely welcomed as an overdue milestone.  It may even be that Michael Tobis, the editor of the excellent Planet 3.0, will prove correct in his long-standing, half-facetious prediction that 2015 will be the date when “the Wall Street Journal will acknowledge the indisputable and apparent fact of anthropogenic climate change; the year in which it will simply be ridiculous to deny it.”[14]  Or maybe not.  Maybe that day will never come.  Maybe Ebell’s style of well-funded, front-group denialism will live on, zombie-like, forever.  Or maybe (and this is my personal prediction) hard climate denialism will soon go the way of creationism and far-right Christianity, becoming a kind of political lifestyle choice, one that’s dangerous but contained.  One that’s ultimately more dangerous to the right than it is to the reality-based community.

If so, then at some point we’re going to have to ask ourselves if we’ve been so long distracted by the hard denialists that we’ve missed the parallel danger of a “soft denialism.”  By which I mean the denialism of a world in which, though the dangers of climate change are simply too ridiculous to deny, they still – somehow – are not taken to imply courage, and reckoning, and large-scale mobilization.  This is a long story, but the point is that, now that the Big Question is finally on the table, we’re going to have to answer it.  Which is to say that we’re going to have to face the many ways in which political timidity and small-bore realism have trained us to calibrate our sense of what must be done by our sense of what can be done, which these days is inadequate by definition.

And not just because of the denialists.

George Orwell once said that “To see what is in front of one’s nose needs a constant struggle.”[15]  As we hurtle forward, this struggle will rage as never before.  The Big Question, after all, changes everything.  Another way of saying this is that our futures will be shaped by the effort to avoid a full-on global climate catastrophe.  Despite all the rest of the geo-political and geo-economic commotion that will mark the 21st Century (and there’ll be plenty) it will be most fundamentally the Greenhouse Century.  We know this now, if we care to, though still only in preliminary outline.  The details, inevitably, will surprise us all.

The core problem, of course, will be “ambition” – action on the scale that’s actually necessary, rather than the scale that is or appears to be possible.  And here, the legacies of the denialist age – the long-ingrained habits of soft-pedaling and strained optimism – will weigh heavily.  Consider the quasi-official global goal (codified, for example, in the Copenhagen Accord) to hold total planetary warming to 2°C (Earth surface average) above pre-industrial levels.  This is the so-called “2°C target.”  What are we to do with it in the post-denialist age?  Let me count the complications: One, all sorts of Very Important People are now telling us it’s going to be all but impossible to avoid overshooting 2°C.[16]  Two, in so doing, they are making a political and not a scientific judgment, though they’re not always clear on this point.  (It’s probably still technically possible to hold the 2°C line – if we’re not too unlucky – though it wouldn’t be easy under the best of circumstances.)[17]  Three, the 2°C line, which was once taken to be reasonably safe, is now widely seen (at least among the scientists) to mark the approximate point of transition from “dangerous” to “extremely dangerous,” and possibly to altogether unmanageable levels of warming.[18]  Four, and finally, it’s now widely recognized that any future in which we approach the 2°C line (which we will do) is one in which we also have a real possibility of pushing the average global temperature up by 3°C, and if this were to come to pass we’d be playing a very high-stakes game indeed, one in which uncontrolled positive feedbacks and worst-case scenarios were surrounding us on every side.

The bottom line is today as it was decades ago.  Greenhouse-gas emissions were increasing then, and they are increasing now.  In late 2012, the authoritative Global Carbon Project reported that, since 1990, they had risen by an astonishing 58 percent.[19]  The climate system has unsurprisingly responded with storms, droughts, ice-melt, conflagrations and floods.  The weather has become “extreme,” and may finally be getting our attention.  In Australia, according to the acute Mark Thomson of the Institute for Backyard Studies in Adelaide, the crushing heatwave of early 2013 even pushed aside “the idiot commentariat” and cleared the path for a bit of 11th-hour optimism: “Another year of this trend will shift public opinion wholesale.  We’re used to this sort of temperature now and then and even take a perverse pride in dealing with it, but there seems to be a subtle shift in mood that ‘This Could Be Serious.’”  Let’s hope he’s right.  Let’s hope, too, that the mood shift that swept through America after Sandy also lasts, and leads us to conclude that ‘This Could Be Serious.’  Not that this alone would be enough to support a real mobilization – the “moral equivalent of war” that we need – but it would be something.  It might even lead us to wonder about our future, and about the influence of money and power on our lives, and to ask how serious things will have to get before it becomes possible to imagine a meaningful change of direction.

The wrinkle is that, before we can advocate for a meaningful change of direction, we have to have one we believe in, one that we’re willing to explain in global terms that actually scale to the problem.  None of which is going to be easy, given that we’re fast approaching a point where only tales of existential danger ring true.  (cf. the zombie apocalypse.)  The Arctic ice, as noted above, offers an excellent marker.  In fact, the first famous photos of Earth from space – the “blue marble” photos taken in 1972 by the crew of the Apollo 17 – allow us to anchor our predicament in time and in memory.  For these are photos of an old Earth now passed away; they must be, because they show great expanses of ice that are now nowhere to be found.  By August of 2012 the Arctic Sea’s ice cover had declined by 40%,[20] a melt that’s easily large enough to be visible from space.  Moreover, beneath the surface, ice volume is dropping even more precipitously.  The polar researchers who are now feverishly evaluating the great melting haven’t yet pushed the entire scientific community to the edge of despair, though they have managed to inspire a great deal of dark muttering about positive feedbacks and tipping points.  Soon, it seems, that muttering will become louder.  Perhaps as early as 2015, the Arctic Ocean will become virtually ice free for the first time in recorded history.[21]  When it does, the solar absorptivity of the Arctic waters will increase, and shift the planetary heat balance by a surprisingly large amount, and by so doing increase the rate of planetary warming.  And this, of course, will not be the end of it.  The feedbacks will continue.  The cycles will go on.
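The scale of the ice-albedo feedback described above can be made concrete with a back-of-the-envelope calculation.  This is only an illustrative sketch: the albedo values are typical textbook figures, and the lost-ice area and insolation are round numbers assumed for the example, not figures from this essay.

```python
# Rough illustration of the ice-albedo feedback.
# All numbers are illustrative assumptions, not measured values.

ALBEDO_ICE = 0.6      # bare sea ice reflects roughly 60% of sunlight
ALBEDO_OCEAN = 0.06   # open ocean reflects only about 6%

def extra_absorbed_power(area_m2: float, insolation_w_m2: float) -> float:
    """Extra solar power (watts) absorbed when ice is replaced by open water."""
    return area_m2 * insolation_w_m2 * (ALBEDO_ICE - ALBEDO_OCEAN)

# Assume ~2.5 million km^2 of lost summer ice cover, and an average
# high-latitude summer insolation of ~150 W/m^2.
area = 2.5e6 * 1e6   # square metres
insolation = 150.0   # W/m^2

extra_w = extra_absorbed_power(area, insolation)
print(f"Extra absorbed power: roughly {extra_w / 1e12:.0f} terawatts")
```

Even with these crude assumptions, the answer comes out in the hundreds of terawatts, which is why open Arctic water shifts the planetary heat balance "by a surprisingly large amount."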

Should we remain silent about such matters, for fear of inflaming the “idiot commentariat”?  It’s absurd to even ask.  The suffering is already high, and if you know the science, you also know that the real surprise would be an absence of positive feedbacks.  The ice melt, the methane plumes, the drying of the rainforests – they’re all real.  Which is to say that there are obviously tipping points before us, though we do not and cannot know how much time will pass before they force themselves upon our attention.  The real question is what we must do if we would talk of them in good earnest, while at the same time speaking, without despair and effectively, about the human future.


[1] Jorgen Randers, 2052: A Global Forecast for the Next Forty Years, Chelsea Green, 2012, page 99.

[2] Begin at the Carbon Tracker Initiative’s website.  http://www.carbontracker.org/

[3] Two excellent examples: Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, Bloomsbury Press, 2011; and Chris Mooney, The Republican War on Science, Basic Books, 2006.

[4] See, for example, Suzanne Goldenberg, “Secret funding helped build vast network of climate denial thinktanks,” February 14, 2013, The Guardian.

[5] “Lord Monckton,” in particular, is fantastic.  See http://www.youtube.com/watch?v=w833cAs9EN0

[6] Randers, 2012.  See also Randers’ essay and video at the University of Cambridge 2013 “State of Sustainability Leadership,” at http://www.cpsl.cam.ac.uk/About-Us/What-is-Sustainability-Leadership/The-State-of-Sustainability-Leadership.aspx

[7] Ugo Bardi, in The Limits to Growth Revisited (Springer Briefs, 2011) offers this summary:

“If, at the beginning, the debate on LTG had seemed to be balanced, gradually the general attitude on the study became more negative. It tilted decisively against the study when, in 1989, Ronald Bailey published a paper in “Forbes” where he accused the authors of having predicted that the world’s economy should have already run out of some vital mineral commodities whereas that had not, obviously, occurred.

Bailey’s statement was only the result of a flawed reading of the data in a single table of the 1972 edition of LTG. In reality, none of the several scenarios presented in the book showed that the world would be running out of any important commodity before the end of the twentieth century and not even of the twenty-first. However, the concept of the “mistakes of the Club of Rome” caught on. With the 1990s, it became commonplace to state that LTG had been a mistake if not a joke designed to tease the public, or even an attempt to force humankind into a planet-wide dictatorship, as it had been claimed in some earlier appraisals (Golub and Townsend 1977; Larouche 1983). By the end of the twentieth century, the victory of the critics of LTG seemed to be complete. But the debate was far from being settled.”

[8] See, for example, Graham Turner, “A Comparison of The Limits to Growth with Thirty Years of Reality,” Global Environmental Change, Volume 18, Issue 3, August 2008, Pages 397–411.  An unprotected copy (without the graphics) can be downloaded at www.csiro.au/files/files/plje.pdf.

[9] In late 2012, Dennis Meadows said that “In the early 1970s, it was possible to believe that maybe we could make the necessary changes.  But now it is too late.  We are entering a period of many decades of uncontrolled climatic disruption and extremely difficult decline.”  See Christian Parenti, “The Limits to Growth’: A Book That Launched a Movement,” The Nation, December 24, 2012.

[11] Eddie Yuen, “The Politics of Failure Have Failed: The Environmental Movement and Catastrophism,” in Catastrophism: The Apocalyptic Politics of Collapse and Rebirth, Sasha Lilley, David McNally, Eddie Yuen, James Davis, with a foreword by Doug Henwood. PM Press 2012.  Yuen’s whole line is “the main reasons that [it] has not led to more dynamic social movements; these include catastrophe fatigue, the paralyzing effects of fear; the pairing of overwhelmingly bleak analysis with inadequate solutions, and a misunderstanding of the process of politicization.” 

[12] See Glenn Scherer, “Special Report: IPCC, assessing climate risks, consistently underestimates,” The Daily Climate, December 6, 2012.   More formally (and more interestingly) see Brysse, Oreskes, O’Reilly, and Oppenheimer, “Climate change prediction: Erring on the side of least drama?,” Global Environmental Change 23 (2013), 327-337.

[13] KQED-FM, Forum, July 22, 2003.

[14] Michael Tobis, editor of Planet 3.0, is amusing on this point.  He notes that “many data-driven climate skeptics are reassessing the issue,” that “In 1996 I defined the turning point of the discussion about climate science (the point where we could actually start talking about policy) as the date when the Wall Street Journal would acknowledge the indisputable and apparent fact of anthropogenic climate change; the year in which it would simply be ridiculous to deny it.  My prediction was that this would happen around 2015… I’m not sure the WSJ has actually accepted reality yet.  It’s just starting to squint in its general direction.  2015 still looks like a good bet.”  See http://planet3.org/2012/08/07/is-the-tide-turning/

[15] The Collected Essays, Journalism and Letters of George Orwell: In Front of Your Nose, 1945-1950, Sonia Orwell and Ian Angus, Editors / Paperback / Harcourt Brace Jovanovich, 1968, p. 125.

[16] See, for example, Fatih Birol and Nicholas Stern, “Urgent steps to stop the climate door closing,” The Financial Times, March 9, 2011.  And see Sir Robert Watson’s Union Frontiers of Geophysics Lecture at the 2012 meeting of the American Geophysical Union, at http://fallmeeting.agu.org/2012/events/union-frontiers-of-geophysics-lecture-professor-sir-bob-watson-cmg-frs-chief-scientific-adviser-to-defra/

[17] I just wrote “probably still technically possible.”  I could have written “Excluding the small probability of a very bad case, and the even smaller probability of a very good case, it’s probably still technically possible to hold the 2°C line, though it wouldn’t be easy.”  This, however, is a pretty ugly sentence.  I could also have written “Unless we’re unlucky, and the climate sensitivity turns out to be on the high side of the expected range, it’s still technically possible to hold the 2°C line, though it wouldn’t be easy, unless we’re very lucky, and the climate sensitivity turns out to be on the low side.”  Saying something like this, though, kind of puts the cart before the horse, since I haven’t said anything about “climate sensitivity,” or about how the scientists think about probability – and of course it’s even uglier.  The point, at least for now, is that climate projections are probabilistic by nature, which does not mean that they are merely “uncertain.”  We know a lot about the probabilities.

[18] See Kevin Anderson, a former director of Britain’s Tyndall Center, who has been unusually frank on this point.  His views are clearly laid out in a (non-peer-reviewed) essay published by the Dag Hammarskjold Foundation in Sweden.  See “Climate change going beyond dangerous – Brutal numbers and tenuous hope” in Development Dialog #61, September 2012, available at http://www.dhf.uu.se/wordpress/wp-content/uploads/2012/10/dd61_art2.pdf.  For a peer-reviewed paper, see Anderson and Bows, “Beyond ‘dangerous’ climate change: emission scenarios for a new world.”  Philosophical Transactions of The Royal Society, (2011) 369, 20-44 and for a lecture, see “Are climate scientists the most dangerous climate skeptics?” a Tyndall Centre video lecture (September 2010) at http://www.tyndall.ac.uk/audio/are-climate-scientist-most-dangerous-climate-sceptics.

[19] Glen P. Peters et al., “The challenge to keep global warming below 2°C,” Nature Climate Change 3, 4–6 (2013), doi:10.1038/nclimate1783; published online December 2, 2012.  This figure might actually be revised upward, as 2012 saw the second-largest annual concentration increase on record (http://climatedesk.org/2013/03/large-rise-in-co2-emissions-sounds-climate-change-alarm/)

[20] The story of the photos is on Wikipedia – see “blue marble.”  For the latest on the Arctic ice, see the “Arctic Sea Ice News and Analysis” page maintained by the National Snow and Ice Data Center — http://nsidc.org/arcticseaicenews/

[21] Climate Progress is covering the “Arctic Death Spiral” in detail.  See for example Joe Romm, “NOAA: Climate Change Driving Arctic Into A ‘New State’ With Rapid Ice Loss And Record Permafrost Warming,” Climate Progress, Dec 6, 2012.  Give yourself a few hours and follow the links.

Climate Maverick to Retire From NASA (N.Y.Times)

Michael Nagle for The New York Times. James E. Hansen of NASA, retiring this week, reflected in a window at his farm in Pennsylvania.

By 

Published: April 1, 2013

His departure, after a 46-year career at the space agency’s Goddard Institute for Space Studies in Manhattan, will deprive federally sponsored climate research of its best-known public figure.

At the same time, retirement will allow Dr. Hansen to press his cause in court. He plans to take a more active role in lawsuits challenging the federal and state governments over their failure to limit emissions, for instance, as well as in fighting the development in Canada of a particularly dirty form of oil extracted from tar sands.

“As a government employee, you can’t testify against the government,” he said in an interview.

Dr. Hansen had already become an activist in recent years, taking vacation time from NASA to appear at climate protests and allowing himself to be arrested or cited a half-dozen times.

But those activities, going well beyond the usual role of government scientists, had raised eyebrows at NASA headquarters in Washington. “It was becoming clear that there were people in NASA who would be much happier if the ‘sideshow’ would exit,” Dr. Hansen said in an e-mail.

At 72, he said, he feels a moral obligation to step up his activism in his remaining years.

“If we burn even a substantial fraction of the fossil fuels, we guarantee there’s going to be unstoppable changes” in the climate of the earth, he said. “We’re going to leave a situation for young people and future generations that they may have no way to deal with.”

His departure, on Wednesday, will end a career of nearly half a century working not just for a single agency but also in a single building, on the edge of the Columbia University campus.

From that perch, seven floors above the diner made famous by “Seinfeld,” Dr. Hansen battled the White House, testified dozens of times in Congress, commanded some of the world’s most powerful computers and pleaded with ordinary citizens to grasp the basics of a complex science.

His warnings and his scientific papers have drawn frequent attack from climate-change skeptics, to whom he gives no quarter. But Dr. Hansen is a maverick, just as likely to vex his allies in the environmental movement. He supports nuclear power and has taken stands that sometimes undercut their political strategy in Washington.

In the interview and in subsequent e-mails, Dr. Hansen made it clear that his new independence would allow him to take steps he could not have taken as a government employee. He plans to lobby European leaders — who are among the most concerned about climate change — to impose a tax on oil derived from tar sands. Its extraction results in greater greenhouse emissions than conventional oil.

Dr. Hansen’s activism of recent years dismayed some of his scientific colleagues, who felt that it backfired by allowing climate skeptics to question his objectivity. But others expressed admiration for his willingness to risk his career for his convictions.

Initially, Dr. Hansen plans to work out of a converted barn on his farm in Pennsylvania. He has not ruled out setting up a small institute or taking an academic appointment.

He said he would continue publishing scientific papers, but he will no longer command the computer time and other NASA resources that allowed him to track the earth’s rising temperatures and forecast the long-run implications.

Dr. Hansen, raised in small-town Iowa, began his career studying Venus, not the earth. But as concern arose in the 1970s about the effects of human emissions of greenhouse gases, he switched gears, publishing pioneering scientific papers.

His initial estimate of the earth’s sensitivity to greenhouse gases was somewhat on the high side, later work showed. But he was among the first scientists to identify the many ways the planet is likely to respond to rising temperatures and to show how those effects would reinforce one another to produce immense changes in the climate and environment, including a sea level rise that could ultimately flood many of the world’s major cities.

“He’s done the most important science on the most important question that there ever was,” said Bill McKibben, a climate activist who has worked closely with Dr. Hansen.

Around the time Dr. Hansen switched his research focus, in the 1970s, a sharp rise in global temperatures began. He labored in obscurity over the next decade, but on a blistering June day in 1988 he was called before a Congressional committee and testified that human-induced global warming had begun.

Speaking to reporters afterward in his flat Midwestern accent, he uttered a sentence that would appear in news reports across the land: “It is time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here.”

Given the natural variability of climate, it was a bold claim to make after only a decade of rising temperatures, and to this day some of his colleagues do not think he had the evidence.

Yet subsequent events bore him out. Since the day he spoke, not a single month’s temperatures have fallen below the 20th-century average for that month. Half the world’s population is now too young to have lived through the last colder-than-average month, February 1985.

In worldwide temperature records going back to 1880, the 19 hottest years have all occurred since his testimony.

Again and again, Dr. Hansen made predictions that were ahead of the rest of the scientific community and, arguably, a bit ahead of the evidence.

“Jim has a real track record of being right before you can actually prove he’s right with statistics,” said Raymond T. Pierrehumbert, a planetary scientist at the University of Chicago.

Dr. Hansen’s record has by no means been spotless. Even some of his allies consider him prone to rhetorical excess and to occasional scientific error.

He has repeatedly called for trying the most vociferous climate-change deniers for “crimes against humanity.” And in recent years, he stated that excessive carbon dioxide emissions might eventually lead to a runaway greenhouse effect that would boil the oceans and render earth uninhabitable, much like Venus.

His colleagues pointed out that this had not happened even during exceedingly warm episodes in the earth’s ancient past. “I have huge respect for Jim, but in this particular case, he overstated the risk,” said Daniel P. Schrag, a geochemist and the head of Harvard’s Center for the Environment, who is nonetheless deeply worried about climate change.

Climate skeptics have routinely accused Dr. Hansen of alarmism. “He consistently exaggerates all the dangers,” Freeman Dyson, the famed physicist and climate contrarian, told The New York Times Magazine in 2009.

Perhaps the biggest fight of Dr. Hansen’s career broke out in late 2005, when a young political appointee in the administration of George W. Bush began exercising control over Dr. Hansen’s statements and his access to journalists. Dr. Hansen took the fight public and the administration backed down.

For all his battles with conservatives, however, he has also been hard on environmentalists. He was a harsh critic of a failed climate bill they supported in 2009, on the grounds that it would have sent billions into the federal government’s coffers without limiting emissions effectively.

Dr. Hansen agrees that a price is needed on carbon dioxide emissions, but he wants the money returned to the public in the form of rebates on tax bills. “It needs to be done on the basis of conservative principles — not one dime to make the government bigger,” said Dr. Hansen, who is registered as a political independent.

In the absence of such a broad policy, Dr. Hansen has been lending his support to fights against individual fossil fuel projects. Students lured him to a coal protest in 2009, and he was arrested for the first time. That fall he was cited again after sleeping overnight in a tent on the Boston Common with students trying to pressure Massachusetts into passing climate legislation.

“It was just humbling to have that solidarity and support from this leader, this lion among men,” said Craig S. Altemose, an organizer of the Boston protest.

Dr. Hansen says he senses the beginnings of a mass movement on climate change, led by young people. Once he finishes his final papers as a NASA employee, he intends to give it his full support.

“At my age,” he said, “I am not worried about having an arrest record.”

UN chief calls for urgency in setting global climate goals (G1/Globo Natureza)

JC e-mail 4699, April 5, 2013.

Ban Ki-moon said it will be too late if nothing is done by 2015, the deadline for a global agreement to reduce greenhouse-gas emissions.

UN Secretary-General Ban Ki-moon declared on Wednesday (April 3) in Monaco that it will be “too late” to save the environment if binding climate measures are not adopted by 2015.

“Words have not been followed by actions. Soon it will be too late. Our consumption patterns are incompatible with the health of the planet,” Ban Ki-moon said before an audience of dignitaries. “We must act now if we want the planet to remain habitable for its nine billion people in 2050,” he argued.

He was referring to the creation of a new treaty (or protocol), scheduled to be signed in 2015 and to take effect in 2020, when the Kyoto Protocol expires. All countries would then have to meet targets to reduce greenhouse gases and contain the rise in the planet’s temperature.

Of the ninety goals related to environmental issues adopted by the international community over the past 20 years, only four have seen significant progress, the UN Secretary-General lamented.

Environmental problems

According to the news agency Agence France-Presse, he highlighted as current problems the decline of biodiversity, the depletion of fisheries, the increasing acidity of the oceans, and rising greenhouse-gas emissions. “We have to accelerate our momentum. We need to develop what we have been testing in test tubes for 40 years. To do that, we must adopt effective incentive measures, and above all put a price on carbon emissions,” he declared.

“We must also adopt, by 2015, a universal and legally binding instrument on climate, so that all countries take additional measures to reduce the effects of climate change,” the UN Secretary-General urged.

Tributes in Monaco

Ban also paid tribute to the Prince Albert II of Monaco Foundation, which “is respected throughout the world for its work on biodiversity, water, and the fight against climate change.”

“At a moment when the land and the oceans are under unprecedented pressure, particularly from global population growth and climate change, it is our responsibility to act decisively to prepare for the future,” said Prince Albert of Monaco in turn.

For the small principality, Ban Ki-moon’s official visit marks the 20th anniversary of Monaco’s entry into the United Nations, on May 28, 1993. “I fondly remember the pride he felt at that recognition,” the sovereign said, referring to his father, Prince Rainier III.

Ban Ki-moon, who began a European tour this week with a visit to the small principalities of San Marino and Andorra, will also visit Spain and the Netherlands. He will meet with Monaco’s head of government on Thursday (April 4).

You Don’t ‘Own’ Your Own Genes: Researchers Raise Alarm About Loss of Individual ‘Genomic Liberty’ Due to Gene Patents (Science Daily)

Mar. 25, 2013 — Humans don’t “own” their own genes, the cellular chemicals that define who they are and what diseases they might be at risk for. Through more than 40,000 patents on DNA molecules, companies have essentially claimed the entire human genome for profit, report two researchers who analyzed the patents on human DNA.

In a new study, researchers report that through more than 40,000 patents on DNA molecules, companies have essentially claimed the entire human genome for profit. (Credit: © X n’ Y hate Z / Fotolia)

Their study, published March 25 in the journal Genome Medicine, raises an alarm about the loss of individual “genomic liberty.”

In their new analysis, the research team examined two types of patented DNA sequences: long and short fragments. They discovered that 41 percent of the human genome is covered by longer DNA patents that often cover whole genes. They also found that, because many genes share similar sequences within their genetic structure, if all of the “short sequence” patents were allowed in aggregate, they could account for 100 percent of the genome.

Furthermore, the study’s lead author, Dr. Christopher E. Mason of Weill Cornell Medical College, and the study’s co-author, Dr. Jeffrey Rosenfeld, an assistant professor of medicine at the University of Medicine & Dentistry of New Jersey and a member of the High Performance and Research Computing Group, found that short sequences from patents also cover virtually the entire genome — even outside of genes.

“If these patents are enforced, our genomic liberty is lost,” says Dr. Mason, an assistant professor of physiology and biophysics and computational genomics in computational biomedicine at the Institute for Computational Biomedicine at Weill Cornell. “Just as we enter the era of personalized medicine, we are ironically living in the most restrictive age of genomics. You have to ask, how is it possible that my doctor cannot look at my DNA without being concerned about patent infringement?”

The U.S. Supreme Court will review genomic patent rights in an upcoming hearing on April 15. At issue is the right of a molecular diagnostic company to claim patents not only on two key breast and ovarian cancer genes — BRCA1 and BRCA2 — but also on any small sequence of code within BRCA1, including a striking patent for only 15 nucleotides.

In its study, the research team matched small sequences within BRCA1 to other genes and found that just this one molecular diagnostic company’s patents also covered at least 689 other human genes — most of which have nothing to do with breast or ovarian cancer; rather, its patents cover 19 other cancers as well as genes involved in brain development and heart functioning.

“This means if the Supreme Court upholds the current scope of the patents, no physician or researcher can study the DNA of these genes from their patients, and no diagnostic test or drug can be developed based on any of these genes without infringing a patent,” says Dr. Mason.

One Patented Sequence Matched More Than 91 Percent of Human Genes

Dr. Mason undertook the study because he realized that his research into brain and cancer disorders inevitably involved studying genes that were protected by patents.

Under U.S. patent law, genes can be patented by those researchers, either at companies or institutions, who are first to find a gene that promises a useful application, such as for a diagnostic test. For example, the patents a company received in the 1990s on BRCA1 and BRCA2 enable it to offer a diagnostic test to women who may have, or may be at risk for, breast or ovarian cancer due to mutations in one or both of these genes. Women and their doctors have no choice but to use the services of the patents’ owner, which charges $3,000 per test, “whereas any of the hundreds of clinical laboratories around the country could perform such a test for possibly much less,” says Dr. Mason.

The impact of these patents is equally onerous on research, Dr. Mason adds.

“Almost every day, I come across a gene that is patented — a situation that is common for every geneticist in every lab,” says Dr. Mason.

Dr. Mason and his research partner sought to determine how many other genes may be impacted by gene patents, as well as the overall landscape of intellectual property on the human genome.

To conduct the study, Dr. Mason and Dr. Rosenfeld examined the structure of the human genome in the context of two types of patented sequences: short and long fragments of DNA. They used matches to known genes that were confirmed to be present in patent claims, ranging from as few as 15 nucleotides (the building blocks of DNA) to the full length of all patented DNA fragments.

Before examining the patented sequences, the researchers first calculated how many genes shared common segments of 15 nucleotides (15mers), and found that every gene in the human genome matched at least one other gene in this respect, with anywhere from five to as many as 7,688 gene matches. They also discovered that 99.999 percent of 15mers in the human genome are repeated at least twice.

“This demonstrates that short patent sequences are extremely non-specific and that a 15mer claim from one gene will always cross-match and patent a portion of another gene as well,” says Dr. Mason. “This means it is actually impossible to have a 15mer patent for just one gene.”
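The cross-matching the researchers describe is, at bottom, a substring-overlap computation. The sketch below illustrates the idea on toy sequences of our own invention (real genes are thousands of bases long, and the study used actual patent claims, not these made-up strings): any 15mer shared by two genes means a patent claim on that 15mer covers a piece of both.

```python
from itertools import product

def kmers(seq, k=15):
    """Return the set of all k-nucleotide substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Toy "genes" (hypothetical sequences, far shorter than real genes).
genes = {
    "geneA": "ATGGCGTACGTTAGCATCGATCGTACGATC",
    "geneB": "TTAGCATCGATCGTACGATCGGCTAGCTAA",
    "geneC": "GGCCTTAAGGCCTTAAGGCCTTAAGGCCTT",
}

# For each pair of genes, count shared 15mers: a patent claim on any
# shared 15mer would cover a portion of both genes at once.
sets = {name: kmers(seq) for name, seq in genes.items()}
for a, b in product(sets, repeat=2):
    if a < b and sets[a] & sets[b]:
        print(f"{a} and {b} share {len(sets[a] & sets[b])} 15mers")
# → geneA and geneB share 6 15mers
```

Here geneA and geneB overlap in a 20-base stretch, so six distinct 15-base windows fall inside it; a 15mer claim drawn from either gene would "patent" part of the other, exactly the non-specificity Dr. Mason describes.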

Next, researchers examined the total sequence space in human genes covered by 15mers in current patent claims. They found 58 patents whose claims covered at least 10 percent of all bases of all human genes. The broadest patent claimed sequences that matched 91.5 percent of human genes. Then, when they took existing gene patents and matched patented 15mers to known genes, they discovered that 100 percent of known genes are patented.

“There is a real controversy regarding gene ownership due to the overlap of many competing patent claims. It is unclear who really owns the rights to any gene,” says Dr. Rosenfeld. “While the Supreme Court is hearing one case concerning just the BRCA1 patent, there are also many other patents whose claims would cover those same genes. Do we need to go through every gene to look at who made the first claim to that gene, even if only one small part? If we resort to this rule, then the first patents to be granted for any DNA will have a vast claim over portions of the human genome.”

A further issue of concern is that patents on DNA can readily cross species boundaries. A company can have a patent that they received for cow breeding and have that patent cover a large percentage of human genes. Indeed, the researchers found that one company owns the rights to 84 percent of all human genes for a patent they received for cow breeding. “It seems silly that a patent designed to study cow genetics also claims the majority of human genes,” says Dr. Rosenfeld.

Finally, they also examined the impact of longer claimed DNA sequences from existing gene patents, which ranged from a few dozen bases up to thousands of bases of DNA, and found that these long claimed sequences matched 41 percent (9,361) of human genes. Their analysis concluded that almost all clinically relevant genes have already been patented and that, once short-sequence patents are included, every human gene is patented many times over.

“This is, so to speak, patently ridiculous,” adds Dr. Mason. “If patent claims that use these small DNA sequences are upheld, it could potentially create a situation where a piece of every gene in the human genome is patented by a phalanx of competing patents.”

In their discussion, the researchers argue that the U.S. Supreme Court now has a chance to shape the balance between the medical good versus inventor protection, adding that, in their opinion, the court should limit the patenting of existing nucleotide sequences, due to their broad scope and non-specificity in the human genome.

“I am extremely pro-patent, but I simply believe that people should not be able to patent a product of nature,” Dr. Mason says. “Moreover, I believe that individuals have an innate right to their own genome, or to allow their doctor to look at that genome, just like the lungs or kidneys. Failure to resolve these ambiguities perpetuates a direct threat to genomic liberty, or the right to one’s own DNA.”

Journal Reference:

  1. Jeffrey Rosenfeld and Christopher E. Mason. Pervasive sequence patents cover the entire human genome. Genome Medicine, 2013 (in press). DOI: 10.1186/gm431

Survey Shows Many Republicans Feel America Should Take Steps to Address Climate Change (Science Daily)

Apr. 2, 2013 — In a recent survey of Republicans and Republican-leaning Independents conducted by the Center for Climate Change Communication (4C) at George Mason University, a majority of respondents (62 percent) said they feel America should take steps to address climate change. More than three out of four survey respondents (77 percent) said the United States should use more renewable energy sources, and of those, most believe that this change should begin immediately.

The national survey, conducted in January 2013, asked more than 700 people who self-identified as Republicans and Republican-leaning Independents about energy and climate change.

“Over the past few years, our surveys have shown that a growing number of Republicans want to see Congress do more to address climate change,” said Mason professor Edward Maibach, director of 4C. “In this survey, we asked a broader set of questions to see if we could better understand how Republicans, and Independents who have a tendency to vote Republican, think about America’s energy and climate change situation.”

Other highlights from the survey include the following:

  • Republicans and Republican-leaning Independents prefer clean energy as the basis of America’s energy future and say the benefits of clean energy, such as energy independence (66 percent), saving resources for our children and grandchildren (57 percent), and providing a better life for our children and grandchildren (56 percent), outweigh the costs, such as more government regulation (42 percent) or higher energy prices (31 percent).
  • By a margin of 2 to 1, respondents say America should take action to reduce its fossil fuel use.
  • Only one third of respondents agree with the Republican Party’s position on climate change, while about half agree with the party’s position on how to meet America’s energy needs.
  • A large majority of respondents say their elected representatives are unresponsive to their views about climate change.

“The findings from this survey suggest there is considerable support among conservatives for accelerating the transition away from fossil fuels and toward clean renewable forms of energy, and for taking steps to address climate change,” said Maibach. “Perhaps the most surprising finding, however, is how few of our survey respondents agreed with the Republican Party’s current position on climate change.”

The report can be downloaded at: http://climatechangecommunication.org

The report is based on findings from a nationally representative survey conducted by the George Mason University Center for Climate Change Communication. A total of 726 adults (18+) were interviewed between January 12th and January 27th, 2013. The average margin of error for the survey is +/- 4 percentage points at the 95% confidence level.
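For readers checking the arithmetic, the reported figure is consistent with the standard worst-case formula for a simple random sample (a back-of-the-envelope check, not necessarily the pollster's exact method):

```python
import math

n = 726   # survey respondents
z = 1.96  # z-score for a 95% confidence level
p = 0.5   # worst-case proportion (maximizes the margin of error)

margin = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error ≈ ±{margin * 100:.1f} percentage points")
# → margin of error ≈ ±3.6 percentage points
```

The exact value, about 3.6 points, rounds to the ±4 points the report cites.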

The Tar Sands Disaster (N.Y.Times)

OP-ED CONTRIBUTOR

By THOMAS HOMER-DIXON

Published: March 31, 2013

WATERLOO, Ontario

Rick Froberg

IF President Obama blocks the Keystone XL pipeline once and for all, he’ll do Canada a favor.

Canada’s tar sands formations, landlocked in northern Alberta, are a giant reserve of carbon-saturated energy — a mixture of sand, clay and a viscous low-grade petroleum called bitumen. Pipelines are the best way to get this resource to market, but existing pipelines to the United States are almost full. So tar sands companies, and the Alberta and Canadian governments, are desperately searching for export routes via new pipelines.

Canadians don’t universally support construction of the pipeline. A poll by Nanos Research in February 2012 found that nearly 42 percent of Canadians were opposed. Many of us, in fact, want to see the tar sands industry wound down and eventually stopped, even though it pumps tens of billions of dollars annually into our economy.

The most obvious reason is that tar sands production is one of the world’s most environmentally damaging activities. It wrecks vast areas of boreal forest through surface mining and subsurface production. It sucks up huge quantities of water from local rivers, turns it into toxic waste and dumps the contaminated water into tailing ponds that now cover nearly 70 square miles.

Also, bitumen is junk energy. A joule, or unit of energy, invested in extracting and processing bitumen returns only four to six joules in the form of crude oil. In contrast, conventional oil production in North America returns about 15 joules. Because almost all of the input energy in tar sands production comes from fossil fuels, the process generates significantly more carbon dioxide than conventional oil production.
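The energy-return comparison in that paragraph can be made explicit with a line of arithmetic (the figures are the op-ed's; the helper function is ours, for illustration):

```python
def eroi(energy_out, energy_in):
    """Energy returned on energy invested: joules of crude per joule spent."""
    return energy_out / energy_in

# Figures cited in the op-ed: one joule invested in extraction yields...
bitumen = eroi(5, 1)        # midpoint of the 4-6 joule range for tar sands
conventional = eroi(15, 1)  # conventional North American oil

print(f"Conventional oil returns about {conventional / bitumen:.0f}x "
      f"as much energy per joule invested as bitumen")
```

Roughly a threefold gap, which is why extracting bitumen with fossil-fuel inputs emits so much more carbon dioxide per barrel of crude produced.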

There is a less obvious but no less important reason many Canadians want the industry stopped: it is relentlessly twisting our society into something we don’t like. Canada is beginning to exhibit the economic and political characteristics of a petro-state.

Countries with huge reserves of valuable natural resources often suffer from economic imbalances and boom-bust cycles. They also tend to have low-innovation economies, because lucrative resource extraction makes them fat and happy, at least when resource prices are high.

Canada is true to type. When demand for tar sands energy was strong in recent years, investment in Alberta surged. But that demand also lifted the Canadian dollar, which hurt export-oriented manufacturing in Ontario, Canada’s industrial heartland. Then, as the export price of Canadian heavy crude softened in late 2012 and early 2013, the country’s economy stalled.

Canada’s record on technical innovation, except in resource extraction, is notoriously poor. Capital and talent flow to the tar sands, while investments in manufacturing productivity and high technology elsewhere languish.

But more alarming is the way the tar sands industry is undermining Canadian democracy. By suggesting that anyone who questions the industry is unpatriotic, tar sands interest groups have made the industry the third rail of Canadian politics.

The current Conservative government holds a large majority of seats in Parliament but was elected in 2011 with only 40 percent of the vote, because three other parties split the center and left vote. The Conservative base is Alberta, the province from which Prime Minister Stephen Harper and many of his allies hail. As a result, Alberta has extraordinary clout in federal politics, and tar sands influence reaches deep into the federal cabinet.

Both the cabinet and the Conservative parliamentary caucus are heavily populated by politicians who deny mainstream climate science. The Conservatives have slashed financing for climate science, closed facilities that do research on climate change, told federal government climate scientists not to speak publicly about their work without approval and tried, unsuccessfully, to portray the tar sands industry as environmentally benign.

The federal minister of natural resources, Joe Oliver, has attacked “environmental and other radical groups” working to stop tar sands exports. He has focused particular ire on groups getting money from outside Canada, implying that they’re acting as a fifth column for left-wing foreign interests. At a time of widespread federal budget cuts, the Conservatives have given Canada’s tax agency extra resources to audit registered charities. It’s widely assumed that environmental groups opposing the tar sands are a main target.

This coercive climate prevents Canadians from having an open conversation about the tar sands. Instead, our nation behaves like a gambler deep in the hole, repeatedly doubling down on our commitment to the industry.

President Obama rejected the pipeline last year but now must decide whether to approve a new proposal from TransCanada, the pipeline company. Saying no won’t stop tar sands development by itself, because producers are busy looking for other export routes — west across the Rockies to the Pacific Coast, east to Quebec, or south by rail to the United States. Each alternative faces political, technical or economic challenges as opponents fight to make the industry unviable.

Mr. Obama must do what’s best for America. But stopping Keystone XL would be a major step toward stopping large-scale environmental destruction, the distortion of Canada’s economy and the erosion of its democracy.

Thomas Homer-Dixon, who teaches global governance at the Balsillie School of International Affairs, is the author of “The Upside of Down: Catastrophe, Creativity and the Renewal of Civilization.”

Unearthed: The Fracking Facade (Top Documentary Films)

A video exposing a flawed claim often abused in the sales pitch for promoting shale gas development across the world:

“With a history of 60 years, after nearly a million wells drilled, there are no documented cases that hydraulic fracturing (fracking) has led to the contamination of groundwater.”

Brought to you by the team behind the upcoming South African feature documentary, Unearthed, that is investigating natural gas development and the controversial method of extraction known as fracking from a global perspective. Should South Africa and other countries drill down?

Watch the full documentary now

The Mathematics of Averting the Next Big Network Failure (Wired)

BY NATALIE WOLCHOVER, SIMONS SCIENCE NEWS

03.19.13 – 9:30 AM

Data: Courtesy of Marc Imhoff of NASA GSFC and Christopher Elvidge of NOAA NGDC; Image: Craig Mayhew and Robert Simmon of NASA GSFC

Gene Stanley never walks down stairs without holding the handrail. For a fit 71-year-old, he is deathly afraid of breaking his hip. In the elderly, such breaks can trigger fatal complications, and Stanley, a professor of physics at Boston University, thinks he knows why.

“Everything depends on everything else,” he said.

Original story reprinted with permission from Simons Science News, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Three years ago, Stanley and his colleagues discovered the mathematics behind what he calls “the extreme fragility of interdependency.” In a system of interconnected networks like the economy, city infrastructure or the human body, their model indicates that a small outage in one network can cascade through the entire system, touching off a sudden, catastrophic failure.

First reported in 2010 in the journal Nature, the finding spawned more than 200 related studies, including analyses of the nationwide blackout in Italy in 2003, the global food-price crisis of 2007 and 2008, and the “flash crash” of the United States stock market on May 6, 2010.

“In isolated networks, a little damage will only lead to a little more,” said Shlomo Havlin, a physicist at Bar-Ilan University in Israel who co-authored the 2010 paper. “Now we know that because of dependency between networks, you can have an abrupt collapse.”

While scientists remain cautious about using the results of simplified mathematical models to re-engineer real-world systems, some recommendations are beginning to emerge. Based on data-driven refinements, new models suggest interconnected networks should have backups, mechanisms for severing their connections in times of crisis, and stricter regulations to forestall widespread failure.

“There’s hopefully some sweet spot where you benefit from all the things that networks of networks bring you without being overwhelmed by risk,” said Raissa D’Souza, a complex systems theorist at the University of California, Davis.

Power, gas, water, telecommunications and transportation networks are often interlinked. When nodes in one network depend on nodes in another, node failures in any of the networks can trigger a system-wide collapse. (Illustration: Leonardo Dueñas-Osorio)

To understand the vulnerability in having nodes in one network depend on nodes in another, consider the “smart grid,” an infrastructure system in which power stations are controlled by a telecommunications network that in turn requires power from the network of stations. In isolation, removing a few nodes from either network would do little harm, because signals could route around the outage and reach most of the remaining nodes. But in coupled networks, downed nodes in one automatically knock out dependent nodes in the other, which knock out other dependent nodes in the first, and so on. Scientists model this cascading process by calculating the size of the largest cluster of connected nodes in each network, where the answer depends on the size of the largest cluster in the other network. With the clusters interrelated in this way, a decrease in the size of one of them sets off a back-and-forth cascade of shrinking clusters.

When damage to a system reaches a “critical point,” Stanley, Havlin and their colleagues find that the failure of one more node drops all the network clusters to zero, instantly killing connectivity throughout the system. This critical point will vary depending on a system’s architecture. In one of the team’s most realistic coupled-network models, an outage of just 8 percent of the nodes in one network — a plausible level of damage in many real systems — brings the system to its critical point. “The fragility that’s implied by this interdependency is very frightening,” Stanley said.
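The back-and-forth cascade described above can be sketched in a few dozen lines. This is a toy version in the spirit of the 2010 coupled-network model, not the researchers' actual code: two random networks share the same nodes with one-to-one dependency, a fraction of nodes is knocked out, and each round only nodes inside the largest connected cluster of both networks survive.

```python
import random

def largest_component(nodes, edges):
    """Return the largest connected set of `nodes` under `edges` (DFS)."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        if u in adj and v in adj:
            adj[u].add(v)
            adj[v].add(u)
    seen, best = set(), set()
    for start in nodes:
        if start in seen:
            continue
        comp, stack = {start}, [start]
        while stack:
            u = stack.pop()
            for w in adj[u] - comp:
                comp.add(w)
                stack.append(w)
        seen |= comp
        if len(comp) > len(best):
            best = comp
    return best

def cascade(n=500, k=4, fraction_removed=0.1, seed=0):
    """Simulate a failure cascade in two one-to-one coupled random networks."""
    rng = random.Random(seed)

    # Two random networks with average degree ~k over the same n nodes;
    # node i in network A depends on node i in network B and vice versa.
    def random_edges():
        return {tuple(sorted(rng.sample(range(n), 2))) for _ in range(n * k // 2)}

    edges_a, edges_b = random_edges(), random_edges()

    alive = set(rng.sample(range(n), int(n * (1 - fraction_removed))))
    while True:
        # A node survives only if it sits in the giant cluster of BOTH networks.
        alive_a = largest_component(alive, edges_a)
        alive_b = largest_component(alive_a, edges_b)
        if alive_b == alive:
            return alive  # cascade has converged
        alive = alive_b   # shrinkage in one network feeds back into the other

survivors = cascade(fraction_removed=0.1)
print(f"{len(survivors)} of 500 nodes still functional after the cascade")
```

Raising `fraction_removed` toward the model's critical point makes the surviving cluster collapse abruptly rather than shrink gradually, which is the "extreme fragility" Stanley describes.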

However, in another model recently studied by D’Souza and her colleagues, sparse links between separate networks actually help suppress large-scale cascades, demonstrating that network models are not one-size-fits-all. To assess the behavior of smart grids, financial markets, transportation systems and other real interdependent networks, “we have to start from the data-driven, engineered world and come up with the mathematical models that capture the real systems instead of using models because they are pretty and analytically tractable,” D’Souza said.

In a series of papers in the March issue of Nature Physics, economists and physicists used the science of interconnected networks to pinpoint risk within the financial system. In one study, an interdisciplinary group of researchers including the Nobel Prize-winning economist Joseph Stiglitz found inherent instabilities within the highly complex, multitrillion-dollar derivatives market and suggested regulations that could help stabilize it.

Irena Vodenska, a professor of finance at Boston University who collaborates with Stanley, custom-fit a coupled network model around data from the 2008 financial crisis. The analysis she and her colleagues published in February in Scientific Reports showed that modeling the financial system as a network of two networks (banks and bank assets, where each bank is linked to the assets it held in 2007) correctly predicted which banks would fail 78 percent of the time.

“We consider this model as potentially useful for systemic risk stress testing for financial systems,” said Vodenska, whose research is financially supported by the European Union’s Forecasting Financial Crisis program. As globalization further entangles financial networks, she said, regulatory agencies must monitor “sources of contagion” — concentrations in certain assets, for example — before they can cause epidemics of failure. To identify these sources, “it’s imperative to think in the sense of networks of networks,” she said.

Leonardo Dueñas-Osorio, a civil engineer at Rice, visited a damaged high-voltage substation in Chile after a major earthquake in 2010 to gather information about the power grid’s response to the crisis. (Photo: Courtesy of Leonardo Dueñas-Osorio)

Scientists are applying similar thinking to infrastructure assessment. Leonardo Dueñas-Osorio, a civil engineer at Rice University, is analyzing how lifeline systems responded to recent natural disasters. When a magnitude 8.8 earthquake struck Chile in 2010, for example, most of the power grid was restored after just two days, aiding emergency workers. The swift recovery, Dueñas-Osorio’s research suggests, occurred because Chile’s power stations immediately decoupled from the centralized telecommunications system that usually controlled the flow of electricity through the grid, but which was down in some areas. Power stations were operated locally until the damage in other parts of the system subsided.

“After an abnormal event, the majority of the detrimental effects occur in the very first cycles of mutual interaction,” said Dueñas-Osorio, who is also studying New York City’s response to Hurricane Sandy last October. “So when something goes wrong, we need to have the ability to decouple networks to prevent the back-and-forth effects between them.”

D’Souza and Dueñas-Osorio are collaborating to build accurate models of infrastructure systems in Houston, Memphis and other American cities in order to identify system weaknesses. “Models are useful for helping us explore alternative configurations that could be more effective,” Dueñas-Osorio explained. And as interdependency between networks naturally increases in many places, “we can model that higher integration and see what happens.”

Scientists are also looking to their models for answers on how to fix systems when they fail. “We are in the process of studying what is the optimal way to recover a network,” Havlin said. “When networks fail, which node do you fix first?”

The hope is that networks of networks might be unexpectedly resilient for the same reason that they are vulnerable. As Dueñas-Osorio put it, “By making strategic improvements, can we have what amounts to positive cascades, where a small improvement propagates much larger benefits?”

These open questions have the attention of governments around the world. In the U.S., the Defense Threat Reduction Agency, an organization tasked with safeguarding national infrastructure against weapons of mass destruction, considers the study of interdependent networks its “top mission priority” in the category of basic research. Some defense applications have emerged already, such as a new design for electrical network systems at military bases. But much of the research aims at sorting through the mathematical subtleties of network interaction.

“We’re not yet at the ‘let’s engineer the internet differently’ level,” said Robin Burk, an information scientist and former DTRA program manager who led the agency’s focus on interdependent networks research. “A fair amount of it is still basic science — desperately needed science.”

Original story reprinted with permission from Simons Science News, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Climate change throws off the astrological predictions of Amazonian Indians (UOL Notícias)

Carlos A. Moreno

From EFE, in Rio de Janeiro

31/03/2013, 11:59

Children from a Ticuna village play in the Solimões River, in Amazonas state; the Ticuna are one of the tribes affected by climate change. Patrícia Santos – 30.nov.1999/Folhapress

The predictions that the Indians of the Brazilian Amazon make with the help of the stars to determine the best time to plant or fish, among other activities, are being affected by climate change, according to a study conducted with different indigenous ethnic groups in Brazil.

“The shamans began to complain that their predictions were losing accuracy, and starting from those questions we discovered that some phenomena caused by climate change were affecting their calculations,” astronomer Germano Afonso, the study’s coordinator, told Agência Efe.

According to the specialist, who holds a doctorate in astronomy and celestial mechanics from the French Université Pierre et Marie Curie, the Indians of the Amazon still use ancestral astrological knowledge to set their calendar and schedule, among other things, the best dates to plant, harvest, hunt, fish and even hold their religious rituals.

Afonso, who built and operates, with the Indians’ help, a solar observatory in the Amazon, explained that whether or not different constellations are visible, as well as their movement across the sky, lets the shamans predict the periods of rain and drought, the flooding of the rivers, the fertility of the land and the spawning of fish.

“However, in the tribes we worked with, the shamans themselves admit that their predictions were no longer accurate, since the rains came earlier or later and the rivers dried up before the predicted time. The curious thing is that they themselves blamed climate change,” said the astronomer, a professor at the Universidade do Estado do Paraná and author of several works on the subject, such as “O Céu dos Índios Tembé.”

The team coordinated by Afonso and hired by the Fundação de Apoio à Pesquisa no Estado do Amazonas (Fapeam) to study the issue decided to compare the indigenous knowledge of different ethnic groups (Tukano, Tupé, Dessana, Baré, Tuyuka, Baniwa and Tikuna) with the region’s meteorological measurements in order to identify the flaws in the predictions.

“With that analysis we realized that some phenomena caused by climate change were distorting the predictions, given that the rain came later or earlier because of phenomena such as El Niño and deforestation,” noted the specialist, who moved to São Gabriel da Cachoeira, an Amazonian town where several ethnic groups converge and where he built the Observatório Solar Indígena.

Afonso clarified that the problem cannot be attributed directly to global warming alone, but also to the phenomena that cause the greenhouse effect and those it provokes in turn, such as deforestation of the Amazon, environmental pollution and the construction of dams in the forest.

Such phenomena, according to the specialists, alter the periods of rain and river flooding in the Amazon, which can no longer be predicted from the astronomical knowledge accumulated over centuries and transmitted orally among the Indians.

After confirming the problem, the researchers responsible for the study began a project to pass some scientific knowledge on to the shamans and thereby help them correct their predictions.

“We are using modern astronomical calculations and the information gathered by the region’s weather stations to help them refine their calculations,” Afonso explained.

“We recover the astrological knowledge they transmit orally and compare it with scientific data to make some adjustments and allow the predictions to be more precise,” he added.

According to Afonso, with more accurate predictions, the Indians will keep trusting in their ability to interpret the stars and in the precision of their knowledge, and, best of all, without moving away from their culture.

“But we only pass on the data that can help them. We go no further than that. We do not want to intrude on, delegitimize or modify anything in their culture. The project has two clear goals: to recover the Indians’ astrological knowledge and to help them improve their predictions. It is an exchange,” the researcher stressed.

According to the astronomer, this exchange has been well received because most of his collaborators on the project are university students and indigenous people, some of them children or grandchildren of chiefs and shamans of the tribes where they were born.

Latour: “No estaba escrito que la ecología fuera un partido” (El País)

ENTREVISTA

“No estaba escrito que la ecología fuera un partido”

Sociólogo, antropólogo, filósofo y director científico del Instituto de Estudios Políticos de París.

Bruno Latour tiene una mirada ácida y provocadora de la sociedad y el medio ambiente.

MIGUEL MORA 25 MAR 2013 – 11:52 CET19

Bruno Latour. / MANUEL BRAUN

¿Ha servido para algo el activismo ecológico? ¿Han forjado los verdes una política común? ¿Escuchan los políticos a los científicos cuando alertan sobre el cambio climático? ¿Puede la Tierra soportar más agresiones? El sociólogo, antropólogo y filósofo francés Bruno Latour(Beaune, 1947) lleva más de 20 años reflexionando sobre estos asuntos, y su pronóstico es desolador. A su juicio, la llegada de los ecologistas a la política ha sido un fracaso porque los verdes han renunciado al debate inteligente, los políticos se limitan a aplicar viejas recetas sin darse cuenta de que la revolución se ha producido ya y fue “una catástrofe”: ocurrió en 1947, cuando la población mundial superó el número que garantizaba el acceso a los recursos. Según Latour, es urgente poner en marcha una nueva forma de hacer ecología política, basada en una constitución que comprometa a gobernantes, científicos y ciudadanos a garantizar el futuro de la Tierra. Esta idea es una de las propuestas de su libro Políticas de la naturaleza. Por una democracia de las ciencias, publicado en Francia en 1999 y que ahora edita en español RBA.

Latour, with the air of an absent-minded sage, receives El País Semanal in his enormous, chaotic office at the Instituto de Estudios Políticos de París, where he has been scientific director and deputy director since 2007.

QUESTION: This book was published in France 14 years ago. Do you still stand by what you wrote?

ANSWER: Almost all of it, yes. But things have not improved. I have kept working on the same questions, though in a different tone. Today I must be the only one dealing with these issues, with a political philosophy that demands a genuine ecological politics. What did not work out is that I thought it would be a founding book for the ecologists. And it has been a total failure! The ecologists have disappeared.

Q: In France, at least, there are greens in the government.

A: Yes, but they have a very narrow vision of ecology; they reflect neither on the economy nor on society. Ecology is confined to questions of nature, when in fact it has nothing to do with that. One must choose between nature and politics. Unfortunately, attempts at an ecological politics have produced nothing good because they were based on the traditional struggle, whose goal was to torpedo politics or, rather, to subjugate it; in a way, the greens act like a tribunal trying to define a kind of sovereignty.

Q: Of moral or natural superiority?

A: Yes, but above all of stupidity. Obviously, taking nature as an end has only weakened the position of the ecologists, who have never been capable of doing politics; real politics, that is, in the sense of the socialist tradition, which should have inspired them. They have not done the work that socialism first, then Marxism, and then social democracy did. There has been no work of intellectual invention, of exploration, whatsoever; they preferred “the shop window.” Perhaps there was no other solution, for it was not written that ecology would turn into a party.

“There is a deep ecology that plays a major role in the U.S. and Germany”

Q: So is environmentalism today a kind of activism with no scientific connection?

A: There have been interesting movements thanks to very concrete casework, important where animals, plants, elephants' tusks, water, rivers, and so on are concerned. They have also shown great energy on local issues, but without confronting the questions of politics, of life in common. That is why environmentalism remains marginal, at precisely the moment when ecological questions have become everyone's business. And there is a paradox: ecology busies itself with minuscule matters of nature and society while the question of the Earth, the presence of the Earth in politics, becomes ever more pressing. That urgency, already acute 10 or 15 years ago, is far greater now.

Q: Perhaps what was missing was a Green International?

A: It was never formed because the ecologists thought the Earth itself would unify all these movements. A host of networks have emerged, built around concrete cases, like Greenpeace. There are associations, but nothing at the political level. The international remains the classic geopolitics of nation-states. There has been no reflection on the new situation. There is a deep ecology, practically nonexistent in France, that has played an important role in Germany, the Scandinavian countries, and North America. But it is very little politicized.

Q: We face a political failure alongside a growing awareness among scientists. And the citizens?

A: Paradoxically, that painful quarrel over the climate has allowed us to make progress. In a way, the controversy has played an important role in a “renewed understanding” of scientific reality on the public's part. The problem is that we keep trying to fit ecological questions into the old “science and politics” model. From this point of view, even the most advanced scientists keep trying to place these questions within the framework of that outdated situation I am trying to criticize. That is the subject of the book, and in that sense it remains current.

Q: In France there is an identification between environmentalism and territory. José Bové, for example, is a die-hard protectionist. This evolution of ecology toward nationalism is strange, isn't it?

A: Yes, but at the same time it is useful and interesting to rethink what territory is, the terroir, to use the French word. Ecologists have always been hesitant about whether their attachment to the land is progressive or reactionary, because the French expression can mean very different things. But it matters, because it is one of the dimensions of the ecological question, both the progressive and the archaic one. That was one of the book's fundamental aims: to ask whether we have ever really been modern. There are regressive aspects to the attachment to the terroir, and at the same time very important ones concerning the definition of limits, of the environments in which we live, which are decisive for the future. Once again, the greens have neglected to work on that question. But the problem of orientation, of the difference between a reactionary and a progressive attachment to the land, is fundamental. Looking at movements like Slow Food, we wonder whether they are ahead of their time or behind it, because they have regressive aspects. But if you think about distribution circuits, why should English lasagne be made with Romanian horse meat and pass through 25 middlemen? It is no trifle: whether we take French, Romanian, or Turkish horse, questions of belonging and of limits become progressive questions.

The iconoclastic anthropologist

Bruno Latour was born in Burgundy, home of the most expensive wines on the planet. His father was a winegrower; hence his peculiar analyses of terroir and tradition. He studied anthropology and sociology. His training is as varied as the institutions where he has taught, from the École des Mines in Paris to the London School of Economics and the chair of history at Harvard.

A tireless writer, he is the author of some thirty books of essays, all the most recent ones published by Harvard, through which run the earth, society, war, energy, science, technology, modernity, and the media.

His latest project is connected with the so-called medialab, a space for developing connections between digital technologies, sociology, and science studies.

Q: Your book calls for moving beyond the schemes of left and right. But that does not seem to have changed much.

A: The debate faces a major problem. There is an inversion of the relations between the geographic frame and politics: the frame has changed far more than politics has. The great international negotiations display the inertia of economic, legal, and political organization, while the frame, what we used to call the Earth, geography, is changing at astonishing speed. That mutation is hard to grasp for people accustomed to the old history, in which humans fought one another, as in the 20th century: men waging war on each other within a geographic frame that had been stable since the last glaciation. It is too philosophical a reason. So we prefer to think we have time, that everything is in its place, that the economy is the way it is, that international law is the way it is, and so on. But even the terms for marking rapid accelerations have changed, shifting toward nature and the glaciers. The time we live in is that of the Anthropocene, and things are no longer as they were. What has changed since I wrote the book is that back then we did not have the notion of the Anthropocene. It was a very useful invention of Crutzen, a climatologist, but it did not exist then; it would have helped me a great deal.

Q: And what became of your proposal to adopt an ecological constitution?

A: I tried to build an association of parliamentarians and launch a constitution so that energy questions would start to be treated differently. I was trying to open a debate, which naturally never took place. The debate on the Constitution started well; it was considered a great invention of European democracy. The problem is that it is no longer a matter of the representation of humans; that debate concerns the innumerable beings that live on the Earth. A constitutional debate seemed necessary to me then, and even more so now. What would a parliament devoted to ecological politics look like? It will have to be created, but we do not reflect enough on the fundamental questions.

Q: Do the big environmental conferences solve anything?

A: The problem is that a geopolitics organized around the nation, with its own interests and level of aggregation, is ill suited to ecological questions, which are transnational. Everyone knows that; progress can no longer be laid out on maps, we are no longer playing on classical territories. Hence, since Copenhagen 2009 there has been a disaffection with the big summits, not only because nothing gets decided, but also because we realize that the level of political decision and aggregation is the wrong one. In fact, cities, regions, nations, and provinces often take more initiatives than the States do.

Q: France is one of the most nuclearized countries in the world. The ecologists are up in arms. Does that seem right to you?

A: The ecologists have fixated on the nuclear question, but no one has come to explain to us why nuclear power is anti-ecological, while many serious people consider the atom one of the solutions; not in the long term, but in the short term, yes. Once again we face a total absence of political reflection on the part of the ecologists, who campaign against nuclear power without explaining why. Consequently, we have not advanced an inch. In fact, there is a great public debate right now on the energy transition, and the greens remain incapable of understanding anything, even of discussing it, because they have moralized the nuclear question. When you do ethics, you should not be doing politics; you should be doing religion.

Q: Is the survival of the species really at stake?

A: The human species will manage. No one thinks it will disappear, but civilization? We do not know what an Earth at six or eight degrees is like; we have never known one. You have to go back hundreds of millions of years. The problem was not approached with the same urgency when I wrote the book in 1999; people still talked about future generations. Now we talk about our children. There is not a single company that runs calculations beyond 2050, the shortest horizon there has ever been. The mutation of history is incredibly fast. Now it involves natural events, much faster than humans. It is unimaginable for people trained in the 20th century, a total novelty.

Q: Is this globalization? Or something more than that?

A: It is related to globalization, but not through the extension of connections among humans. It is the arrival of an unpleasant world that prevents real globalization: a conflict between globes. We have globalized, and that is reassuring because everything is connected and makes the Earth a small planet. But a great people being crushed in a collision with something else is less reassuring.

“The human species will manage. No one thinks it is going to disappear”

Q: And the unease we feel, the indignation, does it have to do with that fear?

A: That catastrophism has always existed; there have always been moments of apocalypse, of catastrophe literature; but at the same time there is a new feeling: it is not the apocalypse of humans but the end of resources, in, I think, a literal sense.

Q: Have we devoured the planet?

A: The people who analyze the Anthropocene draw diagrams like this (he shows a famous graph of population and resources). This is called “the Great Acceleration”; it happened in 1947. The revolution has already taken place, and that is one of the causes of this new anxiety. People keep talking about the revolution, despairing because it does not come, but it is already here. It is a past event with catastrophic consequences. That, too, clouds the minds of progressives and reactionaries alike. What does it mean to live in an age in which the revolution has already happened and its results are catastrophic?

Q: You don't mean that austerity is the solution?

A: The concept of happy degrowth already exists; I don't know if you have it in Spain… Yes! You are very advanced on degrowth.

Q: We are in the full vanguard, but of the unhappy kind.

A: It is one of the great themes of the moment; the economic crisis is unwanted degrowth, unequally distributed. And there is something more: austerity is not necessarily the word, but rather asceticism. That would be the religious, or spiritual, vision of austerity. It mingles with the new geological visions of the limits we must impose on ourselves…

Q: Are you talking about a return to the countryside or about rebuilding the planet?

A: I do not mean going back to the countryside, but another Earth.

Q: Is technology the only compass?

A: Technology finds itself in that same situation. There is a very important solution in geoengineering, which holds that the situation is reversible, that favorable conditions can be artificially recreated after having been destroyed unknowingly. Thus an immense geoengineering movement has sprung up everywhere. Since it is the Earth's own energy, we can send spacecraft, modify the acidity of the seas, and so on. Do something to counteract what was done wrong. If we were able to modify the Earth, we can modify it in the other direction, which is a dangerous argument, because we could wreck it a second time.

Q: Won't it regenerate on its own?

A: Yes, but without humans! It will regenerate on its own as long as there are no humans. It can get rid of us, that is one of the hypotheses, by becoming unlivable, but that would not be very positive. The era of limits can extend all the way to extinction.

Q: Will we come to a bad end?

A: History is not full of favorable examples. We do not know. There is nothing in human nature that favors reflection, so the solution can only be a bad one.

Q: Some fear we will end up devoured by the Chinese.

A: The Chinese have more problems than we do and run the risk of eating themselves up through their soil, water, and air. They are no threat to us; they will disappear before we do.

Q: Žižek says our problems stem from the intellectual mediocrity of Germany and France, that this is the main reason for the current decline. What do you think?

A: That is nonsense. Far more is happening intellectually in Europe than in America, infinitely more. In art, for example, in philosophy, in science, in urbanism. It is senseless to say such things, but then Žižek is an old cretin, a sort of far-left relic, the fruit of the far left's exhaustion, of its final decadence, of which he is the symptom. On the other hand, he is a very nice fellow. The far left has been so wrong about the world that in the end all these far-left old men have nothing left to do but vomit on the world, as Alain Badiou does in France.

Q: Do you prefer Marine Le Pen?

A: I am not a politician; I cannot answer that question, it does not interest me.

Q: Don't you like talking about politics?

A: I do talk about politics; I have written a book about politics, as far as I know! Politics of Nature.

Q: Aren't you interested in everyday politics?

A: Everyday politics, yes, but not party politics; those are superficial agitations, above all in France, where there is no longer really any politics.

Q: You criticize the far left. Nothing about the far right?

A: It stirs, it tries to clutch at straws, but it does not matter much. That is not where things are at stake.

Q: Do you think it is residual?

A: No, it is not residual; it can grow and do damage, as much as the far left; not thinking always does damage, but that is not what is going to solve the problems of the Earth, the economy, cities, transport, and technology.

Q: What scenario do you foresee for 2050? What Earth, what humanity?

A: That is not my job; my job is to prepare us for the wars. The ecological wars are going to be very important, and we have to prepare our armies intellectually and humanly. That is my job.

Q: Will there be violent wars over the climate?

A: The very definition of war is going to change. We are in a situation in which we cannot win against the Earth; it is an asymmetric war: if we win, we lose, and if we lose, we win. This situation therefore creates obligations for a multitude of people, and first of all for intellectuals.

Q: Is that the main battle?

A: If we have no world, we cannot do much, not even the revolution. Reading Marx, one is struck by what he says about humans. In our era, the question of science and of the geographic margin, plus the presence of billions of people, forms a crucial scenario. We used to have other problems, but not this one.

Q: So it is a matter of to be or not to be?

A: In every scientific report the forecasts get worse; the most pessimistic scenario always materializes. That must be taken into account. These are extreme forecasts, but for now they are the only valid ones. It is not a world war but an accumulation of world wars. It is like the nuclear winter of the Cold War, a situation of cataclysm, but with some advantages: it is more radical but slower, we have great capacity for invention, 9 billion people and many intelligent minds. But it is also a challenge. It is therefore a question of high politics, not of nature. Politics comes first.

Q: Do you feel alone?

A: What was difficult in this book was creating the link between science and politics, and I cannot say I have convinced many people. If on top of that you make the link with religion and the arts, it is even harder. Someone like Sloterdijk would be quite capable of understanding it. Many intellectuals, however, remain in the 20th century, like Žižek. They stay within a context, a revolutionary ideal, of disappointment. They are disappointed with humans.

Q: Do you think humans will let themselves be helped?

A: First we must help the Earth. In the Anthropocene, you can no longer draw the distinction between humans and the Earth.

Q: And are your students ready for the fight?

A: At my school I am the only one teaching courses on questions where politics in the classical sense does not enter. There are one or two courses on ecological questions. It is my fault; I have not worked hard enough to change things. We are far behind.