
What BP Doesn’t Want You to Know About the 2010 Gulf Spill (The Daily Beast)

The 2010 Gulf of Mexico oil spill was even worse than BP wanted us to know.

by Mark Hertsgaard | April 22, 2013 4:45 AM EDT

“It’s as safe as Dawn dishwashing liquid.” That’s what Jamie Griffin says the BP man told her about the smelly, rainbow-streaked gunk coating the floor of the “floating hotel” where Griffin was feeding hundreds of cleanup workers during the BP oil disaster in the Gulf of Mexico. Apparently, the workers were tracking the gunk inside on their boots. Griffin, as chief cook and maid, was trying to clean it. But even boiling water didn’t work.

BP Oil Spill
An agonizing 87 days passed before the leaking well was finally sealed. According to U.S. government estimates, 210 million gallons of Louisiana sweet crude had escaped into the Gulf, making this disaster the largest accidental oil leak in world history. (Benjamin Lowy/Getty)

“The BP representative said, ‘Jamie, just mop it like you’d mop any other dirty floor,’” Griffin recalls in her Louisiana drawl.

It was the opening weeks of what everyone, echoing President Barack Obama, was calling “the worst environmental disaster in American history.” At 9:45 p.m. local time on April 20, 2010, a fiery explosion on the Deepwater Horizon oil rig had killed 11 workers and injured 17. One mile underwater, the Macondo well had blown apart, unleashing a gusher of oil into the gulf. At risk were fishing areas that supplied one third of the seafood consumed in the U.S., beaches from Texas to Florida that drew billions of dollars’ worth of tourism to local economies, and Obama’s chances of reelection. Republicans were blaming him for mishandling the disaster, his poll numbers were falling, and even his 11-year-old daughter was demanding, “Daddy, did you plug the hole yet?”

Griffin did as she was told: “I tried Pine-Sol, bleach, I even tried Dawn on those floors.” As she scrubbed, the mix of cleanser and gunk occasionally splashed onto her arms and face.

Within days, the 32-year-old single mother was coughing up blood and suffering constant headaches. She lost her voice. “My throat felt like I’d swallowed razor blades,” she says.

Then things got much worse.

Like hundreds, possibly thousands, of workers on the cleanup, Griffin soon fell ill with a cluster of excruciating, bizarre, grotesque ailments. By July, unstoppable muscle spasms were twisting her hands into immovable claws. In August, she began losing her short-term memory. After cooking professionally for 10 years, she couldn’t remember the recipe for vegetable soup; one morning, she got in the car to go to work, only to discover she hadn’t put on pants. The right side, but only the right side, of her body “started acting crazy. It felt like the nerves were coming out of my skin. It was so painful. My right leg swelled—my ankle would get as wide as my calf—and my skin got incredibly itchy.”

“These are the same symptoms experienced by soldiers who returned from the Persian Gulf War with Gulf War syndrome,” says Dr. Michael Robichaux, a Louisiana physician and former state senator, who treated Griffin and 113 other patients with similar complaints. As a general practitioner, Robichaux says he had “never seen this grouping of symptoms together: skin problems, neurological impairments, plus pulmonary problems.” Only months later, after Kaye H. Kilburn, a former professor of medicine at the University of Southern California and one of the nation’s leading environmental health experts, came to Louisiana and tested 14 of Robichaux’s patients did the two physicians make the connection with Gulf War syndrome, the malady that afflicted an estimated 250,000 veterans of that war with a mysterious combination of fatigue, skin inflammation, and cognitive problems.

Meanwhile, the well kept hemorrhaging oil. The world watched with bated breath as BP failed in one attempt after another to stop the leak. An agonizing 87 days passed before the well was finally plugged on July 15. By then, 210 million gallons of Louisiana sweet crude had escaped into the Gulf of Mexico, according to government estimates, making the BP disaster the largest accidental oil leak in world history.


Yet three years later, the BP disaster has been largely forgotten, both overseas and in the U.S. Popular anger has cooled. The media have moved on. Today, only the business press offers serious coverage of what the Financial Times calls “the trial of the century”—the trial now under way in New Orleans, where BP faces tens of billions of dollars in potential penalties for the disaster. As for Obama, the same president who early in the BP crisis blasted the “scandalously close relationship” between oil companies and government regulators, two years later, ran for reelection boasting about how much new oil and gas development his administration had approved.

Such collective amnesia may seem surprising, but there may be a good explanation for it: BP mounted a cover-up that concealed the full extent of its crimes from public view. This cover-up prevented the media and therefore the public from knowing—and above all, seeing—just how much oil was gushing into the gulf. The disaster appeared much less extensive and destructive than it actually was. BP declined to comment for this article.

That BP lied about the amount of oil it discharged into the gulf is already established. Lying to Congress about that was one of 14 felonies to which BP pleaded guilty last year in a legal settlement with the Justice Department that included a $4.5 billion fine, the largest fine ever levied against a corporation in the U.S.

What has not been revealed until now is how BP hid that massive amount of oil from TV cameras and the price that this “disappearing act” imposed on cleanup workers, coastal residents, and the ecosystem of the gulf. That story can now be told because an anonymous whistleblower has provided evidence that BP was warned in advance about the safety risks of attempting to cover up its leaking oil. Nevertheless, BP proceeded. Furthermore, BP appears to have withheld these safety warnings, as well as protective measures, both from the thousands of workers hired for the cleanup and from the millions of Gulf Coast residents who stood to be affected.

The financial implications are enormous. The trial now under way in New Orleans is wrestling with whether BP was guilty of “negligence” or “gross negligence” for the Deepwater Horizon disaster. If found guilty of “negligence,” BP would be fined, under the Clean Water Act, $1,100 for each barrel of oil that leaked. But if found guilty of “gross negligence”—which a cover-up would seem to imply—BP would be fined $4,300 per barrel, almost four times as much, for a total of $17.5 billion. That large a fine, combined with an additional $34 billion that the states of Louisiana, Alabama, Mississippi, and Florida are seeking, could have a powerful effect on BP’s economic health.
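The fine arithmetic in the paragraph above can be checked with a quick sketch. Note that the barrel count below is an assumption inferred from the article’s own figures (the $17.5 billion total and the $4,300 rate), not a number BP or the court has settled on:

```python
# Clean Water Act fine arithmetic, using only the figures quoted above.
# The fineable barrel count is inferred from those figures -- an
# assumption for illustration, not a settled legal finding.

NEGLIGENCE_RATE = 1_100        # dollars per barrel
GROSS_NEGLIGENCE_RATE = 4_300  # dollars per barrel

# The article's $17.5 billion total implies this many fineable barrels:
implied_barrels = 17.5e9 / GROSS_NEGLIGENCE_RATE
print(f"Implied fineable barrels: {implied_barrels:,.0f}")  # about 4.07 million

# For comparison, 210 million leaked gallons at 42 gallons per barrel:
total_barrels = 210e6 / 42
print(f"Total leaked barrels: {total_barrels:,.0f}")  # 5 million

# The fine at each rate, applied to the implied barrel count:
print(f"Negligence fine:       ${NEGLIGENCE_RATE * implied_barrels / 1e9:.1f}bn")
print(f"Gross-negligence fine: ${GROSS_NEGLIGENCE_RATE * implied_barrels / 1e9:.1f}bn")
```

The gap between the implied count (about 4.07 million barrels) and the 5 million barrels of total leaked oil would be consistent with oil recovered at the wellhead being excluded from the fineable amount, though the article does not say so directly.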

Yet the most astonishing thing about BP’s cover-up? It was carried out in plain sight, right in front of the world’s uncomprehending news media (including, I regret to say, this reporter).

More than half of the Corexit was dispersed by C-130 airplanes, often hitting workers. (Benjamin Lowy/Getty)

The chief instrument of BP’s cover-up was the same substance that apparently sickened Jamie Griffin and countless other cleanup workers and local residents. Its brand name is Corexit, but most news reports at the time referred to it simply as a “dispersant.” Its function was to attach itself to leaked oil, break it into droplets, and disperse them into the vast reaches of the gulf, thereby keeping the oil from reaching Gulf Coast shorelines. And the Corexit did largely achieve this goal.

But the 1.84 million gallons of Corexit that BP applied during the cleanup also served a public-relations purpose: they made the oil spill all but disappear, at least from TV screens. By late July 2010, the Associated Press and The New York Times were questioning whether the spill had been such a big deal after all. Time went so far as to assert that right-wing talk-radio host Rush Limbaugh “has a point” when he accused journalists and environmentalists of exaggerating the crisis.

But BP had a problem: it had lied about how safe Corexit was, and proof of its dishonesty would eventually fall into the hands of the Government Accountability Project (GAP), the premier whistleblower-protection group in the U.S. The proof? A technical manual BP had received from NALCO, the firm that supplied the Corexit that BP used in the gulf.

An electronic copy of that manual is included in a new report GAP has issued, “Deadly Dispersants in the Gulf.” On the basis of interviews with dozens of cleanup workers, scientists, and Gulf Coast residents, GAP concludes that the health impacts endured by Griffin were visited upon many other locals as well. What’s more, the combination of Corexit and crude oil also caused terrible damage to gulf wildlife and ecosystems, including an unprecedented number of seafood mutations; declines of up to 80 percent in seafood catch; and massive die-offs of the microscopic life-forms at the base of the marine food chain. GAP warns that BP and the U.S. government nevertheless appear poised to repeat the exercise after the next major oil spill: “As a result of Corexit’s perceived success, Corexit … has become the dispersant of choice in the U.S. to ‘clean up’ oil spills.”

Numerous fishermen on BP’s payroll helped with the cleanup by dispersing Corexit. (Benjamin Lowy/Getty)

BP’s cover-up was not planned in advance but devised in the heat of the moment as the oil giant scrambled to limit the PR and other damages of the disaster. Indeed, one of the chief scandals of the disaster is just how unprepared both BP and federal and state authorities were for an oil leak of this magnitude. U.S. law required that a response plan be in place before drilling began, but the plan was embarrassingly flawed.

“We weren’t managing for actual risk; we were checking a box,” says Mark Davis, director of the Institute on Water Resources Law and Policy at Tulane University. “That’s how we ended up with a response plan that included provisions for dealing with the impacts to walruses: because [BP] copied word for word the response plans that had been developed after the Exxon-Valdez oil spill [in Alaska, in 1989] instead of a plan tailored to the conditions in the gulf.”

As days turned into weeks and it became obvious that no one knew how to plug the gushing well, BP began insisting that Corexit be used to disperse the leaking oil. This triggered alarms from scientists and from a leading environmental NGO in Louisiana, the Louisiana Environmental Action Network (LEAN).

The group’s scientific adviser, Wilma Subra, a chemist whose work on environmental pollution had won her a “genius grant” from the MacArthur Foundation, told state and federal authorities that she was especially concerned about how dangerous the mixture of crude and Corexit was: “The short-term health symptoms include acute respiratory problems, skin rashes, cardiovascular impacts, gastrointestinal impacts, and short-term loss of memory,” she told GAP investigators. “Long-term impacts include cancer, decreased lung function, liver damage, and kidney damage.”

(Nineteen months after the Deepwater Horizon explosion, a scientific study published in the peer-reviewed journal Environmental Pollution found that crude oil becomes 52 times more toxic when combined with Corexit.)

BP even rebuffed a direct request from the administrator of the Environmental Protection Agency, Lisa Jackson, who wrote BP a letter on May 19, asking the company to deploy a less toxic dispersant in the cleanup. Jackson could only ask BP to do this; she could not legally require it. Why? Because use of Corexit had been authorized years before under the federal Oil Pollution Act.

In a recent interview, Jackson explains that she and other officials “had to determine, with less-than-perfect scientific testing and data, whether use of dispersants would, despite potential side effects, improve the overall situation in the gulf and coastal ecosystems. The tradeoff, as I have said many times, was potential damage in the deep water versus the potential for larger amounts of undispersed oil in the ecologically rich coastal shallows and estuaries.” She adds that the presidential commission that later studied the BP oil disaster did not fault the decision to use dispersants.

Knowing that EPA lacked the authority to stop it, BP wrote back to Jackson on May 20, declaring that Corexit was safe. What’s more, BP wrote, there was a ready supply of Corexit, which was not the case with alternative dispersants. (A NALCO plant was located just 30 miles west of New Orleans.)

But Corexit was decidedly not safe without taking proper precautions, as the manual BP got from NALCO spelled out in black and white. The “Vessel Captains Hazard Communication” resource manual, which GAP shared with me, looks innocuous enough. A three-ring binder with a black plastic cover, the manual contained 61 sheets, each wrapped in plastic, that detailed the scientific properties of the two types of Corexit that BP was buying, as well as their health hazards and recommended measures against those hazards.

BP applied two types of Corexit in the gulf. The first, Corexit 9527, was considerably more toxic. According to the NALCO manual, Corexit 9527 is an “eye and skin irritant. Repeated or excessive exposure … may cause injury to red blood cells (hemolysis), kidney or the liver.” The manual adds: “Excessive exposure may cause central nervous system effects, nausea, vomiting, anesthetic or narcotic effects.” It advises, “Do not get in eyes, on skin, on clothing,” and “Wear suitable protective clothing.”

When available supplies of Corexit 9527 were exhausted early in the cleanup, BP switched to the second type of dispersant, Corexit 9500. In its recommendations for dealing with Corexit 9500, the NALCO manual advised, “Do not get in eyes, on skin, on clothing,” “Avoid breathing vapor,” and “Wear suitable protective clothing.”

It’s standard procedure—and required by U.S. law—for companies to distribute this kind of information to any work site where hazardous materials are present so workers can know about the dangers they face and how to protect themselves. But interviews with numerous cleanup workers suggest that this legally required precaution was rarely if ever followed during the BP cleanup. Instead, it appears that BP told NALCO to stop including the manuals with the Corexit that NALCO was delivering to cleanup work sites.

“It’s my understanding that some manuals were sent out with the shipments of Corexit in the beginning [of the cleanup],” the anonymous source tells me. “Then, BP told NALCO to stop sending them. So NALCO was left with a roomful of unused binders.”

Roman Blahoski, NALCO’s director of global communications, says: “NALCO responded to requests for its pre-approved dispersants from those charged with protecting the gulf and mitigating the environmental, health, and economic impact of this event. NALCO was never involved in decisions relating to the use, volume, and application of its dispersant.”

The gulf’s vital tourism industry lost billions as oil poured into the water. (Benjamin Lowy/Getty)

Misrepresenting the safety of Corexit went hand in hand with BP’s previously noted lie about how much oil was leaking from the Macondo well. As reported by John Rudolf in The Huffington Post, internal BP emails show that BP privately estimated that “the runaway well could be leaking from 62,000 barrels a day to 146,000 barrels a day.” Meanwhile, BP officials were telling the government and the media that only 5,000 barrels a day were leaking.

In short, applying Corexit enabled BP to mask the fact that a much larger amount of oil was actually leaking into the gulf. “Like any good magician, the oil industry has learned that if you can’t see something that was there, it must have ‘disappeared,’” Scott Porter, a scientist and deep-sea diver who consults for oil companies and oystermen, says in the GAP report. “Oil companies have also learned that, in the public mind, ‘out of sight equals out of mind.’ Therefore, they have chosen crude oil dispersants as the primary tool for handling large marine oil spills.”

BP also had a more direct financial interest in using Corexit, argues Clint Guidry, president of the Louisiana Shrimp Association, whose members include not only shrimpers but fishermen of all sorts. As it happens, local fishermen constituted a significant portion of BP’s cleanup force (which numbered as many as 47,000 workers at the height of the cleanup). Because the spill caused the closure of their fishing grounds, BP and state and federal authorities established the Vessels of Opportunity (VoO) program, in which BP paid fishermen to take their boats out and skim, burn, and otherwise get rid of leaked oil. Applying dispersants, Guidry points out, reduced the total volume of oil that could be traced back to BP.

“The next phase of this trial [against BP] is going to turn on how much oil was leaked,” Guidry tells me. (If found guilty, BP will be fined a certain amount for each barrel of oil judged to have leaked.) “So hiding the oil with Corexit worked not only to hide the size of the spill but also to lower the amount of oil that BP may get charged for releasing.”

“You could smell oil and stuff in the air, but on the news they were saying it’s fine.” (Benjamin Lowy/Getty)

Not only did BP fail to inform workers of the potential hazards of Corexit or to provide them with safety training and protective gear; according to interviews with dozens of cleanup workers, the company also allegedly threatened to fire workers who complained about the lack of respirators and protective clothing.

“I worked with probably a couple hundred different fishermen on the [cleanup],” Acy Cooper, Guidry’s second in command, tells me in Venice, the coastal town from which many VoO vessels departed. “Not one of them got any safety information or training concerning the toxic materials they encountered.” Cooper says that BP did provide workers with body suits and gloves designed for handling hazardous materials. “But when I’d talk with [the BP representative] about getting my guys respirators and air monitors, I’d never get any response.”

Roughly 58 percent of the 1.84 million gallons of Corexit used in the cleanup was sprayed onto the gulf from C-130 airplanes. The spray sometimes ended up hitting cleanup workers in the face.

“Our boat was sprayed four times,” says Jorey Danos, a 32-year-old father of three who suffered racking coughing fits, severe fatigue, and memory loss after working on the BP cleanup. “I could see the stuff coming out of the plane—like a shower of mist, a smoky color. I could see [it] coming at me, but there was nothing I could do.”

“The next day,” Danos continues, “when the BP rep came around on his speed boat, I asked, ‘Hey, what’s the deal with that stuff that was coming out of those planes yesterday?’ He told me, ‘Don’t worry about it.’ I said, ‘Man, that s–t was burning my face—it ain’t right.’ He said, ‘Don’t worry about it.’ I said, ‘Well, could we get some respirators or something, because that s–t is bad.’ He said, ‘No, that wouldn’t look good to the media. You got two choices: you can either be relieved of your duties or you can deal with it.’”

Perhaps the single most hazardous chemical compound found in Corexit 9527 is 2-butoxyethanol, a substance that had been linked to cancers and other health impacts among cleanup workers on the 1989 Exxon Valdez oil spill in Alaska. According to BP’s own data, 20 percent of offshore workers in the gulf had levels of 2-butoxyethanol twice the level certified as safe by the Occupational Safety and Health Administration.

Cleanup workers were not the only victims; coastal residents also suffered. “My 2-year-old grandson and I would play out in the yard,” says Shirley Tillman of the Mississippi coastal town Pass Christian. “You could smell oil and stuff in the air, but on the news they were saying it’s fine, don’t worry. Well, by October, he was one sick little fellow. All of a sudden, this very active little 2-year-old was constantly sick. He was having headaches, upper respiratory infections, earaches. The night of his birthday party, his parents had to rush him to the emergency room. He went to nine different doctors, but they treated just the symptoms; they’re not toxicologists.”

Doctors misdiagnosed Danos, a BP clean-up worker who was exposed to Corexit, with schizophrenia and bipolar disorder. (Benjamin Lowy/Getty)

“It’s not the crime, it’s the cover-up.” Ever since the Watergate scandal of the 1970s, that’s been the mantra. Cover-ups don’t work, goes the argument. They only dig a deeper hole, because the truth eventually comes out.

But does it?

GAP investigators were hopeful that obtaining the NALCO manual might persuade BP to meet with them, and it did. On July 10, 2012, BP hosted a private meeting at its Houston offices. Presiding over the meeting, which is described here publicly for the first time, was BP’s public ombudsman, Stanley Sporkin, joining by telephone from Washington. Ironically, Sporkin had made his professional reputation during the Watergate scandal. As a lawyer with the Securities and Exchange Commission, Sporkin investigated illegal corporate payments to the slush fund that President Nixon used to buy the silence of the Watergate burglars.

Also attending the meeting were two senior BP attorneys; BP Vice President Luke Keller; other BP officials; Thomas Devine, GAP’s senior attorney on the BP case; Shanna Devine, GAP’s investigator on the case; Dr. Michael Robichaux; Dr. Wilma Subra; and Marylee Orr, the executive director of LEAN. The following account is based on my interviews with Thomas Devine, Robichaux, Subra, and Orr. BP declined to comment.

BP officials had previously confirmed the authenticity of the NALCO manual, says Thomas Devine, but now they refused to discuss it, even though this had been one of the stated purposes for the meeting. Nor would BP address the allegation, made by the whistleblower who had given the manual to GAP, that BP had ordered the manual withheld from cleanup work sites, perhaps to maintain the fiction that Corexit was safe.

“They opened the meeting with this upbeat presentation about how seriously they took their responsibilities for the spill and all the wonderful things they were doing to make things right,” says Devine. “When it was my turn to speak, I said that the manual our whistleblower had provided contradicted what they just said. I asked whether they had ordered the manual withdrawn from work sites. Their attorneys said that was a matter they would not discuss because of the pending litigation on the spill.” [Disclosure: Thomas Devine is a friend of this reporter.]

The visitors’ top priority was to get BP to agree not to use Corexit in the future. Keller said that Corexit was still authorized for use by the U.S. government and BP would indeed feel free to use it against any future oil spills.


A second priority was to get BP to provide medical treatment for Jamie Griffin and the many other apparent victims of Corexit-and-crude poisoning. This request too was refused by BP.

Robichaux doubts his patients will receive proper compensation from the $7.8 billion settlement BP reached in 2012 with the Plaintiffs’ Steering Committee, 19 court-appointed attorneys who represent the hundreds of individuals and entities that have sued BP for damages related to the gulf disaster. “Nine of the most common symptoms of my patients do not appear on the list of illnesses the settlement says can be compensated, including memory loss, fatigue, and joint and muscular pain,” says Robichaux. “So how are the attorneys going to file suits on behalf of those victims?”

At one level, BP’s cover-up of the gulf oil disaster speaks to the enormous power that giant corporations exercise in modern society, and how unable, or unwilling, governments are to limit that power. To be sure, BP has not entirely escaped censure for its actions; depending on the outcome of the trial now under way in New Orleans, the company could end up paying tens of billions of dollars in fines and damages over and above the $4.5 billion imposed by the Justice Department in the settlement last year. But BP’s reputation appears to have survived: its market value as this article went to press was a tidy $132 billion, and few, if any, BP officials appear likely to face any legal repercussions. “If I would have killed 11 people, I’d be hanging from a noose,” says Jorey Danos. “Not BP. It’s the golden rule: the man with the gold makes the rules.”

As unchastened as anyone at BP is Bob Dudley, the American who was catapulted into the CEO job a few weeks into the gulf disaster to replace Tony Hayward, whose propensity for imprudent comments—“I want my life back,” the multimillionaire had pouted while thousands of gulf workers and residents were suffering—had made him a globally derided figure. Dudley told the annual BP shareholders meeting in London last week that Corexit “is effectively … dishwashing soap,” no more toxic than that, as all scientific studies supposedly showed. What’s more, Dudley added, he himself had grown up in Mississippi and knows that the Gulf of Mexico is “an ecosystem that is used to oil.”

Nor has the BP oil disaster triggered the kind of changes in law and public priorities one might have expected. “Not much has actually changed,” says Mark Davis of Tulane. “It reflects just how wedded our country is to keeping the Gulf of Mexico producing oil and bringing it to our shores as cheaply as possible. Going forward, no one should assume that just because something really bad happened we’re going to manage oil and gas production with greater sensitivity and wisdom. That will only happen if people get involved and compel both the industry and the government to be more diligent.”

And so the worst environmental disaster in U.S. history has been whitewashed—its true dimensions obscured, its victims forgotten, its lessons ignored. Who says cover-ups never work?

Mark Hertsgaard is a fellow at the New America Foundation and the author, most recently, of HOT: Living Through the Next Fifty Years on Earth. This article was reported in partnership with the Investigative Fund at the Nation Institute.

Carbon bubble will plunge the world into another financial crisis – report (The Guardian)

Trillions of dollars at risk as stock markets inflate value of fossil fuels that may have to remain buried forever, experts warn

Damian Carrington – The Guardian, Friday 19 April 2013


Global stock markets are betting on countries failing to adhere to legally binding carbon emission targets. Photograph: Robert Nickelsberg/Getty Images

The world could be heading for a major economic crisis as stock markets inflate an investment bubble in fossil fuels to the tune of trillions of dollars, according to leading economists.

“The financial crisis has shown what happens when risks accumulate unnoticed,” said Lord (Nicholas) Stern, a professor at the London School of Economics. He said the risk was “very big indeed” and that almost all investors and regulators were failing to address it.

The so-called “carbon bubble” is the result of an over-valuation of oil, coal and gas reserves held by fossil fuel companies. According to a report published on Friday, at least two-thirds of these reserves will have to remain underground if the world is to meet existing internationally agreed targets to avoid the threshold for “dangerous” climate change. If the agreements hold, these reserves will be in effect unburnable and so worthless, leading to massive market losses. But the stock markets are betting on countries’ inaction on climate change.

The stark report is by Stern and the thinktank Carbon Tracker. Their warning is supported by organisations including HSBC, Citi, Standard and Poor’s and the International Energy Agency. The Bank of England has also recognised that a collapse in the value of oil, gas and coal assets as nations tackle global warming is a potential systemic risk to the economy, with London being particularly at risk owing to its huge listings of coal.

Stern said that far from reducing efforts to develop fossil fuels, the top 200 companies spent $674bn (£441bn) in 2012 to find and exploit even more new resources, a sum equivalent to 1% of global GDP, which could end up as “stranded” or valueless assets. Stern’s landmark 2006 report on the economic impact of climate change – commissioned by the then chancellor, Gordon Brown – concluded that spending 1% of GDP would pay for a transition to a clean and sustainable economy.

The world’s governments have agreed to restrict the global temperature rise to 2C, beyond which the impacts become severe and unpredictable. But Stern said the investors clearly did not believe action to curb climate change was going to be taken. “They can’t believe that and also believe that the markets are sensibly valued now.”

“They only believe environmental regulation when they see it,” said James Leaton, from Carbon Tracker and a former PwC consultant. He said short-termism in financial markets was the other major reason for the carbon bubble. “Analysts say you should ride the train until just before it goes off the cliff. Each thinks they are smart enough to get off in time, but not everyone can get out of the door at the same time. That is why you get bubbles and crashes.”

Paul Spedding, an oil and gas analyst at HSBC, said: “The scale of ‘listed’ unburnable carbon revealed in this report is astonishing. This report makes it clear that ‘business as usual’ is not a viable option for the fossil fuel industry in the long term. [The market] is assuming it will get early warning, but my worry is that things often happen suddenly in the oil and gas sector.”

HSBC warned that 40-60% of the market capitalisation of oil and gas companies was at risk from the carbon bubble, with the top 200 fossil fuel companies alone having a current value of $4tn, along with $1.5tn debt.

Lord McFall, who chaired the Commons Treasury select committee for a decade, said: “Despite its devastating scale, the banking crisis was at its heart an avoidable crisis: the threat of significant carbon writedown has the unmistakable characteristics of the same endemic problems.”

The report calculates that the world’s currently indicated fossil fuel reserves equate to 2,860bn tonnes of carbon dioxide, but that just 31% could be burned for an 80% chance of keeping below a 2C temperature rise. For a 50% chance of 2C or less, just 38% could be burned.
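The report’s reserve arithmetic is easy to reproduce. A minimal sketch, using only the figures quoted above:

```python
# Carbon-budget arithmetic from the Carbon Tracker figures quoted above.
RESERVES_GT_CO2 = 2_860  # billion tonnes of CO2 in currently indicated reserves

# Share of reserves that could be burned at each probability of
# keeping warming below 2C, per the report:
burnable_80pct_chance = 0.31 * RESERVES_GT_CO2  # 80% chance of staying under 2C
burnable_50pct_chance = 0.38 * RESERVES_GT_CO2  # 50% chance of staying under 2C

print(f"Burnable for 80% chance of <2C: {burnable_80pct_chance:,.0f}bn tonnes")
print(f"Burnable for 50% chance of <2C: {burnable_50pct_chance:,.0f}bn tonnes")

# Everything else is potentially "stranded" -- the bubble's size in tonnes:
stranded = RESERVES_GT_CO2 - burnable_80pct_chance
print(f"Stranded in the 80% case: {stranded:,.0f}bn tonnes")
```

In the 80%-chance case, roughly 1,970bn of the 2,860bn tonnes would have to stay in the ground, which is the “at least two-thirds” figure cited earlier in the article.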

Carbon capture and storage technology, which buries emissions underground, can play a role in the future, but even an optimistic scenario which sees 3,800 commercial projects worldwide would allow only an extra 4% of fossil fuel reserves to be burned. There are currently no commercial projects up and running. The normally conservative International Energy Agency has also concluded that a major part of fossil fuel reserves is unburnable.

Citi bank warned investors in Australia’s vast coal industry that little could be done to avoid the future loss of value in the face of action on climate change. “If the unburnable carbon scenario does occur, it is difficult to see how the value of fossil fuel reserves can be maintained, so we see few options for risk mitigation.”

Ratings agencies have expressed concerns, with Standard and Poor’s concluding that the risk could lead to the downgrading of the credit ratings of oil companies within a few years.

Steven Oman, senior vice-president at Moody’s, said: “It behoves us as investors and as a society to know the true cost of something so that intelligent and constructive policy and investment decisions can be made. Too often the true costs are treated as unquantifiable or even ignored.”

Jens Peers, who manages €4bn (£3bn) for Mirova, part of €300bn asset managers Natixis, said: “It is shocking to see the report’s numbers, as they are worse than people realise. The risk is massive, but a lot of asset managers think they have a lot of time. I think they are wrong.” He said a key moment will come in 2015, the date when the world’s governments have pledged to strike a global deal to limit carbon emissions. But he said that fund managers need to move now. If they wait till 2015, “it will be too late for them to take action.”

Pension funds are also concerned. “Every pension fund manager needs to ask themselves have we incorporated climate change and carbon risk into our investment strategy? If the answer is no, they need to start to now,” said Howard Pearce, head of pension fund management at the Environment Agency, which holds £2bn in assets.

Stern and Leaton both point to China as evidence that carbon cuts are likely to be delivered. China’s leaders have said its coal use will peak in the next five years, said Leaton, but this has not been priced in. “I don’t know why the market does not believe China,” he said. “When it says it is going to do something, it usually does.” He said the US and Australia were banking on selling coal to China but that this “doesn’t add up”.

Jeremy Grantham, a billionaire fund manager who oversees $106bn of assets, said his company was on the verge of pulling out of all coal and unconventional fossil fuels, such as oil from tar sands. “The probability of them running into trouble is too high for me to take that risk as an investor.” He said: “If we mean to burn all the coal and any appreciable percentage of the tar sands, or other unconventional oil and gas then we’re cooked. [There are] terrible consequences that we will lay at the door of our grandchildren.”

Carbon Dioxide Removal Can Lower Costs of Climate Protection (Science Daily)

Apr. 12, 2013 — Directly removing CO2 from the air has the potential to alter the costs of climate change mitigation. It could allow sectors such as transport, where turning away from fossil fuels is difficult and therefore expensive, to keep emitting greenhouse gases for longer. And it may help to constrain the financial burden on future generations, a study now published by the Potsdam Institute for Climate Impact Research (PIK) shows. It focuses on the use of biomass for energy generation, combined with carbon capture and storage (CCS). According to the analysis, carbon dioxide removal could, under certain conditions, be used to alleviate the most costly components of mitigation, but it would not replace the bulk of actual emissions reductions.


“Carbon dioxide removal from the atmosphere allows us to separate emissions control from the time and location of the actual emissions. This flexibility can be important for climate protection,” says lead author Elmar Kriegler. “You don’t have to prevent emissions in every factory or truck, but could for instance plant grasses that suck CO2 out of the air to grow — and later get processed in bioenergy plants where the CO2 gets stored underground.”

In economic terms, this flexibility lowers costs by compensating for the emissions that would be most costly to eliminate. “This means that a phase-out of global emissions by the end of the century — which we would need to hold the 2 degree line adopted by the international community — does not necessarily require eliminating each and every source of emissions,” says Kriegler. “Decisions whether and how to protect future generations from the risks of climate change have to be made today, but the burden of achieving these targets will increase over time. The costs for future generations can be substantially reduced if carbon dioxide removal technologies become available in the long run.”

Balancing the financial burden across generations

The study now published is the first to quantify this. If bioenergy plus CCS is available, aggregate mitigation costs over the 21st century might be halved. In the absence of such a carbon dioxide removal strategy, costs for future generations rise significantly, up to a quadrupling of mitigation costs in the period of 2070 to 2090. The calculation was carried out using a computer simulation of the economic system, energy markets, and climate, covering a range of scenarios.

Options for carbon dioxide removal from the atmosphere include afforestation and chemical approaches like direct air capture of CO2 from the atmosphere or reactions of CO2 with minerals to form carbonates. But the use of biomass for energy generation combined with carbon capture and storage is less costly than chemical options, as long as sufficient biomass feedstock is available, the scientists point out.

Serious concerns about large-scale biomass use combined with CCS

“Of course, there are serious concerns about the sustainability of large-scale biomass use for energy,” says co-author Ottmar Edenhofer, chief economist of PIK. “We therefore considered the bioenergy with CCS option only as an example of the role that carbon dioxide removal could play for climate change mitigation.” The exploitation of bioenergy can conflict with land use for food production or ecosystem protection. To account for sustainability concerns, the study restricts bioenergy production to a medium level that could be realized mostly on abandoned agricultural land.

Still, important uncertainties remain: global population growth and changing dietary habits increase the demand for land, while improvements in agricultural productivity decrease it. Furthermore, CCS technology is not yet available for industrial-scale use and, due to environmental concerns, is controversial in countries like Germany. Yet the study assumes it will become available in the near future.

“CO2 removal from the atmosphere could enable humankind to keep the window of opportunity open for low-stabilization targets despite a likely delay in international cooperation, but only under certain requirements,” says Edenhofer. “The risks of scaling up bioenergy use need to be better understood, and safety concerns about CCS have to be thoroughly investigated. Still, carbon dioxide removal technologies are not science fiction and need to be further explored.” In no way should they be seen as a pretext to neglect emissions reductions now, notes Edenhofer. “By far the biggest share of climate change mitigation has to come from a large effort to reduce greenhouse-gas emissions globally.”

Journal Reference:

  1. Elmar Kriegler, Ottmar Edenhofer, Lena Reuster, Gunnar Luderer, David Klein. Is atmospheric carbon dioxide removal a game changer for climate change mitigation? Climatic Change, 2013; DOI: 10.1007/s10584-012-0681-4

Segue o Seco (Rolling Stone)

Issue 77 – February 2013

While Bahia suffers through “the worst drought in 50 years,” the inhabitants of the sertão go to great lengths to overcome the hardships. Hope persists, but it is as scant as the rainwater.

Photo: Flavio Forner

By MAÍRA KUBÍK MANO

“Stop the car! Stop the car! Look over there, on top of the rocks! See it?” No, I saw nothing. The landscape looked exactly as it had for the past half hour. All the color of earth, with the odd catingueira tree on the horizon and the mandacaru cacti, always more numerous, following the line of the dirt road. “Remember the scene where Fabiano tries to catch a preá? Look there!” my companion insists, pointing. Window down, eyes at the ready. Two small animals, brownish and almond-shaped, with pointed snouts, stir and make themselves seen. There they are: the preás, the wild cavies of the sertão. Júlio César Santos is satisfied. After all, it was reading Vidas Secas (Barren Lives) that brought him to the sertão in the first place.

“I’m from the Zona da Mata, but when I read Graciliano Ramos I wanted to come here,” says Santos, an agronomist who fell in love with the caatinga while still a student at the Universidade Federal do Recôncavo Baiano (UFRB). Today he heads the Ipirá office of the EBDA (Empresa Baiana de Desenvolvimento Agrícola, Bahia’s agricultural development agency); Ipirá is one of 258 municipalities in Bahia under a state of emergency because of the drought. Together with 17 other agencies and departments of governor Jaques Wagner’s (PT) administration, the EBDA is part of the State Committee for Actions for Living with the Drought.

We are on our way to the neighboring town of Pintadas, where the drought is even more critical. Along the way we cross four rivers. Three of them are dry. The cloudy sky in the distance seems to herald a change. A drizzle had fallen in the early hours, something that had not happened in a long time. The marks were still on the ground, in shallow furrows that had probably held threads of running water. Santos seems relieved. “Now it needs to rain more,” he says.

At a bend to the left, the house of Messias and Ginalva Jesus Pereira comes into view. Their plot of palma, a spineless forage cactus, immediately stands out from the monochrome: it is dark green, with not a hint of brown. In the drought, the plant has become an indispensable source of food for keeping alive the animals, which no longer have any pasture. “People come, visit, admire it. Others turn envious,” says Ginalva, eyebrows raised. She has lived on that plot for some 20 years.

As was to be expected, the conversation turns to the weather and the drops that fell overnight. “It rained in Ipirá, did it? Ah, here it was just a mist,” retorts little Matheus, Ginalva’s middle son. “It really hasn’t rained here in three years. We lost two calves and two umbu trees to the drought. Dad is praying to God for this last patch of palma to take,” he says, referring to a recently planted area farther from the house, where the green is already almost fading.

Matheus’s arithmetic is no exaggeration. Rain normally falls on the caatinga between January and May, precisely the planting season. In 2012, however, no rain came, and one dry spell ran into the next, making this the worst drought in 50 years, according to Bahia’s Civil Defense Coordination (Cordec). The forecast is that it will last another year or two. “Now, with the rain, it will be something else. Everything will change,” predicts an experienced Ginalva. Like Fabiano, the protagonist of Graciliano Ramos’s novel, she knows the caatinga comes back to life.

At her house, strategically positioned pipes await the next rainfall to channel water into cisterns. Until that happens, Ginalva keeps production going with artificial irrigation: cowpeas, scallions, cilantro, papaya, sweet potatoes and okra, along with sheep, goats and cattle. The recently built well was financed through the emergency line of Pronaf (Programa Nacional de Fortalecimento da Agricultura Familiar, the national program for strengthening family farming).

Like Ginalva, another 6,000 farmers in the region have submitted projects to access the program. According to Banco do Nordeste do Brasil (BNB), R$10 million in emergency Pronaf funds had been released by January 2013 for the 17 municipalities around Feira de Santana, among them Pintadas and Ipirá. “What you see here are small farmers requesting financing to plant palma or build watering holes to recover their pasture,” says José Wilson Junqueira Queiroz, a business manager at BNB. Across Brazil, between May and December 2012, the federal government authorized R$656.2 million in emergency credit lines for those hit by the drought.

“It is these public policies that are keeping families in the countryside,” says Jeane de Almeida Santiago, an agronomist with an NGO called Fundação Apaeba who provides technical assistance to producers in Pintadas, Ipirá, Riachão do Jacuípe, Pé de Serra, Baixa Grande and Nova Fátima, all in Bahia. “Before, many more people were migrating to São Paulo and other states.”

It is the account of someone who knows the situation first-hand. Jeane was born in Pintadas. She studied at the agricultural school and left to take a technical course in Juazeiro and a degree in the Recôncavo Baiano. She came back after graduating, wanting to pass on what she had learned. With lively, attentive eyes, she changes her tone and revises her statement: “Well, but this year many young people are leaving. With the drought, the farms’ profitability is zero. And people won’t stay here without money. Unfortunately, they are forced to leave, broken-hearted, for São Paulo in search of work, to see if they can send money back to the family that stayed behind to keep the herd alive.”

Indeed, the bus stop in Pintadas was crowded that morning. The town still has no bus station, and the asphalt road connecting it to the rest of the world was inaugurated only a year ago, as the state government signs at the entrance announce. Everyone was waiting on the sidewalk for the next ride to the São Paulo state capital, suitcases and relatives standing, sun overhead. About three weeks earlier, Ginalva had said goodbye right there to her eldest son, 18, who decided to try his luck elsewhere. “He called me yesterday saying he has already found a job in a factory. It’s temporary, but it’s a job,” she says. It is the famous Pintadas-São Paulo shuttle.

“The worst part is that the outlook for this year isn’t good,” laments Jeane. She says that even the palma and the mandacaru, also used to feed the herds, have begun to disappear, and that most of the land in the region belongs to small subsistence farmers or cattle ranchers. “For more than a year now the municipality has been handing out animal feed because there is no pasture left. But now the feed has run out. You look and can’t find it. When you do, it costs more than any budget can bear.”

Jeane worries: “There are producers paying off three or four projects at once. There will come a time when no one can take out any more [credit], they owe so much. And then I don’t know what will happen. Because the farms aren’t generating enough to pay the loans they already owe. Without credit, I believe life in the countryside here becomes impossible.”

“The cause of this drought is the destruction of the environment,” she declares, citing a recent study finding that 90% of the region’s native vegetation has disappeared. “Nature is responding. The land is stripped bare. And from there come the fires. Many soils have been lost or are depleted. People don’t have the habit of fertilizing; they just exploit and exploit. The rivers we had have died. The springs have been deforested.”

In Ipirá, right next door, the reality is similar. In place of the caatinga are cattle. The most common sight is cattle or horses crowded under the few remaining trees to escape the scorching sun: heads in the shade, backs in the open. “Ipirá was a municipality full of smallholdings,” explains Orlando Cintra, manager of Agriculture and Cooperatives at the city hall. “The big ranchers began arriving in the 1960s. They bought the land cheap and pushed the men who grew potatoes, cassava and castor beans to the outskirts here, or to São Paulo, Mato Grosso and Paraná.” Many others went to cut sugarcane. “There were no cattle here, and the small producers didn’t clear the land,” he continues. “What we raised most was goats. It was with the arrival of the big ranchers that the climate in Ipirá began to change faster. They cleared the land to plant grass.”

“The caatinga is not an area for cattle ranching. It is for raising goats, sheep, medium-sized animals. They brought the ranching culture of the South, and everyone wanted a cattle ranch here,” adds Meire Oliveira, an adviser at Ipirá’s Department of Agriculture and the Environment.

Meire spent her childhood in the rural part of the municipality and still remembers the smell of that vegetation. She says that as a child she made toy donkeys out of umbu fruit: she would stick four little twigs into the fruit to stand for the four legs. “A pity that, often, when I say not to clear the land, not even my father listens,” she laments. She seems to know every plant in the caatinga. When she finds a coroa-de-frade cactus, she shows that its tiny red fruit is edible. Walking through the region’s properties, she slips through the barbed-wire fences with ease. She picks a handful of still-green maxixe, the bur gherkin, and explains how to cook it. “Just like okra, you know?” In the sertão, everything can be put to use. “The caatinga has an incredible power of regeneration,” she explains. “The solution would be to let it rest. Some areas around the Rio do Peixe are already undergoing desertification.”

One example of environmental preservation is the D. Mathias settlement, now seven years old. There, the caatinga is slowly being reborn among billy goats, nanny goats and sheep. The trees are pruned only enough to keep them from injuring the animals, which roam freely among the aroeiras, xique-xiques and umbu trees. Organized by the Movimento Luta Camponesa (MLC), the settlement has as its symbol a family of retirantes, drought migrants, drawn in black and red. The line is led by a woman with a scythe in her hands. Behind her comes a man with a hoe over his shoulder. Two children, a boy and a girl, follow hand in hand. Last comes a dog that, perhaps, is named Baleia.

Júlio César Santos, the EBDA official, assists the settlers and explains that the peasants keep a close eye on the public policies and credit lines offered by the state and federal governments. With these, they have already managed to build houses, buy a milk cooler and expand their sheep flock. Among the latest initiatives on the site is high-density palma planting, more profitable than the traditional method. At first the farmers did not trust the technique and kept planting the cacti far apart, as they always had. To get around their resistance, Santos used the “Paulo Freire method”: he planted two plots, one with densely spaced palma, the other with the traditional spacing. Now both are growing, and he hopes soon to prove his point. “Let’s hope the lack of rain doesn’t scorch them,” he says.

The settlement’s success inspired, 11 months ago, an encampment on the neighboring estate. Leidinaura Souza Santana, or simply Leila, is one of the residents of the Elenaldo Teixeira camp. “The biggest problem here is water for drinking and cooking. We went almost 15 days without water. The tanker truck only arrived yesterday,” she complains. “Embasa [Empresa Baiana de Águas e Saneamento, the state water utility] suspended the tanker because the river was already very low, and also because the pump broke down,” explains Meire, who is along on the visit. “We had to drink water that isn’t fit to drink,” murmurs Leila.

Leila was born in Coração de Maria, north of Feira de Santana. Her husband was working as a cowhand in Malhador, a village in the municipality of Ipirá, when they heard rumors of the occupation. They came at once to take part. “We are waiting for the moment to enter the farm and end this suffering. The land has already been certified as unproductive. The settlement next door is a marvel. It encouraged me to see that those people were once campers like us. I’m not giving up,” she says. Meire seizes the chance to offer encouragement: “I followed the other camp from the beginning and it was just the same. I think it was even hotter than this one. This one is cooler. And look at them today.”

The conversation takes place at the camp’s school, where young people and adults learn to read and write. The small straw-and-wood building stands at the head of what has been christened “Avenida Brasil,” a neatly aligned row of some 15 tarpaulin shacks. Leila has just passed into the 4th grade of primary school and spells her name out for me. “L-E-I-D-I-N-A-U-R-A.” “Isn’t it with an ‘l’?” asks Meire. “No, it really is with a ‘u’,” Leila replies.

In Tamanduá, a village on the outskirts of Ipirá, motorcycles and donkeys pass by with people and buckets on the back. Everything speaks of the drought. Egecivaldo Oliveira Nunes is at the roadside, at the wheel of a water tanker parked in front of a blue-and-white house. “I only do private work, not for the Army or the city. We take water from the dams because the reservoirs were dry,” he says, adding that on the worst days of the drought he “can’t find time” for all the deliveries requested. Payment is by distance, with the price changing per kilometer driven: 5 kilometers, for a 9,000-liter load, costs R$80. Those who cannot pay (like the campers) can wait for the state Civil Defense, which says it has invested R$4 million in tanker trucks, or for the Army, which supplies water to 137 municipalities every month.

“Every year the drought comes harsher, and the tendency is for it to last longer,” laments Orlando Cintra, Ipirá’s manager of Agriculture and Cooperatives. “The outlook is that in five or six years no one will be producing anything here in agriculture. The climate keeps changing. Every year it gets worse.”

“We’ve had so many forecasts, and nothing,” says Jeane Santiago. “A rain forecast comes on the news and people say: ‘I’ve lost faith, I’ll only believe it when I see it.’ Country people have folk omens, like ‘if the mandacaru flower blooms, it’s a sign of rain.’ But they’ve all failed so far. Faith is running out.” The mandacarus have already flowered. Their deep red catches the eye. Now all there is to do is wait.

In Big Data, We Hope and Distrust (Huffington Post)

By Robert Hall

Posted: 04/03/2013 6:57 pm

“In God we trust. All others must bring data.” — W. Edwards Deming, statistician, quality guru

Big data helped reelect a president and find Osama bin Laden, and it contributed to the meltdown of our financial system. We are in the midst of a data revolution in which social media introduces terms like Arab Spring, Facebook Depression and Twitter anxiety, reflecting a new reality: big data is changing the social and relationship fabric of our culture.

We spend hours installing and learning to use the latest versions of our ever-expanding technology while waging a never-ending battle to protect our information. Then we labor to develop practices for keeping that technology at bay: rules for turning devices off during meetings or movies, legislation to outlaw texting while driving, classroom restrictions to prevent cheating, and meals or family time where devices are turned off. Information and technology: we love it, hate it, can’t live with it, can’t live without it, use it voraciously, and distrust it immensely. I am schizophrenic and so am I.

Big data is not only big but growing rapidly. According to IBM, we create 2.5 quintillion bytes a day and that “ninety percent of the data in the world has been created in the last two years.” Vast new computing capacity can analyze Web-browsing trails that track our every click, sensor signals from every conceivable device, GPS tracking and social network traffic. It is now possible to measure and monitor people and machines to an astonishing degree. How exciting, how promising. And how scary.

This is not our first data rodeo. The early stages of the customer relationship management movement were filled with hope and with hype. Large data warehouses were going to provide the kind of information that would make companies masters of customer relationships. There were just two problems. First, getting the data out of the warehouse wasn’t nearly as hard as getting it into the person or device interacting with the customers in a way that added value, trust and expanded relationships. We seem to always underestimate the speed of technology and overestimate the speed at which we can absorb it and socialize around it.

Second, unfortunately the customers didn’t get the memo and mostly decided in their own rich wisdom they did not need or want “masters.” In fact as providers became masters of knowing all the details about our lives, consumers became more concerned. So while many organizations were trying to learn more about customer histories, behaviors and future needs — customers and even their governments were busy trying to protect privacy, security, and access. Anyone attempting to help an adult friend or family member with mental health issues has probably run into well-intentioned HIPAA rules (regulations that ensure privacy of medical records) that unfortunately also restrict the ways you can assist them. Big data gives and the fear of big data takes away.

Big data does not big relationships make. Over the past 20 years, as our data has kept getting stronger, our customer relationships have kept getting weaker. Eighty-six percent of consumers trust corporations less than they did five years ago. Customer retention across industries has fallen about 30 percent in recent years. Is it actually possible that we have unwittingly contributed to the undermining of our customer relationships? How could that be? For one thing, as companies get better at targeting messages to specific groups, those groups get better at blocking them. As usual, the power to resist trumps the power to exert.

No matter how powerful big data becomes, if it is to realize its potential, it must build trust on three levels. First, customers must trust our intentions. Data that can be used for us can also be used against us. There is growing fear that institutions will become part of a “surveillance state.” While organizations have gone to great lengths to promote protection of our data, the numbers reflect a fair amount of doubt. For example, according to MainStreet, “87 percent of Americans do not feel large banks are transparent and 68 percent do not feel their bank is on their side.”

Second, customers must trust our actions. Even if they trust our intentions, they might still fear that our actions put them at risk. Our private information can be hacked, then misused and disclosed in damaging and embarrassing ways. After the Sandy Hook tragedy, a New York newspaper published the names and addresses of more than 33,000 licensed gun owners, along with an interactive map that showed exactly where they lived. In response, the names and addresses of the newspaper’s editor and writers were published online, along with information about their children. No one, including retired judges, law enforcement officers and FBI agents, expected their private information to be published in the midst of a very high-decibel controversy.

Third, customers must trust the outcome — that sharing data will benefit them. Even with positive intentions and constructive actions, the results may range from disappointing to damaging. Most of us have provided email addresses or other contact data, around a customer-service issue or the like, only to start receiving email, phone or online solicitations. I know a retired executive who helps hard-to-hire people. She spent one evening surfing the Internet to research expunging criminal records for released felons. Years later, Amazon still greets her with books targeted to the felon it believes she is. Even with opt-out options, we feel used. Or we provide specific information, only to have to repeat it in the next transaction or interaction, never getting the hoped-for benefit of saving our time.

It will be challenging to grow trust at anywhere near the rate we grow the data. Information develops rapidly; competence and trust develop slowly. Investing heavily in big data while scrimping on trust will have the opposite of the desired effect. To quote Dolly Parton, who knows a thing or two about big: “It costs a lot of money to look this cheap.”

How Big Could a Man-Made Earthquake Get? (Popular Mechanics)

Scientists have found evidence that wastewater injection induced a record-setting quake in Oklahoma two years ago. How big can a man-made earthquake get, and will we see more of them in the future?

By Sarah Fecht – April 2, 2013 5:00 PM

Hydraulic fracking drilling illustration. Brandon Laufenberg/Getty Images

In November 2011, a magnitude-5.7 earthquake rattled Prague, Okla., and was felt in 16 other states. It flattened 14 homes and many other buildings, injured two people, and set the record as the state’s largest recorded earthquake. And according to a new study in the journal Geology, the event can also claim the title of “Largest Earthquake Ever Induced by Fluid Injection.”

In the paper, a team of geologists pinpoints the quake’s starting point at less than 200 meters (about 650 feet) from an injection well where wastewater from oil drilling was being pumped into the ground at high pressures. At 5.7 magnitude, the Prague earthquake was about 10 times stronger than the previous record holder: a magnitude-4.8 Rocky Mountain Arsenal earthquake in Colorado in 1967, caused by the U.S. Army injecting a deep well with 148,000 gallons per day of fluid wastes from chemical-weapons testing. So how big can these man-made earthquakes get?
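The “about 10 times stronger” comparison comes from the logarithmic magnitude scale, where each whole unit is a tenfold step in shaking amplitude and roughly a 32-fold step in radiated energy. A sketch of the standard conversions applied to the two record holders (the formulas are textbook seismology, not taken from the study):

```python
# Standard magnitude-scale conversions: amplitude ratio = 10**dM,
# radiated-energy ratio = 10**(1.5 * dM), for a magnitude difference dM.
def amplitude_ratio(m1: float, m2: float) -> float:
    return 10 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    return 10 ** (1.5 * (m1 - m2))

# Prague, Okla. (M5.7) vs. Rocky Mountain Arsenal (M4.8)
print(f"amplitude: {amplitude_ratio(5.7, 4.8):.1f}x")  # ~8x shaking amplitude
print(f"energy:    {energy_ratio(5.7, 4.8):.1f}x")     # ~22x radiated energy
```

So the 0.9-unit gap between the two quakes works out to roughly an eightfold increase in shaking amplitude and more than twentyfold in energy; “about 10 times stronger” is the article’s rounding.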

The short answer is that scientists don’t really know yet, but it’s possible that fluid injection could cause some big ones on very rare occasions. “We don’t see any reason that there should be any upper limit for an earthquake that is induced,” says Bill Ellsworth, a geophysicist with the U.S. Geological Survey, who wasn’t involved in the new study.

As with natural earthquakes, most man-made earthquakes have been small to moderate in size, and most are felt only by seismometers. Larger quakes are orders of magnitude rarer than small quakes. For example, for every 1000 magnitude-1.0 earthquakes that occur, expect to see 100 magnitude-2.0s, 10 magnitude-3.0s, just 1 magnitude-4.0, and so on. And just as with natural earthquakes, the strength of the induced earthquake depends on the size of the nearby fault and the amount of stress acting on it. Some faults just don’t have the capacity to cause big earthquakes, whether natural or induced.
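The tenfold drop-off per unit of magnitude described above is the Gutenberg-Richter relation, log10 N = a − b·M with b ≈ 1. A quick sketch; the b value is the standard textbook figure, and the constant a is arbitrary here, chosen only to reproduce the article’s 1000/100/10/1 ratios:

```python
# Gutenberg-Richter relation: N(M) = 10**(a - b*M), i.e. tenfold fewer
# quakes per unit of magnitude when b = 1, as in the example above.
def expected_count(magnitude: float, a: float = 4.0, b: float = 1.0) -> float:
    """Expected number of quakes of a given magnitude (a is illustrative)."""
    return 10 ** (a - b * magnitude)

for m in (1.0, 2.0, 3.0, 4.0):
    print(f"M{m:.1f}: {expected_count(m):.0f}")
# With a = 4: 1000, 100, 10, 1 -- matching the article's ratios.
```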

How do Humans Trigger Earthquakes?

Faults have two major kinds of stressors: shear stress, which makes two plates slide past each other along the fault line, and normal stress, which pushes the two plates together. Usually the normal stress keeps the fault from moving sideways. But when a fluid is injected into the ground, as in Prague, it can reduce the normal stress and make it easier for the fault to slip sideways. It’s as if you had a tall stack of books on a table, Ellsworth says: take half the books away, and it’s easier to slide the stack across the table.

“Water increases the fluid pressure in pores of rocks, which acts against the pressure across the fault,” says Geoffrey Abers, a Columbia University geologist and one of the new study’s authors. “By increasing the fluid pressure, you’re decreasing the strength of the fault.”
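The pore-pressure mechanism Abers describes is usually written as the Coulomb failure criterion: a fault slips once shear stress exceeds the friction coefficient times the effective normal stress (normal stress minus pore pressure). A minimal sketch; the friction coefficient and all stress values below are illustrative assumptions, not measurements from the Prague fault:

```python
# Coulomb failure criterion: slip when shear > mu * (normal - pore_pressure).
# All stresses in MPa; the numbers are illustrative, not field data.
MU = 0.6  # friction coefficient typical of crustal rock

def fault_slips(shear: float, normal: float, pore_pressure: float) -> bool:
    effective_normal = normal - pore_pressure  # injection raises pore pressure
    return shear > MU * effective_normal

# Before injection: pore pressure low, friction holds the fault locked.
print(fault_slips(shear=50.0, normal=100.0, pore_pressure=10.0))  # False (50 < 54)
# After injection raises pore pressure, the same shear stress causes slip.
print(fault_slips(shear=50.0, normal=100.0, pore_pressure=20.0))  # True (50 > 48)
```

The shear stress never changes in this example; the injected fluid simply lowers the frictional resistance until the stress already on the fault is enough to move it.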

A similar mechanism may be behind earthquakes induced by large water reservoirs. In those cases, the artificial lake behind a dam causes water to seep into the pore spaces in the ground. In 1967, India’s Koyna Dam caused a magnitude-6.5 earthquake that killed 177 people, injured more than 2,000, and left 50,000 homeless. Unprecedented seasonal fluctuations in the water level behind a dam in Oroville, Calif., are believed to be behind the magnitude-6.1 earthquake that occurred there in 1975.

Extracting a fluid from the ground can also contribute to triggering a quake. “Think about filling a balloon with water and burying it at the beach,” Ellsworth says. “If you let the water out, the sand will collapse inward.” Similarly, when humans remove large amounts of oil and natural gas from the ground, it can put additional stress on a fault line. “In this case it may be the shear stresses that are being increased, rather than normal stresses,” Ellsworth says.

Take the example of the Gazli gas field in Uzbekistan, thought to be located in a seismically inactive area when drilling began in 1962. As drillers removed the natural gas, the pressure in the gas field dropped from 1030 psi in 1962 to 515 psi in 1976, then down to 218 psi in 1985. Meanwhile, three large magnitude-7.0 earthquakes struck: two in 1976 and one in 1984. Each quake had an epicenter within 12 miles of Gazli and caused a surface uplift of some 31 inches. Because the quakes occurred in Soviet-era Uzbekistan, information about the exact locations, magnitudes, and causes is not available. However, a report by the National Research Council concludes that “observations of crustal uplift and the proximity of these large earthquakes to the Gazli gas field in a previously seismically quiet region strongly suggest that they were induced by hydrocarbon extraction.” Extraction of oil is believed to have caused at least three big earthquakes in California, with magnitudes of 5.9, 6.1, and 6.5.

Some people worry that hydraulic fracturing, or fracking (in which high-pressure fluids are used to crack through rock layers to extract oil and natural gas), will lead to an increased risk of earthquakes. However, the National Research Council report points out that there are tens of thousands of hydrofracking wells in existence today, and there has been only one case in which a “felt” tremor was linked to fracking: a magnitude-2.3 earthquake in Blackpool, England, in 2011, which caused no significant damage. Although scientists have known since the 1920s that humans trigger earthquakes, experts caution that it’s not always easy to determine whether a specific event was induced.

Are Human Activities Making Quakes More Common?

Human activities have been linked to increased earthquake frequencies in certain areas. For instance, researchers have shown a strong correlation between the volume of fluid injected into the Rocky Mountain Arsenal well and the frequency of earthquakes in that area.

Geothermal-energy sites can also induce many earthquakes, possibly due to pressure, heat, and volume changes. The Geysers in California is the largest geothermal field in the U.S., generating 725 megawatts of electricity using steam from deep within the earth. Before The Geysers began operating in 1960, seismic activity was low in the area. Now the area experiences hundreds of earthquakes per year. Researchers have found correlations between the volume of steam production and the number of earthquakes in the region. In addition, as the area of the steam wells increased over the years, so did the spatial distribution of earthquakes.

Whether human activity is also increasing the magnitude of earthquakes, however, is more of a gray area. When it comes to injection wells, evidence suggests that earthquake magnitudes rise along with the volume of injected wastewater, and possibly with injection pressure and injection rate as well, according to a statement from the Department of the Interior.

The vast majority of earthquakes caused by The Geysers are considered to be microseismic events—too small for humans to feel. However, researchers from Lawrence Berkeley National Laboratory note that magnitude-4.0 earthquakes, which can cause minor damage, seem to be increasing in frequency.

The new study says that though earthquakes with a magnitude of 5.0 or greater are rare east of the Rockies, scientists have observed an 11-fold increase between 2008 and 2011, compared with 1976 through 2007. But the increase hasn’t been tied to human activity. “We do not really know what is causing this increase, but it is remarkable,” Abers says. “It is reasonable that at least some may be natural.”

Survey Shows Many Republicans Feel America Should Take Steps to Address Climate Change (Science Daily)

Apr. 2, 2013 — In a recent survey of Republicans and Republican-leaning Independents conducted by the Center for Climate Change Communication (4C) at George Mason University, a majority of respondents (62 percent) said they feel America should take steps to address climate change. More than three out of four survey respondents (77 percent) said the United States should use more renewable energy sources, and of those, most believe that this change should begin immediately.

The national survey, conducted in January 2013, asked more than 700 people who self-identified as Republicans and Republican-leaning Independents about energy and climate change.

“Over the past few years, our surveys have shown that a growing number of Republicans want to see Congress do more to address climate change,” said Mason professor Edward Maibach, director of 4C. “In this survey, we asked a broader set of questions to see if we could better understand how Republicans, and Independents who have a tendency to vote Republican, think about America’s energy and climate change situation.”

Other highlights from the survey include the following:

  • Republicans and Republican-leaning Independents prefer clean energy as the basis of America’s energy future and say the benefits of clean energy, such as energy independence (66 percent), saving resources for our children and grandchildren (57 percent), and providing a better life for our children and grandchildren (56 percent), outweigh the costs, such as more government regulation (42 percent) or higher energy prices (31 percent).
  • By a margin of 2 to 1, respondents say America should take action to reduce its fossil fuel use.
  • Only one third of respondents agree with the Republican Party’s position on climate change, while about half agree with the party’s position on how to meet America’s energy needs.
  • A large majority of respondents say their elected representatives are unresponsive to their views about climate change.

“The findings from this survey suggest there is considerable support among conservatives for accelerating the transition away from fossil fuels and toward clean renewable forms of energy, and for taking steps to address climate change,” said Maibach. “Perhaps the most surprising finding, however, is how few of our survey respondents agreed with the Republican Party’s current position on climate change.”

The report can be downloaded at: http://climatechangecommunication.org

The report is based on findings from a nationally representative survey conducted by the George Mason University Center for Climate Change Communication. A total of 726 adults (18+) were interviewed between January 12 and January 27, 2013. The average margin of error for the survey is +/- 4 percentage points at the 95% confidence level.
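For reference, the quoted +/- 4 points is consistent with the textbook worst-case margin of error for a simple random sample of 726 respondents at 95% confidence, which works out to about 3.6 percentage points before rounding:

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) margin of error for a simple random sample of n."""
    return z * math.sqrt(0.25 / n)

print(round(100 * margin_of_error(726), 1))  # -> 3.6 percentage points
```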

Unearthed: The Fracking Facade (Top Documentary Films)

A video exposing a flawed claim often abused in the sales pitch for promoting shale gas development across the world:

“With a history of 60 years, after nearly a million wells drilled, there are no documented cases that hydraulic fracturing (fracking) has led to the contamination of groundwater.”

Brought to you by the team behind Unearthed, an upcoming South African feature documentary investigating natural gas development and the controversial extraction method known as fracking from a global perspective. Should South Africa and other countries drill down?

Watch the full documentary now


The Mathematics of Averting the Next Big Network Failure (Wired)

BY NATALIE WOLCHOVER, SIMONS SCIENCE NEWS

03.19.13 – 9:30 AM

Data: Courtesy of Marc Imhoff of NASA GSFC and Christopher Elvidge of NOAA NGDC; Image: Craig Mayhew and Robert Simmon of NASA GSFC

Gene Stanley never walks down stairs without holding the handrail. Though a fit 71-year-old, he is deathly afraid of breaking his hip. In the elderly, such breaks can trigger fatal complications, and Stanley, a professor of physics at Boston University, thinks he knows why.

“Everything depends on everything else,” he said.

Original story reprinted with permission from Simons Science News, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Three years ago, Stanley and his colleagues discovered the mathematics behind what he calls “the extreme fragility of interdependency.” In a system of interconnected networks like the economy, city infrastructure or the human body, their model indicates that a small outage in one network can cascade through the entire system, touching off a sudden, catastrophic failure.

First reported in 2010 in the journal Nature, the finding spawned more than 200 related studies, including analyses of the nationwide blackout in Italy in 2003, the global food-price crisis of 2007 and 2008, and the “flash crash” of the United States stock market on May 6, 2010.

“In isolated networks, a little damage will only lead to a little more,” said Shlomo Havlin, a physicist at Bar-Ilan University in Israel who co-authored the 2010 paper. “Now we know that because of dependency between networks, you can have an abrupt collapse.”

While scientists remain cautious about using the results of simplified mathematical models to re-engineer real-world systems, some recommendations are beginning to emerge. Based on data-driven refinements, new models suggest interconnected networks should have backups, mechanisms for severing their connections in times of crisis, and stricter regulations to forestall widespread failure.

“There’s hopefully some sweet spot where you benefit from all the things that networks of networks bring you without being overwhelmed by risk,” said Raissa D’Souza, a complex systems theorist at the University of California, Davis.

Power, gas, water, telecommunications and transportation networks are often interlinked. When nodes in one network depend on nodes in another, node failures in any of the networks can trigger a system-wide collapse. (Illustration: Leonardo Dueñas-Osorio)

To understand the vulnerability in having nodes in one network depend on nodes in another, consider the “smart grid,” an infrastructure system in which power stations are controlled by a telecommunications network that in turn requires power from the network of stations. In isolation, removing a few nodes from either network would do little harm, because signals could route around the outage and reach most of the remaining nodes. But in coupled networks, downed nodes in one automatically knock out dependent nodes in the other, which knock out other dependent nodes in the first, and so on. Scientists model this cascading process by calculating the size of the largest cluster of connected nodes in each network, where the answer depends on the size of the largest cluster in the other network. With the clusters interrelated in this way, a decrease in the size of one of them sets off a back-and-forth cascade of shrinking clusters.
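The back-and-forth cascade described above can be simulated directly. The sketch below is an illustrative toy model, not the authors' code: it couples two random networks node-for-node and iterates the giant-component calculation until it reaches a fixed point.

```python
import random
from collections import deque

def giant_component(alive, adj):
    """Largest connected set among the `alive` nodes, found by BFS."""
    best, seen = set(), set()
    for start in alive:
        if start in seen:
            continue
        seen.add(start)
        comp, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        if len(comp) > len(best):
            best = comp
    return best

def er_graph(n, avg_degree, rng):
    """Erdos-Renyi random graph as an adjacency dict."""
    adj = {i: set() for i in range(n)}
    p = avg_degree / (n - 1)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def cascade(n=1000, avg_degree=4, fail_frac=0.1, seed=1):
    """Remove a fraction of nodes, then iterate the back-and-forth cascade:
    a node survives only while it sits in the giant component of BOTH
    coupled networks (node i in A depends on node i in B and vice versa).
    Returns the fraction of nodes still functioning at the fixed point."""
    rng = random.Random(seed)
    net_a = er_graph(n, avg_degree, rng)
    net_b = er_graph(n, avg_degree, rng)
    alive = set(rng.sample(range(n), int(n * (1 - fail_frac))))
    while True:
        survivors = giant_component(giant_component(alive, net_a), net_b)
        if survivors == alive:
            return len(alive) / n
        alive = survivors

print(cascade())
```

In this toy model, with 10 percent initial failures and an average degree of 4 the coupled system stays largely intact, but pushing `fail_frac` past roughly 40 percent collapses it abruptly, the hallmark of interdependent networks.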

When damage to a system reaches a “critical point,” Stanley, Havlin and their colleagues find that the failure of one more node drops all the network clusters to zero, instantly killing connectivity throughout the system. This critical point will vary depending on a system’s architecture. In one of the team’s most realistic coupled-network models, an outage of just 8 percent of the nodes in one network — a plausible level of damage in many real systems — brings the system to its critical point. “The fragility that’s implied by this interdependency is very frightening,” Stanley said.

However, in another model recently studied by D’Souza and her colleagues, sparse links between separate networks actually help suppress large-scale cascades, demonstrating that network models are not one-size-fits-all. To assess the behavior of smart grids, financial markets, transportation systems and other real interdependent networks, “we have to start from the data-driven, engineered world and come up with the mathematical models that capture the real systems instead of using models because they are pretty and analytically tractable,” D’Souza said.

In a series of papers in the March issue of Nature Physics, economists and physicists used the science of interconnected networks to pinpoint risk within the financial system. In one study, an interdisciplinary group of researchers including the Nobel Prize-winning economist Joseph Stiglitz found inherent instabilities within the highly complex, multitrillion-dollar derivatives market and suggested regulations that could help stabilize it.

Irena Vodenska, a professor of finance at Boston University who collaborates with Stanley, custom-fit a coupled network model to data from the 2008 financial crisis. The analysis she and her colleagues published in February in Scientific Reports showed that modeling the financial system as a network of two networks — banks and bank assets, where each bank is linked to the assets it held in 2007 — correctly predicted which banks would fail 78 percent of the time.

“We consider this model as potentially useful for systemic risk stress testing for financial systems,” said Vodenska, whose research is financially supported by the European Union’s Forecasting Financial Crisis program. As globalization further entangles financial networks, she said, regulatory agencies must monitor “sources of contagion” — concentrations in certain assets, for example — before they can cause epidemics of failure. To identify these sources, “it’s imperative to think in the sense of networks of networks,” she said.

Leonardo Dueñas-Osorio, a civil engineer at Rice, visited a damaged high-voltage substation in Chile after a major earthquake in 2010 to gather information about the power grid’s response to the crisis. (Photo: Courtesy of Leonardo Dueñas-Osorio)

Scientists are applying similar thinking to infrastructure assessment. Leonardo Dueñas-Osorio, a civil engineer at Rice University, is analyzing how lifeline systems responded to recent natural disasters. When a magnitude-8.8 earthquake struck Chile in 2010, for example, most of the power grid was restored after just two days, aiding emergency workers. The swift recovery, Dueñas-Osorio’s research suggests, occurred because Chile’s power stations immediately decoupled from the centralized telecommunications system that usually controlled the flow of electricity through the grid, but which was down in some areas. Power stations were operated locally until the damage in other parts of the system subsided.

“After an abnormal event, the majority of the detrimental effects occur in the very first cycles of mutual interaction,” said Dueñas-Osorio, who is also studying New York City’s response to Hurricane Sandy last October. “So when something goes wrong, we need to have the ability to decouple networks to prevent the back-and-forth effects between them.”

D’Souza and Dueñas-Osorio are collaborating to build accurate models of infrastructure systems in Houston, Memphis and other American cities in order to identify system weaknesses. “Models are useful for helping us explore alternative configurations that could be more effective,” Dueñas-Osorio explained. And as interdependency between networks naturally increases in many places, “we can model that higher integration and see what happens.”

Scientists are also looking to their models for answers on how to fix systems when they fail. “We are in the process of studying what is the optimal way to recover a network,” Havlin said. “When networks fail, which node do you fix first?”

The hope is that networks of networks might be unexpectedly resilient for the same reason that they are vulnerable. As Dueñas-Osorio put it, “By making strategic improvements, can we have what amounts to positive cascades, where a small improvement propagates much larger benefits?”

These open questions have the attention of governments around the world. In the U.S., the Defense Threat Reduction Agency, an organization tasked with safeguarding national infrastructure against weapons of mass destruction, considers the study of interdependent networks its “top mission priority” in the category of basic research. Some defense applications have emerged already, such as a new design for electrical network systems at military bases. But much of the research aims at sorting through the mathematical subtleties of network interaction.

“We’re not yet at the ‘let’s engineer the internet differently’ level,” said Robin Burk, an information scientist and former DTRA program manager who led the agency’s focus on interdependent networks research. “A fair amount of it is still basic science — desperately needed science.”

Climate Change Is Throwing Off the Astrological Forecasts of Amazonian Indians (UOL Notícias)

Carlos A. Moreno

From EFE, in Rio de Janeiro

3/31/2013, 11:59 a.m.

Children of the Ticuna village play in the Solimões River, in Amazonas; the Ticuna are one of the tribes affected by climate change. Patrícia Santos – 30.nov.1999/Folhapress

The forecasts that the Indians of the Brazilian Amazon make with the help of the stars to determine the best time to plant or fish, among other activities, are being disrupted by climate change, according to a study conducted with different indigenous ethnic groups in Brazil.

“The shamans began to complain that their predictions were losing accuracy, and starting from those questions we discovered that some phenomena caused by climate change were affecting their calculations,” astronomer Germano Afonso, the study’s coordinator, told Agência Efe.

According to the specialist, who holds a doctorate in astronomy and celestial mechanics from France’s Université Pierre et Marie Curie, the Indians of the Amazon still use ancestral astrological knowledge to set their calendar and to schedule, among other things, the best dates to plant, harvest, hunt, fish and even hold their religious rituals.

Afonso, who built and operates (with the Indians’ help) a solar observatory in the Amazon, explained that whether or not different constellations are visible, along with their movement across the sky, lets the shamans predict the periods of rain and drought, the flooding of the rivers, the fertility of the land and the spawning of the fish.

“In the tribes we worked with, however, the shamans themselves admit that their predictions were no longer accurate, since the rains came early or late and the rivers dried up before the predicted time. The curious thing is that they themselves blamed climate change,” said the astronomer, who is a professor at the Universidade do Estado do Paraná and the author of several works on the subject, such as “O Céu dos Índios Tembé.”

The team coordinated by Afonso and hired by the Fundação de Apoio à Pesquisa no Estado do Amazonas (Fapeam) to study the issue decided to compare the indigenous knowledge of different ethnic groups (Tukano, Tupé, Dessana, Baré, Tuyuka, Baniwa and Tikuna) with the region’s meteorological measurements to try to identify the flaws in the forecasts.

“With that analysis we realized that some phenomena caused by climate change were distorting the forecasts, given that the rains were delayed or arrived early because of phenomena such as El Niño and deforestation,” said the specialist, who moved to São Gabriel da Cachoeira, an Amazonian city where several ethnic groups converge and where he built the Observatório Solar Indígena.

Afonso clarified that the problem cannot be attributed directly to global warming alone, but also to the phenomena that cause the greenhouse effect and those provoked by it, such as the deforestation of the Amazon, environmental pollution and the construction of dams in the forest.

These phenomena, according to the specialists, alter the periods of rain and of river flooding in the Amazon, which can no longer be predicted from the astronomical knowledge accumulated over centuries and passed down orally among the Indians.

After confirming the problem, the researchers behind the study launched a project to pass on some scientific knowledge to the shamans and thereby help them correct their forecasts.

“We are using modern astronomical calculations and the information collected by the region’s weather stations to help them refine their calculations,” Afonso explained.

“We recover the astrological knowledge they transmit orally and compare it with scientific data to make some adjustments and allow the forecasts to be more precise,” he added.

According to Afonso, with more accurate forecasts the Indians will keep trusting their ability to interpret the stars and the precision of their knowledge, and, best of all, without straying from their culture.

“But we only pass on the data that can help them. We go no further. We do not want to invade, delegitimize or modify anything in their culture. The project has two clear goals: to recover the Indians’ astrological knowledge and to help them improve their forecasts. It is an exchange,” the researcher said.

According to the astronomer, the exchange has been well received because most of his collaborators on the project are indigenous university students, some of them the children or grandchildren of chiefs and shamans of the tribes where they were born.

Ten Times More Hurricane Surges in Future, New Research Predicts (Science Daily)

Mar. 18, 2013 — By examining the frequency of extreme storm surges in the past, previous research has shown an increasing tendency toward hurricane storm surges when the climate was warmer. But how much worse will it get as temperatures rise in the future? How many extreme storm surges like the one from Hurricane Katrina, which hit the U.S. coast in 2005, will there be as a result of global warming? New research from the Niels Bohr Institute shows that there will be a tenfold increase in frequency if the climate becomes two degrees Celsius warmer.

The results are published in the scientific journal Proceedings of the National Academy of Sciences (PNAS).

The extreme storm surge from Superstorm Sandy in the autumn 2012 flooded large sections of New York and other coastal cities in the region. New research shows that such hurricane surges will become more frequent in a warmer climate. (Credit: © Leonard Zhukovsky / Fotolia)

Tropical cyclones arise over warm ocean surfaces with strong evaporation and warming of the air. They typically form in the Atlantic Ocean and move towards the U.S. East Coast and the Gulf of Mexico. To estimate the frequency of tropical cyclones in a future with a warmer global climate, researchers have developed various models. One is based on regional sea temperatures, while another is based on the differences between regional sea temperatures and the average temperatures in the tropical oceans. There is considerable disagreement among researchers about which is best.

New model for predicting cyclones

“Instead of choosing between the two methods, I have chosen to use temperatures from all around the world and combine them into a single model,” explains climate scientist Aslak Grinsted, Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.

He takes into account the individual statistical models and weights them according to how good they are at explaining past storm surges. In this way, he sees that the model reflects the known physical relationships, for example, how the El Niño phenomenon affects the formation of cyclones. The research was performed in collaboration with colleagues from China and England.

The statistical models are used to predict the number of hurricane surges 100 years into the future. How much worse will it be per degree of global warming? How many ‘Katrinas’ will there be per decade?

Since 1923, there has been a ‘Katrina’ magnitude storm surge every 20 years.

10 times as many ‘Katrinas’

“We find that 0.4 degrees Celsius of warming corresponds to a doubling of the frequency of extreme storm surges like the one following Hurricane Katrina. With the global warming we have had during the 20th century, we have already crossed the threshold where more than half of all ‘Katrinas’ are due to global warming,” explains Aslak Grinsted.

“If the temperature rises an additional degree, the frequency will increase by 3-4 times, and if the global climate becomes two degrees warmer, there will be about 10 times as many extreme storm surges. This means that there will be a ‘Katrina’ magnitude storm surge every other year,” says Aslak Grinsted, who points out that in addition to more frequent extreme storm surges, the sea will also rise due to global warming, making the surges worse and potentially more destructive.
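The arithmetic behind “every other year” is just the article's baseline return period (one ‘Katrina’-magnitude surge per 20 years since 1923) divided by the stated frequency multiplier:

```python
def return_period(baseline_years, frequency_multiplier):
    """Years between events after frequency rises by the given factor."""
    return baseline_years / frequency_multiplier

# The article's figures: one 'Katrina'-magnitude surge per 20 years.
print(return_period(20, 2))   # 0.4 C warming, doubled frequency -> every 10 years
print(return_period(20, 10))  # 2 C warming, tenfold frequency -> every 2 years
```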

Journal Reference:

  1. Aslak Grinsted, John C. Moore, and Svetlana Jevrejeva. Projected Atlantic hurricane surge threat from rising temperatures. PNAS, March 18, 2013. DOI: 10.1073/pnas.1209980110

Five Days Until a Global Alert for Environmental Awareness (WWF/Envolverde)

3/18/2013 – 10:50 a.m.

by the WWF Brasil newsroom

Next Saturday, March 23, at 8:30 p.m. (local time), thousands, perhaps billions of people will turn off the lights in their homes, businesses, offices, monuments and other landmarks in a symbolic act of warning against climate change.

It is the ninth edition of Earth Hour, a movement that began modestly in Australia and today involves thousands of cities in more than 152 countries. In Brazil, Earth Hour is promoted by WWF-Brasil, and the goal is to beat last year's numbers by bringing every state capital and the Federal District into Earth Hour and surpassing the 131 cities that took part in 2012.

Your city can follow the example of the many that have already signed up by sending an email to cidades@wwf.org.br and signing the membership agreement. Schools and institutions can take part as well.

Your personal participation is essential to Earth Hour's success. We can no longer consume the equivalent of a planet and a half's worth of resources to sustain ourselves on Earth. Adopt sustainable practices now, and keep them after March 23. Recycle. Reduce. Reuse. Simple changes in your lifestyle can have a major global impact. Avoid wasting water and energy. Turn to alternative sources, such as solar and wind, if possible. Drive less and favor public transportation, cycling and walking. Eat less red meat. Buy local products, organic whenever possible. Learn more about the issue. Our common future is in danger, and the climate changes under way threaten all life on Earth. Your awareness is the greatest weapon against them.

Follow Earth Hour in Brazil through our channels on Facebook, Twitter and YouTube. Spread our message. And check out the “Eu vou se você for” (“I'll go if you go”) challenges: people proposing alternatives so that everyone adopts a more ecologically sound (and, as a bonus, healthier!) lifestyle.

Join us. Earth Hour has already begun, and it cannot end when the lights come back on on Saturday.

* Originally published on the WWF Brasil website.

Big military guy more scared of climate change than enemy guns (Grist)

By Susie Cagle

11 Mar 2013 6:13 PM

Navy Admiral Samuel J. Locklear III, chief of U.S. Pacific Command, doesn’t look like your usual proponent of climate action. Spencer Ackerman writes at Wired that Locklear “is no smelly hippie,” but the guy does believe there will be terrible security threats on a warming planet, which might make him a smelly hippie in the eyes of many American military boosters.

Commander U.S. 7th Fleet

Everyone wants him to be worried about North Korean nukes and Chinese missiles, but in an interview with The Boston Globe, Locklear said that societal upheaval due to climate change “is probably the most likely thing that is going to happen … that will cripple the security environment, probably more likely than the other scenarios we all often talk about.’’

“People are surprised sometimes,” he added, describing the reaction to his assessment. “You have the real potential here in the not-too-distant future of nations displaced by rising sea level. Certainly weather patterns are more severe than they have been in the past. We are on super typhoon 27 or 28 this year in the Western Pacific. The average is about 17.”

Locklear said his Hawaii-based headquarters — which is … responsible for operations from California to India — is working with Asian nations to stockpile supplies in strategic locations and planning a major exercise for May with nearly two dozen countries to practice the “what-ifs.”

Locklear isn’t alone in his climate fears. A recent article by Julia Whitty takes an in-depth look at what the military is doing to deal with climate change. A 2008 report by U.S. intelligence agencies warned about national security challenges posed by global warming, as have later reports from the Department of Defense and the Joint Chiefs of Staff. New Defense Secretary Chuck Hagel understands the threat, too. People may be surprised sometimes, Adm. Locklear, but they really shouldn’t be!

Will not-a-dirty-hippie Locklear’s words help to further mainstream the idea that climate change is a serious security problem? And what all has the good admiral got planned for this emergency sea-rising drill in May?

Susie Cagle writes and draws news for Grist. She also writes and draws tweets for Twitter.

Understanding the Historical Probability of Drought (Science Daily)

Jan. 30, 2013 — Droughts can severely limit crop growth, causing yearly losses of around $8 billion in the United States. But it may be possible to minimize those losses if farmers can synchronize the growth of crops with periods of time when drought is less likely to occur. Researchers from Oklahoma State University are working to create a reliable “calendar” of seasonal drought patterns that could help farmers optimize crop production by avoiding days prone to drought.

Historical probabilities of drought, which can point to days on which crop water stress is likely, are often calculated using atmospheric data such as rainfall and temperatures. However, those measurements do not consider the soil properties of individual fields or sites.

“Atmospheric variables do not take into account soil moisture,” explains Tyson Ochsner, lead author of the study. “And soil moisture can provide an important buffer against short-term precipitation deficits.”

In an attempt to more accurately assess drought probabilities, Ochsner and co-authors Guilherme Torres and Romulo Lollato used 15 years of soil moisture measurements from eight locations across Oklahoma to calculate soil water deficits and determine the days on which dry conditions would be likely. Results of the study, which began as a student-led class research project, were published online Jan. 29 in Agronomy Journal. The researchers found that soil water deficits identified periods during which plants were likely to be water stressed more successfully than traditional atmospheric measurements did when used as proposed by previous research.

Soil water deficit is defined in the study as the difference between the capacity of the soil to hold water and the actual water content calculated from long-term soil moisture measurements. Researchers then compared that soil water deficit to a threshold at which plants would experience water stress and, therefore, drought conditions. The threshold was determined for each study site since available water, a factor used to calculate threshold, is affected by specific soil characteristics.
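In code, the definition above reduces to a subtraction and a threshold test. A minimal sketch (the capacity, water contents, and threshold below are hypothetical; the actual study derives site- and depth-specific thresholds from 15 years of measurements):

```python
def soil_water_deficit(capacity, water_content):
    """Soil water deficit: the soil's capacity to hold water minus the
    actual water content measured at the site."""
    return capacity - water_content

def drought_days(capacity, daily_water_content, stress_threshold):
    """Flag the days on which the deficit exceeds the site-specific
    threshold at which plants experience water stress."""
    return [day for day, w in enumerate(daily_water_content)
            if soil_water_deficit(capacity, w) > stress_threshold]

# Hypothetical daily series (cm of water held in the root zone).
print(drought_days(capacity=30.0,
                   daily_water_content=[25.0, 22.0, 18.0, 14.0, 20.0],
                   stress_threshold=10.0))  # -> [2, 3]
```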

“The soil water contents differ across sites and depths depending on the sand, silt, and clay contents,” says Ochsner. “Readily available water is a site- and depth-specific parameter.”
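The deficit-and-threshold logic described above can be sketched in a few lines of Python (the function names, capacity, threshold and daily values below are hypothetical illustrations, not numbers from the study):

```python
# Soil water deficit = water-holding capacity minus measured water content.
# A day counts as drought-prone when the deficit exceeds a site-specific
# stress threshold derived from readily available water.

def soil_water_deficit(capacity_mm, water_content_mm):
    """Difference between what the soil could hold and what it holds."""
    return capacity_mm - water_content_mm

def is_stress_day(deficit_mm, threshold_mm):
    """True when the deficit is large enough to stress plants."""
    return deficit_mm > threshold_mm

# Hypothetical site: 150 mm capacity, 60 mm stress threshold.
daily_content = [120, 105, 95, 80, 70]   # mm, measured soil water
deficits = [soil_water_deficit(150, c) for c in daily_content]
stress_days = [is_stress_day(d, 60) for d in deficits]
print(deficits)      # [30, 45, 55, 70, 80]
print(stress_days)   # [False, False, False, True, True]
```

Because the threshold is computed per site and depth, two fields receiving identical rainfall can disagree on whether a given day is drought-prone.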

Upon calculating soil water deficits and stress thresholds for the study sites, the research team compared their assessment of drought probability to assessments made using atmospheric data. They found that a previously developed method using atmospheric data often underestimated drought conditions, while soil water deficit measurements assessed drought probabilities more accurately and consistently. The researchers therefore suggest that soil water data be used, whenever available, to build a picture of the days on which drought conditions are likely.

If soil measurements are not available, however, the researchers recommend that the calculations used for atmospheric assessments be reconfigured to be more accurate. The authors made two such changes in their study. First, they decreased the threshold at which plants were deemed stressed, thus allowing a smaller deficit to be considered a drought condition. They also increased the number of days over which atmospheric deficits were summed. Those two changes provided estimates that better agreed with soil water deficit probabilities.
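The paper's two adjustments to the atmospheric method can be illustrated with a short sketch (the window lengths, thresholds and daily values here are invented for illustration, not the study's calibrated parameters):

```python
# Atmospheric water deficit = reference evapotranspiration (ET) minus rain.
# The authors' two adjustments: (1) lower the stress threshold so a smaller
# deficit counts as drought, and (2) sum deficits over more days.

def windowed_sums(daily_deficits, window):
    """Sliding-window sums of the daily atmospheric deficit."""
    return [sum(daily_deficits[i:i + window])
            for i in range(len(daily_deficits) - window + 1)]

# Hypothetical week of ET-minus-rain values, in mm/day.
daily = [et - rain for et, rain in [(5, 0), (6, 1), (5, 0), (7, 2), (6, 0)]]

# Original-style method: 3-day sums against a 20 mm threshold.
original = [s > 20 for s in windowed_sums(daily, 3)]
# Adjusted method: longer 4-day sums against a lower 15 mm threshold.
adjusted = [s > 15 for s in windowed_sums(daily, 4)]
print(original)  # [False, False, False] -- misses the dry spell
print(adjusted)  # [True, True]          -- flags it
```

In this toy example the original settings never flag drought while the adjusted ones do, which mirrors the underestimation the authors report for the unmodified atmospheric method.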

Further research is needed, says Ochsner, to optimize atmospheric calculations and provide accurate estimations for those without soil water data. “We are in a time of rapid increase in the availability of soil moisture data, but many users will still have to rely on the atmospheric water deficit method for locations where soil moisture data are insufficient.”

Regardless of the method used, Ochsner and his team hope that their research will help farmers better plan the cultivation of their crops and avoid costly losses to drought conditions.

Journal Reference:

  1. Guilherme M. Torres, Romulo P. Lollato, Tyson E. Ochsner. Comparison of Drought Probability Assessments Based on Atmospheric Water Deficit and Soil Water Deficit. Agronomy Journal, 2013; DOI: 10.2134/agronj2012.0295

Scientists Underestimated Potential for Tohoku Earthquake: Now What? (Science Daily)

Jan. 23, 2013 — The massive Tohoku, Japan, earthquake in 2011 and Sumatra-Andaman superquake in 2004 stunned scientists because neither region was thought to be capable of producing a megathrust earthquake with a magnitude exceeding 8.4.

Seismograph. (Credit: © huebi71 / Fotolia)

Now earthquake scientists are going back to the proverbial drawing board and admitting that existing predictive models looking at maximum earthquake size are no longer valid.

In a new analysis published in the journal Seismological Research Letters, a team of scientists led by Oregon State University’s Chris Goldfinger describes how past global estimates of earthquake potential were constrained by short historical records and even shorter instrumental records. To gain a better appreciation for earthquake potential, he says, scientists need to investigate longer paleoseismic records.

“Once you start examining the paleoseismic and geodetic records, it becomes apparent that there had been the kind of long-term plate deformation required by a giant earthquake such as the one that struck Japan in 2011,” Goldfinger said. “Paleoseismic work has confirmed several likely predecessors to Tohoku, at about 1,000-year intervals.”

The researchers also identified long-term “supercycles” of energy within plate boundary faults, which appear to store this energy like a battery for many thousands of years before yielding a giant earthquake and releasing the pressure. In the meantime, smaller earthquakes occur, but they do not dissipate the stored energy to any great extent.

The newly published analysis acknowledges that scientists historically may have underestimated the number of regions capable of producing major earthquakes on a scale of Tohoku.

“Since the 1970s, scientists have divided the world into plate boundaries that can generate 9.0 earthquakes versus those that cannot,” said Goldfinger, a professor in OSU’s College of Earth, Ocean, and Atmospheric Sciences. “Those models were already being called into question when Sumatra drove one stake through their heart, and Tohoku drove the second one.

“Now we have no models that work,” he added, “and we may not have for decades. We have to assume, however, that the potential for 9.0 subduction zone earthquakes is much more widespread than originally thought.”

Both Tohoku and Sumatra were written off in the textbooks as not having the potential for a major earthquake, Goldfinger pointed out.

“Their plate age was too old, and they didn’t have a really large earthquake in their recent history,” Goldfinger said. “In fact, if you look at a northern Japan seismic risk map from several years ago, it looks quite benign — but this was an artifact of recent statistics.”

Paleoseismic evidence of subduction zone earthquakes is not yet plentiful in most cases, so little is known about the long-term earthquake potential of most major faults. Scientists can determine whether a fault has ruptured in the past — when and to what extent — but they cannot easily estimate how big a specific earthquake might have been. Most, Goldfinger says, fall into ranges — say, 8.4 to 8.7.

Nevertheless, that type of evidence can be more telling than historical records because it may take many thousands of years to capture the full range of earthquake behavior.

In their analysis, the researchers point to several subduction zone areas that previously had been discounted as potential 9.0 earthquake producers — but may be due for reconsideration. These include central Chile, Peru, New Zealand, the Kuriles fault between Japan and Russia, the western Aleutian Islands, the Philippines, Java, the Antilles Islands and Makran, Pakistan/Iran.

Onshore faults such as the Himalayan Front may also be hiding outsized earthquakes, the researchers add. Their work was supported by the National Science Foundation.

Goldfinger, who directs the Active Tectonics and Seafloor Mapping Laboratory at Oregon State, is a leading expert on the Cascadia Subduction Zone off the Pacific Northwest coast of North America. His comparative studies have taken him to the Indian Ocean, Japan and Chile, and in 2007, he led the first American research ship into Sumatra waters in nearly 30 years to study similarities between the Indian Ocean subduction zone and Cascadia.

Paleoseismic evidence abounds in the Cascadia Subduction Zone, Goldfinger pointed out. When a major offshore earthquake occurs, the disturbance causes mud and sand to begin streaming down the continental margins and into the undersea canyons. Coarse sediments called turbidites run out onto the abyssal plain; these sediments stand out distinctly from the fine particulate matter that accumulates on a regular basis between major tectonic events.

By dating the fine particles through carbon-14 analysis and other methods, Goldfinger and colleagues can estimate with a great deal of accuracy when major earthquakes have occurred. Over the past 10,000 years, there have been 19 earthquakes that extended along most of the Cascadia Subduction Zone margin, stretching from southern Vancouver Island to the Oregon-California border.
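As back-of-envelope arithmetic, 19 margin-wide ruptures in roughly 10,000 years works out to a mean recurrence interval on the order of 500 years (a crude average only; actual paleoseismic intervals are irregular):

```python
# Back-of-envelope recurrence: 19 full-margin Cascadia ruptures
# identified in the ~10,000-year turbidite record quoted above.
years_of_record = 10_000
full_margin_events = 19

mean_interval = years_of_record / full_margin_events
print(round(mean_interval))  # 526 -> roughly one such quake per ~500 years
```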

“These would typically be of a magnitude from about 8.7 to 9.2 — really huge earthquakes,” Goldfinger said. “We’ve also determined that there have been 22 additional earthquakes that involved just the southern end of the fault. We are assuming that these are slightly smaller — more like 8.0 — but not necessarily. They were still very large earthquakes that if they happened today could have a devastating impact.”

Other researchers on the analysis include Yasutaka Ikeda of University of Tokyo, Robert S. Yeats of Oregon State University, and Junjie Ren, of the Chinese Seismological Bureau.

Journal Reference:

  1. C. Goldfinger, Y. Ikeda, R. S. Yeats, J. Ren. Superquakes and Supercycles. Seismological Research Letters, 2013; 84 (1): 24. DOI: 10.1785/0220110135

Cacique Cobra Coral Breaks Its Partnership With City Hall (O Globo)

The city government had allegedly failed to deliver, within the agreed deadlines, reports summarizing last year's investments in disaster prevention in the city

O GLOBO

Published: Jan. 14, 2013 – 12:08 a.m.

RIO — At the height of the Rio summer, the city's flood alert and prevention system has lost an unusual collaborator. On Sunday, Osmar Santos, spokesman for the Cacique Cobra Coral Foundation, announced that the foundation has broken off the technical-scientific agreement it maintained with Rio's city hall. The reason: city hall failed to deliver, within the agreed deadlines, reports summarizing last year's investments in disaster prevention in the city. The NGO is led by the medium Adelaide Scritori, who claims to have the power to control the weather. Since the administration of former mayor Cesar Maia, Adelaide has been on hand to provide spiritual assistance in an effort to reduce the damage caused by heavy storms. In January 2009, city hall went so far as to announce the end of the partnership, but backed down after a heavy rainstorm.

“Somebody in the bureaucracy was too busy and forgot about us. But should city hall wish to continue receiving our consulting, which is free of charge, we remain at its disposal,” said Osmar Santos.


Heat, Flood or Icy Cold, Extreme Weather Rages Worldwide (N.Y.Times)

NY Times

January 10, 2013

By SARAH LYALL

WORCESTER, England — Britons may remember 2012 as the year the weather spun off its rails in a chaotic concoction of drought, deluge and flooding, but the unpredictability of it all turns out to have been all too predictable: Around the world, extreme has become the new commonplace.

Especially lately. China is enduring its coldest winter in nearly 30 years. Brazil is in the grip of a dreadful heat spell. Eastern Russia is so freezing — minus 50 degrees Fahrenheit, and counting — that the traffic lights recently stopped working in the city of Yakutsk.

Bush fires are raging across Australia, fueled by a record-shattering heat wave. Pakistan was inundated by unexpected flooding in September. A vicious storm bringing rain, snow and floods just struck the Middle East. And in the United States, scientists confirmed this week what people could have figured out simply by going outside: last year was the hottest since records began.

“Each year we have extreme weather, but it’s unusual to have so many extreme events around the world at once,” said Omar Baddour, chief of the data management applications division at the World Meteorological Organization, in Geneva. “The heat wave in Australia; the flooding in the U.K., and most recently the flooding and extensive snowstorm in the Middle East — it’s already a big year in terms of extreme weather calamity.”

Such events are increasing in intensity as well as frequency, Mr. Baddour said, a sign that climate change is not just about rising temperatures, but also about intense, unpleasant, anomalous weather of all kinds.

Here in Britain, people are used to thinking of rain as the wallpaper on life’s computer screen — an omnipresent, almost comforting background presence. But even the hardiest citizen was rattled by the near-biblical fierceness of the rains that bucketed down, and the floods that followed, three different times in 2012.

Rescuers plucked people by boat from their swamped homes in St. Asaph, North Wales. Whole areas of the country were cut off when roads and train tracks were inundated at Christmas. In Mevagissey, Cornwall, a pub owner closed his business for good after it flooded 11 times in two months.

It was no anomaly: the floods of 2012 followed the floods of 2007 and also the floods of 2009, which all told have resulted in nearly $6.5 billion in insurance payouts. The Met Office, Britain’s weather service, declared 2012 the wettest year in England, and the second-wettest in Britain as a whole, since records began more than 100 years ago. Four of the five wettest years in the last century have come in the past decade (the fifth was in 1954).

The biggest change, said Charles Powell, a spokesman for the Met Office, is the frequency in Britain of “extreme weather events” — defined as rainfall reaching the top 1 percent of the average amount for that time of year. Fifty years ago, such episodes used to happen every 100 days; now they happen every 70 days, he said.
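The Met Office definition amounts to a percentile cutoff; a minimal sketch in Python, using made-up rainfall data (the function and numbers are hypothetical, not the Met Office's actual method):

```python
# "Extreme weather event" per the definition quoted above: rainfall in
# the top 1 percent of the historical distribution for that time of year.

def extreme_threshold(historical_mm, pct=0.01):
    """Rainfall amount that only the top `pct` of historical days reach."""
    ranked = sorted(historical_mm, reverse=True)
    cutoff = max(int(len(ranked) * pct) - 1, 0)
    return ranked[cutoff]

# Hypothetical 1,000-day record with daily totals of 0-49 mm.
history = [day % 50 for day in range(1000)]
threshold = extreme_threshold(history)   # cutoff set by the top 10 days
print(threshold)  # 49
```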

The same thing is true in Australia, where bush fires are raging across Tasmania and the current heat wave has come after two of the country’s wettest years ever. On Tuesday, Sydney experienced its fifth-hottest day since records began in 1910, with the temperature climbing to 108.1 degrees. The first eight days of 2013 were among the 20 hottest on record.

Every decade since the 1950s has been hotter in Australia than the one before, said Mark Stafford Smith, science director of the Climate Adaptation Flagship at the Commonwealth Scientific and Industrial Research Organization.

To the north, the extremes have swung the other way, with a band of cold settling across Russia and Northern Europe, bringing thick snow and howling winds to Stockholm, Helsinki and Moscow. (Incongruously, there were also severe snowstorms in Sicily and southern Italy for the first time since World War II; in December, tornadoes and waterspouts struck the Italian coast.)

In Siberia, thousands of people were left without heat when natural gas liquefied in its pipes and water mains burst. Officials canceled bus transportation between cities for fear that roadside breakdowns could lead to deaths from exposure, and motorists were advised not to venture far afield except in columns of two or three cars. In Altai, to the east, traffic officials warned drivers not to use poor-quality diesel, saying that it could become viscous in the cold and clog fuel lines.

Meanwhile, China is enduring its worst winter in recent memory, with frigid temperatures recorded in Harbin, in the northeast. In the western region of Xinjiang, more than 1,000 houses collapsed under a relentless onslaught of snow, while in Inner Mongolia, 180,000 livestock froze to death. The cold has wreaked havoc with crops, sending the price of vegetables soaring.

Way down in South America, energy analysts say that Brazil may face electricity rationing for the first time since 2002, as a heat wave and a lack of rain deplete the reservoirs for hydroelectric plants. The summer has been punishingly hot. The temperature in Rio de Janeiro climbed to 109.8 degrees on Dec. 26, the city’s highest temperature since official records began in 1915.

At the same time, in the Middle East, Jordan is battling a storm packing torrential rain, snow, hail and floods that are cascading through tunnels, sweeping away cars and spreading misery in Syrian refugee camps. Amman has been virtually paralyzed, with cars abandoned, roads impassable and government offices closed.

Israel and the Palestinian territories are grappling with similar conditions, after a week of intense rain and cold winds ushered in a snowstorm that dumped eight inches in Jerusalem alone.

Amir Givati, head of the surface water department at the Israel Hydrological Service, said the storm was truly unusual because of its duration, its intensity and its breadth. Snow and hail fell not just in the north, but as far south as the desert city of Dimona, best known for its nuclear reactor.

In Beirut on Wednesday night, towering waves crashed against the Corniche, the seaside promenade downtown, flinging water and foam dozens of feet in the air as lightning flickered across the dark sea at multiple points along the horizon. Many roads were flooded as hail pounded the city.

Several people died, including a baby boy in a family of shepherds who was swept out of his mother’s arms by floodwaters. The greatest concern was for the 160,000 Syrian refugees who have fled to Lebanon, taking shelter in schools, sheds and, where possible, with local families. Some refugees are living in farm outbuildings, which are particularly vulnerable to cold and rain.

Barry Lynn, who runs a forecasting business and is a lecturer at the Hebrew University’s department of earth science, said a striking aspect of the whole thing was the severe and prolonged cold in the upper atmosphere, a big-picture shift that indicated the Atlantic Ocean was no longer having the moderating effect on weather in the Middle East and Europe that it has historically.

“The intensity of the cold is unusual,” Mr. Lynn said. “It seems the weather is going to become more intense; there’s going to be more extremes.”

In Britain, where changes to the positioning of the jet stream — a ribbon of air high up in the atmosphere that helps steer weather systems — may be contributing to the topsy-turvy weather, people are still recovering from the December floods. In Worcester last week, the river Severn remained flooded after three weeks, with playing fields buried under water.

In the shop at the Worcester Cathedral, Julie Smith, 54, was struggling, she said, to adjust to the new uncertainty.

“For the past seven or eight years, there’s been a serious incident in a different part of the country,” Mrs. Smith said. “We don’t expect extremes. We don’t expect it to be like this.”

Reporting was contributed by Jodi Rudoren from Jerusalem; Irit Pazner Garshowitz from Tzur Hadassah, Israel; Fares Akram from Gaza City, Gaza; Ellen Barry and Andrew Roth from Moscow; Ranya Kadri from Amman, Jordan; Dan Levin from Harbin, China; Jim Yardley from New Delhi; Anne Barnard from Beirut, Lebanon; Matt Siegel from Sydney, Australia; Scott Sayare from Paris; and Simon Romero from Rio de Janeiro.

*   *   *

 It’s Official: 2012 Was Hottest Year Ever in U.S.

By JUSTIN GILLIS

NY Times, January 8, 2013

http://www.nytimes.com/2013/01/09/science/earth/2012-was-hottest-year-ever-in-us.html?hp&_r=0

The numbers are in: 2012, the year of a surreal March heat wave, a severe drought in the corn belt and a massive storm that caused broad devastation in the mid-Atlantic states, turns out to have been the hottest year ever recorded in the contiguous United States.

How hot was it? The temperature differences between years are usually measured in fractions of a degree, but last year blew away the previous record, set in 1998, by a full degree Fahrenheit.

If that does not sound sufficiently impressive, consider that 34,008 new daily high records were set at weather stations across the country, compared with only 6,664 new record lows, according to a count maintained by the Weather Channel meteorologist Guy Walton, using federal temperature records.

That ratio, which was roughly in balance as recently as the 1970s, has been out of whack for decades as the country has warmed, but never by as much as it was last year.

“The heat was remarkable,” said Jake Crouch, a scientist with the National Climatic Data Center in Asheville, N.C., which released the official climate compilation on Tuesday. “It was prolonged. That we beat the record by one degree is quite a big deal.”

Scientists said that natural variability almost certainly played a role in last year’s extreme heat and drought. But many of them expressed doubt that such a striking new record would have been set without the backdrop of global warming caused by the human release of greenhouse gases. And they warned that 2012 was likely a foretaste of things to come, as continuing warming makes heat extremes more likely.

Even so, the last year’s record for the United States is not expected to translate into a global temperature record when figures are released in coming weeks. The year featured a La Niña weather pattern, which tends to cool the global climate over all, and scientists expect it to be the world’s eighth or ninth warmest year on record.

Assuming that prediction holds up, it will mean that the 10 warmest years on record all fell within the past 15 years, a measure of how much the planet has warmed. Nobody who is under 28 has lived through a month of global temperatures that fell below the 20th-century average, because the last such month was February 1985.

Last year’s weather in the United States began with an unusually warm winter, with relatively little snow across much of the country, followed by a March that was so hot that trees burst into bloom and swimming pools opened early. The soil dried out in the March heat, helping to set the stage for a drought that peaked during the warmest July on record.

The drought engulfed 61 percent of the nation, killed corn and soybean crops and sent prices spiraling. It was comparable to a severe drought in the 1950s, Mr. Crouch said, but not quite as severe as the legendary Dust Bowl drought of the 1930s, which was exacerbated by poor farming practices that allowed topsoil to blow away.

Extensive records covering the lower 48 states go back to 1895; Alaska and Hawaii have shorter records and are generally not included in long-term climate comparisons for that reason.

Mr. Crouch pointed out that until last year, the coldest year in the historical record for the lower 48 states, 1917, was separated from the warmest year, 1998, by only 4.2 degrees Fahrenheit. That is why the 2012 record, and its one degree increase over 1998, strikes climatologists as so unusual.

“We’re taking quite a large step above what the period of record has shown for the contiguous United States,” he said.

In addition to being the nation’s warmest year, 2012 turned out to be the second-worst on a measure called the Climate Extremes Index, surpassed only by 1998.

Experts are still counting, but so far 11 disasters in 2012 have exceeded a threshold of $1 billion in damages, including several tornado outbreaks; Hurricane Isaac, which hit the Gulf Coast in August; and, late in the year, Hurricane Sandy, which caused damage likely to exceed $60 billion in nearly half the states, primarily in the mid-Atlantic region.

Among those big disasters was one bearing a label many people had never heard before: the derecho, a line of severe, fast-moving thunderstorms that struck central and eastern parts of the country starting on June 29, killing more than 20 people, toppling trees and knocking out power for millions of households.

For people who escaped both the derecho and Hurricane Sandy relatively unscathed, the year may be remembered most for the sheer breadth and oppressiveness of the summer heat wave. By the calculations of the climatic data center, a third of the nation’s population experienced 10 or more days of summer temperatures exceeding 100 degrees Fahrenheit.

Among the cities that set temperature records in 2012 were Nashville; Athens, Ga.; and Cairo, Ill., all of which hit 109 degrees on June 29; Greenville, S.C., which hit 107 degrees on July 1; and Lamar, Colo., which hit 112 degrees on June 27.

With the end of the growing season, coverage of the drought has waned, but the drought itself has not. Mr. Crouch pointed out that at the beginning of January, 61 percent of the country was still in moderate to severe drought conditions. “I foresee that it’s going to be a big story moving forward in 2013,” he said.

On the end of the world (21.12.2012)

The World Didn't End (Folha de S.Paulo)

Contardo Calligaris – Dec. 27, 2012 – 3:00 a.m.

The world may end between today (Monday, the day I am writing) and Thursday the 27th, the day this column is to be published. In theory, I needn't worry: my headline cannot be proven wrong, for if the world does end, there will be no one left to verify that I was mistaken.

All of this within limits, since the end of the world awaited (more or less anxiously) by some (or by many) is not the definitive and complete disappearance of the species. On the contrary: as a rule, whoever fantasizes about the end of the world sees himself as one of the survivors and, imagining the hardships of a destroyed world, equips himself accordingly.

In U.S. culture, “survivalists” are also “preppers”: that is, those who plan to survive prepare. The imminent catastrophe may be yet another “deserved” divine vengeance against Sodom and Gomorrah, the fulfillment of an ancient prophecy, the consequence of a war (nuclear, chemical or biological), the effect of global warming or, finally (the latest fashion), the result of a financial crisis that would drive everyone into ruin and hunger.

The survivors' preparation may or may not include relocating to safer places (underground shelters, mountain peaks that for some reason will be spared, “mystical” places under divine protection, rendezvous platforms for extraterrestrials, etc.), but it rarely dispenses with stockpiling the basic means of subsistence (food, water, medicine, fuel, generators, batteries) and (for your own good, don't forget this) weapons of every kind (hunting and defense) with an outlandish quantity of ammunition, not to mention bulletproof vests and explosives.

Suppose you feel like asking, “weapons for what?” After all, you might say, perhaps we will need hunting weapons, since the corner supermarket will be closed. But why the defensive weapons? If there really were a catastrophe, couldn't it lead us to discover new forms of solidarity among those who remain? Well, if you ask that kind of question, you don't fantasize about the end of the world.

To understand what the end-of-the-world fantasy consists of, there is no need to compare the various possible post-catastrophic futures. Nor need we consider whether, for example, the assorted desolate scenarios of the day after include an encounter with an Adam or an Eve with whom to restart the species. Those are mere variations, whereas the need for weapons (and not only for hunting the last rabbits and pheasants) is a constant, and it reveals the central dream behind the expectation of the end of the world.

In all the ends of the world that populate modern daydreams, some or many survive (among them, obviously, the dreamer), but what always succumbs is the social order. The catastrophe, whatever it may be, serves to guarantee that there will be no more state, county, municipality, law, police, nation or condominium board. No kind of instituted collectivity will survive the end of the world. In it (and thanks to it), every obligation emanating from the collectivity, and from others in general, will lose its force and its value: we will be, as we never were before, individuals, depending solely on ourselves.

That is the desire behind end-of-the-world dreams: the end of any primacy of collective life over our private choices. What seems just to us, in our innermost conscience, will always try to prevail over what, in other times, would or would not have conformed to the law.

That is why, after the end of the world, we will relate to one another without mediation: without judges, without priests, without sages, without parents, without any recognized authority. We will face one another, in love and in hatred, with one hand always resting on the holster.

And one need not explicitly wish for the end of the world to feel its charm. Direct confrontation between individuals may well be the dramatic situation preferred by the narratives that make us dream: the hard story of the pioneer, the soldier, the police officer or the criminal, wandering through a territory where nothing (beyond their own conscience) can serve as a guide and where nothing imposes itself except by force.

In my last column, I discussed the case of the young man who killed his mother and massacred 20 children and six adults at an elementary school in Newtown, Connecticut. Well, the mother was a “survivalist”; she was preparing for the end of the world. Perhaps, along with the accumulated weapons and ammunition, she passed on to her son some version of her end-of-the-world daydream.

*   *   *

Are You Prepared for Zombies? (American Anthropological Association blog)

By Joslyn O. – December 21, 2012 at 12:52 pm

 

In light of all the end of the world talk, a repost of this Zombie preppers post from last spring:

Today’s guest blog post is by cultural anthropologist and AAA member, Chad Huddleston. He is an Assistant Professor at St. Louis University in the Sociology, Anthropology and Criminal Justice department.

Recently, a host of new shows, such as Doomsday Preppers on NatGeo and Doomsday Bunkers on Discovery Channel, have focused on people with a wide array of concerns about possible events that may threaten their lives. Both shows focus on what are called ‘preppers.’ While people who engaged in these behaviors in the past might have been called ‘survivalists,’ many preppers have distanced themselves from that term because of its cultural baggage: stereotypically anti-government, gun-loving, racist extremists, most often associated with the fundamentalist (politically and religiously) right side of the spectrum.

I’ve been doing fieldwork with preppers for the past two years, focusing on a group called Zombie Squad. It is ‘the nation’s premier non-stationary cadaver suppression task force,’ as well as a grassroots 501(c)(3) charity organization. Zombie Squad’s story is that while the zombie-removal business is generally slow, there is no reason to be unprepared. So, while it waits for the “zombpocalypse,” it devotes its time to disaster-preparedness education for its membership and the community.

The group’s position is that being prepared for zombies means that you are prepared for anything, especially those events that are much more likely than a zombie uprising – tornadoes, an interruption in services, ice storms, flooding, fires, and earthquakes.

For many in this group, Hurricane Katrina was the event that solidified their resolve to prep. They saw what we all saw: a natural disaster in which services were not available to most, leading to violence, death and chaos. Their argument is that the more prepared the public is before a disaster occurs, the fewer resources they will require from first responders and the agencies that come after them.

In fact, instead of being a victim of natural disaster, you can be an active responder yourself, if you are prepared.  Prepare they do.  Members are active in gaining knowledge of all sorts – first aid, communications, tactical training, self-defense, first responder disaster training, as well as many outdoor survival skills, like making fire, building shelters, hunting and filtering water.

This education happens individually, through the online forum they maintain (which has just under 30,000 active members from all over the world), through monthly local meetings across the country, and at annual national gatherings in southern Missouri, where members socialize, learn survival skills and practice sharpshooting.

Sound like those survivalists of the past? Emphatically no. Zombie Squad’s message is one of public education and awareness, very successful charity drives for a wide array of organizations, and inclusion of all ethnicities, genders, religions and politics. Yet the group is adamant about leaving politics and religion out of discussions of the group and of prepping. You will not find exclusionary language on their forum or in their media. That is not to say that individuals in the group lack opinions on one side or the other of these issues, but those issues are simply not to be discussed within the Zombie Squad community.

Considering that the ‘future doom’ scenarios and fears pushed on the shows mentioned above usually involve protecting yourself first from the disaster and then from the other people who have survived it, Zombie Squad is a refreshing twist on the ‘prepper’ discourse. After all, if a natural disaster were to befall your region, whom would you rather have knocking at your door: ‘raiders’ or your neighborhood Zombie Squad member?

And the answer is no: they don’t really believe in zombies.

 

Marginalized Water: A Reflection of Society (Envolverde)

Dec. 9, 2012 – 10:35 a.m.

by Sarah Bueno Motter and Giovani de Oliveira, EcoAgência

The Dilúvio is the largest stream crossing the city of Porto Alegre. Photo: Divulgação/Internet

The banks are a limit: how far the Dilúvio goes, how far it may go. Hemmed in by human concrete, the stream that cuts through the state capital is part of the city’s routine. Along its banks are Porto Alegre’s traffic jams and anxiety. At its edges, at rush hour, is the stress of wanting to get quickly to the other side of the city and never reaching the intended speed. The pollution that runs inside the Dilúvio also passes along its margins, which are contaminated by society’s exhaustion with its own routine.

The banks of the Dilúvio overflow with the emptiness of our civilization, which rushes along without even knowing why. A civilization that leaves on its margins those who lack capital and equal opportunities, those without a car, those without a home. They remain at the margins.

The edges also reflect new trends: the desire for bike lanes, for clean transport. They speak of a new path the city “wants” to open. A path toward sustainability.

But sustainability does not walk alongside misery and inequality, and it is no partner of neglect. Sustainability is not a matter of appearances. It is not marked out by fragile changes without solid substance, without the ambition of a metamorphosis. It does not start from nothing and arrive nowhere. It is not inaugurated with a single block of bike lane; it is an entire road.

The water, when it falls into the Dilúvio, makes the characteristic sound of streams, the sound we so often want to take home by buying a decorative fountain. The sound is beautiful and distinctive, but the concrete separates the city from nature, which, soiled by our waste, continues on its way. The banks of the Dilúvio are a synthesis of what we are: the cars, the excluded, the filth, the “new paths,” and the nature that persists and lives amid the gray of human ambition.

The Dilúvio is the symbol of a precarious, individualistic and aggressive society. Like many of the children who live under its bridges, its waters are abused from the very beginning of their life. Already at its source, at Lomba do Sabão, the stream is violated by irregular occupation of the area. Families without decent housing occupy a legally protected site and dump their waste into the Dilúvio’s waters. People brutalized by a society of having, with no room to try to be, in turn brutalize the stream and invade its space.

A space that exists less and less. Space increasingly taken up by garbage, space we no longer have. The space that could be for leisure, for contact with nature in the middle of the city, becomes a space we flee from. Not by chance, some people argue that the Dilúvio should be covered over. They advocate a great concrete lid, one that would not heal the wound but would keep us from seeing or feeling it.

Yet, incredibly, abused from beginning to end, the Dilúvio remains alive. Its waters are home to fish, caught by improbable Porto Alegre seagulls. And its banks, lined with gray, still keep a green that insists on staying alive.

* Originally published on the EcoAgência website.

Visualizing The Way Americans Value Water (fastcoexist.com)

By Ariel Schwartz (accessed December 17, 2012)

It’s a pretty precious resource, considering that we need it to live. But do we actually care enough to change our behavior to make sure we have it in the future?

The aging water infrastructure in the U.S. is fragile, to say the least; every year, over 1.7 trillion gallons of water are lost due to leaks and breaks in the system. It’s never good to waste water, but that’s a staggeringly unacceptable figure at a time when the country is facing unprecedented droughts. But on a grassroots level, things may be starting to change. Water technology company Xylem’s new Value of Water Index, which examines American attitudes toward water, indicates that the public is finally realizing the magnitude of our water problem–and that everyone might need to pitch in to fix it.

According to the report–culled from a survey of 1,008 voters in the U.S.–79% of Americans realize we have a water scarcity problem. That may seem high, but 86% of respondents also say they have dealt with water shortages and contamination, meaning it takes a lot (or is just impossible) to convince some people. A whopping 88% of respondents think the country’s water structure needs reform.

Americans also think they have some personal responsibility for the crisis–specifically, 31% of respondents think they should have to pay a bit more on water bills for infrastructure improvements. If Americans upped their monthly water bill by just $7.70, we would see an extra $6.4 billion for water infrastructure investments.
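The arithmetic behind those two figures can be checked directly. The household count below is inferred from the article's numbers, not stated in it; this is a sketch of the implied calculation:

```python
extra_per_month = 7.70   # dollars added to each monthly water bill
annual_target = 6.4e9    # dollars per year for infrastructure investment

# Number of contributing households implied by the article's two figures
# (an inference for illustration, not a number from the survey).
households = annual_target / (extra_per_month * 12)
print(round(households / 1e6, 1))  # ≈ 69.3 million households
```

So the $6.4 billion figure assumes roughly 69 million households paying the extra $7.70 every month.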

In spite of everything, 69% of those polled say they take clean water for granted, and just 29% think problems with our water infrastructure will seriously affect them (remember: the vast majority of respondents have dealt with water shortages and contamination already). Water awareness still has a long way to go–but it will most likely be sped up as water shortages become more common.

Here’s the whole infographic

Eight examples of where the IPCC has missed the mark on its predictions and projections (The Daily Climate)

A “king tide” leaves parts of Sausalito, Calif., flooded in 2010. Disagreement over the impact of ice-sheet melting on sea-level rise has led the Intergovernmental Panel on Climate Change to omit their influence – and thus underestimate sea-level rise – in recent reports, a pattern the panel repeats with other key findings. Photo by Yanna B./flickr.

Dec. 6, 2012

Correction appended

By Glenn Scherer
The Daily Climate

Scientists will tell you: There are no perfect computer models. All are incomplete representations of nature, with uncertainty built into them. But one thing is certain: Several fundamental projections found in Intergovernmental Panel on Climate Change reports have consistently underestimated real-world observations, potentially leaving world governments in doubt as to how to guide climate policy.

Emissions

At the heart of all IPCC projections are “emission scenarios”: low-, mid-, and high-range estimates for future carbon emissions. From these “what if” estimates flow projections for temperature, sea-level rise, and more.

Projection: In 2001, the IPCC offered a range of fossil fuel and industrial emissions trends, from a best-case scenario of 7.7 billion tons of carbon released each year by 2010 to a worst-case scenario of 9.7 billion tons.

Reality: In 2010, global emissions from fossil fuels alone totaled 9.1 billion tons of carbon, according to the federal government’s Earth System Research Laboratory.

Why the miss? While technically within the range, scientists never expected emissions to rise so high so quickly, said IPCC scientist Christopher Field. The IPCC, for instance, failed to anticipate China’s economic growth, or resistance by the United States and other nations to curbing greenhouse gases.

“We really haven’t explored a world in which the emissions growth rate is as rapid as we have actually seen happen,” Field said.

Temperature

IPCC models use the emission scenarios discussed above to estimate average global temperature increases by the year 2100.

Projection: The IPCC 2007 assessment projected a worst-case temperature rise of 4.3° to 11.5° Fahrenheit, with a high probability of 7.2°F.

Reality: We are currently on track for a rise of between 6.3° and 13.3°F, with a high probability of an increase of 9.4°F by 2100, according to the Massachusetts Institute of Technology. Other modelers are getting similar results, including a study published earlier this month by the Global Carbon Project consortium confirming the likelihood of a 9°F rise.

Why the miss? IPCC emission scenarios underestimated global CO2 emission rates, which means temperature rates were underestimated too. And it could get worse: IPCC projections haven’t included likely feedbacks such as large-scale melting of Arctic permafrost and subsequent release of large quantities of CO2 and methane, a greenhouse gas 20 times more potent, albeit shorter lived, in the atmosphere than carbon dioxide.

Arctic Meltdown

Five years ago, the summer retreat of Arctic ice wildly outdistanced all 18 IPCC computer models, amazing IPCC scientists. It did so again in 2012.

Projection: The IPCC has always confidently projected that the Arctic ice pack was safe until at least 2050, if not well beyond 2100.

Reality: Summer ice is thinning faster than every climate projection, and today scientists predict an ice-free Arctic in years, not decades. Last summer, Arctic sea ice extent plummeted to 1.32 million square miles, the lowest level ever recorded – 50 percent below the long-term 1979 to 2000 average.

Why the miss? For scientists, it is increasingly clear that the models are under-predicting the rate of sea ice retreat because they are missing key real-world interactions.

“Sea ice modelers have speculated that the 2007 minimum was an aberration… a matter of random variability, noise in the system, that sea ice would recover.… That no longer looks tenable,” says IPCC scientist Michael Mann. “It is a stunning reminder that uncertainty doesn’t always act in our favor.”

Ice Sheets

Greenland and Antarctica are melting, even though IPCC said in 1995 that they wouldn’t be.

Projection: In 1995, IPCC projected “little change in the extent of the Greenland and Antarctic ice sheets… over the next 50-100 years.” In 2007 IPCC embraced a drastic revision: “New data… show[s] that losses from the ice sheets of Greenland and Antarctica have very likely contributed to sea level rise over 1993 to 2003.”

Reality: Today, ice loss in Greenland and Antarctica is trending at least 100 years ahead of projections compared to IPCC’s first three reports.

Why the miss? “After 2001, we began to realize there were complex dynamics at work – ice cracks, lubrication and sliding of ice sheets,” that were melting ice sheets quicker, said IPCC scientist Kevin Trenberth. New feedbacks unknown to past IPCC authors have also been found. A 2012 study, for example, showed that the reflectivity of Greenland’s ice sheet is decreasing, causing ice to absorb more heat, likely escalating melting.

Sea-Level Rise

The fate of the world’s coastlines has become a classic example of how the IPCC, when confronted with conflicting science, tends to go silent.

Projection: In the 2001 report, the IPCC projected sea-level rise of 2 millimeters per year. The worst-case scenario in the 2007 report, which looked mostly at thermal expansion of the oceans as temperatures warmed, called for up to 1.9 feet of sea-level rise by century’s end.

Reality: Observed sea-level rise has averaged 3.3 millimeters per year since 1990. By 2009, various studies that included ice melt offered drastically higher projections of between 2.4 and 6.2 feet of sea-level rise by 2100.

Why the miss? IPCC scientists couldn’t agree on a value for the contribution melting Greenland and Antarctic ice sheets would add to sea-level rise. So they simply left out the data to reach consensus. Science historian Naomi Oreskes calls this – one of IPCC’s biggest underestimates – “consensus by omission.”

Ocean Acidification

To its credit, the IPCC admits to vast climate change unknowns. Ocean acidification is one such impact.

Projection: Unmentioned as a threat in the 1990, 1995 and 2001 IPCC reports. First recognized in 2007, when IPCC projected acidification of between 0.14 and 0.35 pH units by 2100. “While the effects of observed ocean acidification on the marine biosphere are as yet undocumented,” said the report, “the progressive acidification of oceans is expected to have negative impacts on marine shell-forming organisms (e.g. corals) and their dependent species.”

Reality: The world’s oceans absorb about a quarter of the carbon dioxide humans release annually into the atmosphere. Since the start of the Industrial Revolution, the pH of surface ocean waters has fallen by 0.1 pH units. Since the pH scale is logarithmic, this change represents a stunning 30 percent increase in acidity.
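Because pH is a base-10 logarithmic measure of hydrogen-ion concentration, the percentage change can be computed directly from the pH drop. A sketch (note that a drop of exactly 0.1 units works out to about 26 percent; the widely cited "30 percent" corresponds to a drop of roughly 0.11 units):

```python
def acidity_increase(delta_ph: float) -> float:
    """Percent increase in hydrogen-ion concentration for a drop of
    delta_ph pH units (pH is the negative base-10 log of [H+])."""
    return (10 ** delta_ph - 1) * 100

print(round(acidity_increase(0.10)))  # a 0.10-unit drop: ~26 percent
print(round(acidity_increase(0.11)))  # a 0.11-unit drop: ~29 percent
```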

Why the miss? Scientists didn’t have the data. They began studying acidification by the late 1990s, but there weren’t many papers on the topic until mid-2000, missing the submission deadline for IPCC’s 2001 report. Especially alarming are new findings that ocean temperatures and currents are causing parts of the seas to become acidic far faster than expected, threatening oysters and other shellfish.

National Oceanic and Atmospheric Administration chief Jane Lubchenco has called acidification the “equally evil twin” to global warming.

Thawing Tundra

Some carbon-cycle feedbacks that could vastly amplify climate change – especially a massive release of carbon and methane from thawing permafrost – are extremely hard to model.

Projection: In 2007, IPCC reported with “high confidence” that “methane emissions from tundra… and permafrost have accelerated in the past two decades, and are likely to accelerate further.” However, the IPCC offered no projections regarding permafrost melt.

Reality: Scientists estimate that the world’s permafrost holds 1.5 trillion tons of frozen carbon. That worries scientists: The Arctic is warming faster than anywhere else on earth, and researchers are seeing soil temperatures climb rapidly, too. Some permafrost degradation is already occurring.

Large-scale tundra wildfires in 2012 added to the concern.

Why the miss? This is controversial science, with some researchers saying the Arctic tundra is stable, others saying it will defrost only over long periods of time, and still more convinced we are on the verge of a tipping point, where the tundra thaws rapidly and catastrophically. A major 2005 study, for instance, warned that the entire top 11 feet of global permafrost could disappear by century’s end, with potentially cataclysmic climate impacts.

The U.N. Environment Programme revealed this week that IPCC’s fifth assessment, due for release starting in September 2013, will again “not include the potential effects of the permafrost carbon feedback on global climate.”

Tipping points

The IPCC has been silent on tipping points – non-linear “light switch” moments when the climate system abruptly shifts from one paradigm to another.

The trouble with tipping points is they’re hard to spot until you’ve passed one.

Projection: IPCC has made no projections regarding tipping-point thresholds.

Reality: The scientific jury is still out as to whether we have reached any climate thresholds – a point of no return for, say, an ice-free Arctic, a Greenland meltdown, the slowing of the North Atlantic Ocean circulation, or permanent changes in large-scale weather patterns like the jet stream, El Niño or monsoons.

Why the miss? Blame the computers: These non-linear events are notoriously hard to model. But with scientists recognizing the sizeable threat tipping points represent, they will be including some projections in the 2013-14 assessment.

Correction (Dec. 6, 2012): Earlier editions incorrectly compared global carbon dioxide emissions against carbon emissions scenarios. Carbon dioxide is heavier, incorrectly skewing the comparison. Global use of fossil fuels in 2010 produced about 30 billion tons of carbon dioxide but only 9.1 billion tons of carbon, putting emissions within the extreme end of IPCC scenarios. The story has been changed to reflect that.
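The correction hinges on the molar-mass ratio between CO2 (about 44 g/mol) and elemental carbon (about 12 g/mol). A quick check of the conversion (the exact molar masses below are standard values, not from the article; note the article's "about 30 billion tons" is a round figure, since 9.1 billion tons of carbon corresponds to roughly 33 billion tons of CO2):

```python
# Fraction of CO2 mass that is carbon, from molar masses 12.011 and 44.009 g/mol.
CO2_TO_C = 12.011 / 44.009  # ≈ 0.273

def co2_to_carbon(tons_co2: float) -> float:
    """Mass of elemental carbon contained in a given mass of CO2."""
    return tons_co2 * CO2_TO_C

# 9.1 billion tons of carbon back-converted to CO2, in billions of tons:
print(round(9.1e9 / CO2_TO_C / 1e9, 1))  # ≈ 33.3
```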

© Glenn Scherer, 2012. All rights reserved.

Graphic of emissions scenario courtesy U.S. Global Change Research Program. Photo of activist warning of 6ºC warming © Adela Nistora. Graphic showing Arctic summer ice projections vs. observations by the Vancouver Observer.

Glenn Scherer is senior editor of Blue Ridge Press, a news service that has been providing environmental commentary and news to U.S. newspapers since 2007.

DailyClimate.org is a foundation-funded news service covering climate change. Contact editor Douglas Fischer at dfischer [at] dailyclimate.org

Go With the Flow in Flood Prediction (Science Daily)

Dec. 3, 2012 — Floods have once again wreaked havoc across the country, and climate scientists and meteorologists suggest that the problem is only going to get worse, with wetter winters and rivers bursting their banks becoming the norm. A team based at Newcastle University and their colleagues in China have developed a computer model that can work out how the flood flow will develop and where flooding will be worst, based on an understanding of fluid dynamics and the underlying topography of a region.

Writing in the journal Progress in Computational Fluid Dynamics, Newcastle civil engineer Qiuhua Liang, together with Chi Zhang of the Dalian University of Technology and Junxian Yin of the China Institute of Water Resources and Hydropower Research in Beijing, explains how they have developed an adaptive computer model that could provide accurate and efficient predictions about the flow of water as a flood occurs. Such a model might provide environmental agencies and authorities with a more precise early-warning system for residents and businesses in a region at risk of flood. It could also be used by insurance companies to determine the relative risk of different areas within a given region and so make their underwriting of the risk economically viable.

The model is based on a numerical solution to the hydrodynamic equations of fluid flow. This allows the researchers to plot the likely movement of water during a dam break or flash flood over different kinds of terrain and around obstacles, even when flood waves are spreading rapidly. The researchers have successfully tested their model on real-world flood data.
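The study's adaptive scheme is considerably more sophisticated, but the underlying idea, a numerical (finite-difference) solution of the shallow-water equations for a dam-break flow, can be sketched in one dimension. Everything below (grid size, timestep, initial condition, the first-order Lax-Friedrichs scheme) is an illustrative assumption, not a detail taken from the paper:

```python
import numpy as np

# 1D shallow-water equations, U = [h, h*u], advanced with the
# first-order Lax-Friedrichs scheme (illustrative parameters only).
g = 9.81            # gravity, m/s^2
nx, dx = 200, 1.0   # number of cells and cell width, m
dt, steps = 0.01, 500  # CFL number ~0.05: comfortably stable

# Dam-break initial condition: deep still water left of x=100 m, shallow right.
h = np.where(np.arange(nx) * dx < 100.0, 2.0, 1.0)
hu = np.zeros(nx)

def flux(h, hu):
    """Physical flux of the shallow-water system."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

for _ in range(steps):
    U = np.array([h, hu])
    F = flux(h, hu)
    U_new = U.copy()
    # Lax-Friedrichs update on interior cells; the end cells stay fixed
    # (wall-like boundaries the wave never reaches in this short run).
    U_new[:, 1:-1] = (0.5 * (U[:, 2:] + U[:, :-2])
                      - 0.5 * dt / dx * (F[:, 2:] - F[:, :-2]))
    h, hu = U_new
    hu[0] = hu[-1] = 0.0

# Total water volume is conserved by the scheme: ≈ 300 m^3 throughout.
print(round(h.sum() * dx, 1))
```

After five simulated seconds a rarefaction wave has moved into the deep side and a flood bore into the shallow side, which is the qualitative behavior such dam-break solvers are meant to capture.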

The team points out that flood disasters have become a major threat to human lives and assets. “Flood management is therefore an important task for different levels of governments and authorities in many countries”, the researchers explain. “The availability of accurate and efficient flood modelling tools is vital to assist engineers and managers charged with flood risk assessment, prevention and alleviation.”

Journal Reference:

  1. Chi Zhang, Qiuhua Liang, Junxian Yin. A first-order adaptive solution to rapidly spreading flood waves. Progress in Computational Fluid Dynamics, An International Journal, 2013; 13 (1): 1. DOI: 10.1504/PCFD.2013.050645

Blame, Responsibility and Demand for Change Following Floods (Science Daily)

Nov. 25, 2012 — New research shows concerns about governmental failure to act effectively and fairly in the aftermath of extreme weather events can affect the degree to which residents are willing to protect themselves.

Published in the journal Nature Climate Change, the findings of a team led by scientists at the University of Exeter could prove key to establishing how society should evolve to cope with more turbulent weather and more frequent mega storms.

The team examined attitudes in Cumbria in north west England and Galway in western Ireland, which were both hit by heavy flooding in November 2009. Record rainfall was recorded in both countries, resulting in a number of deaths, properties being severely damaged and economic disruption.

Professor Neil Adger of Geography at the University of Exeter, who led the research, said: “The flooding of 2009 was devastating to both communities. Our study is the first to track the impacts of floods across two countries and how communities and individuals demand change after such events. When people in both studies felt that government had fallen short of their expectations, we found that the resulting perception of helplessness leads to an unwillingness to take personal action to prevent flooding in future.”

Scientists at the University of Exeter worked with colleagues at the National University of Ireland Maynooth and the Tyndall Centre for Climate Change Research at the University of East Anglia, which also provided funding for the study.

Researchers surveyed 356 residents in both areas eight months after the flooding. They measured perceptions of governments’ performances in dealing with the aftermath, as well as perceptions of fairness in that response and the willingness of individuals to take action.

Dr Irene Lorenzoni of the Tyndall Centre comments: “Residents in Galway were significantly more likely to believe that their property would be flooded again than those in Cumbria. Yet it was Cumbrians who believed they had more personal responsibility to adapt to reduce future incidents.

“Whether people felt responses were fair also diverged. In our survey in Cumbria three quarters of respondents agreed that everyone in their community had received prompt help following the flooding, while in Galway it was less than half.”

Dr Conor Murphy of the National University of Ireland, Maynooth said: “The strong perception in Galway that authorities failed to deliver on the expectations of flooded communities in late 2009 is a wakeup call. Given the high exposure of development in flood prone areas it is clear that both England and Ireland need to make major investments in building flood resilience with changing rainfall patterns induced by climate change. Political demand for those investments will only grow.”

Professor Adger says: “Our research shows that climate change is likely to lead to a series of crises which will cause major disruption as instant short-term solutions are sought. We need to consider the implicit contract between citizens and government agencies when planning for floods, to enable fairer and smoother processes of adaptation.”

Journal Reference:

  1. W. Neil Adger, Tara Quinn, Irene Lorenzoni, Conor Murphy, John Sweeney. Changing social contracts in climate-change adaptation. Nature Climate Change, 2012; DOI: 10.1038/nclimate1751

The Opportunistic Apocalypse (Savage Minds)

by  on December 14th, 2012

The third in a guest series about the “Mayan Apocalypse” predicted for Dec. 21, 2012.  The first two posts are here and here.

There are opportunities in the apocalypse.  The end of the world has been commodified.  A few are seriously investing in bunkers, boats, and survival supplies. Tourism is up, not only to Mayan archaeological sites, but also to places like Bugarach, France and Mt. Rtanj, Serbia.  But even those of us on a budget can afford at least a book, a T-shirt or a handbag.

There are opportunities here for academics, too. Many scholars have been quoted in the press lately saying that nothing will happen on Dec. 21, in addition to those who have written comprehensive books and articles discrediting the impending doom. Obviously publishing helps individual careers, and that does not detract from our collective responsibility to debunk ideas that might lead people to physical or financial harm. But neither can we divorce our work from its larger social implications.

It is telling that the main scholarly players in debunking the Mayan Apocalypse in the U.S. are NASA (which is facing budget cuts) and anthropologists.  Both groups feel the need to prove they are relevant because our collective jobs depend on it. I don’t need to go into great detail with this crowd about academia’s current situation. Academia has gone from being a well-respected, stable job to one where most classes are taught by underpaid, uninsured part-time adjuncts, and many Ph.D.s never find work in academia at all. Tuition fees for undergraduates have skyrocketed while full-time faculty salaries have stagnated.

Among the public (too often talked about as being in “the real world,” as if academics were somehow immune to taxes or swine flu), there seems to be a general distrust of intellectuals. That, combined with the current economic situation, has translated into a loss of research funding, such as cuts to the Fulbright program and NSF. Some public officials specifically state that science and engineering are worth funding, but anthropology is not.  To add insult to injury, the University of California wants to move away from that whole “reading” thing and rebrand itself as a web startup.

Articles, books with general readership, being quoted in the newspaper, and yes, blogging are all concrete ways to show funding agencies and review committees that what we do matters. The way to get exposure among those general audiences is to engage with what interests them — like the end of the world. Dec. 21, 2012 has become an internet meme. Many online references to it are debunkings or tongue-in-cheek. Newspaper articles on unrelated topics make passing references in jest, stores offer just-in-case-it’s-real sales, people are planning parties. There seems to be more written to discredit the apocalypse, or make fun of it, than to prepare for it.

We need to remember that this non-believer attention has a purpose, and that purpose is not just (or even primarily) about convincing believers that nothing is going to happen. Rather, it serves to demonstrate something about non-believers themselves. “We” are sensible and logical, while “they” are superstitious and credulous. “We” value science and data, while “they” turn to astrology, misreadings of ancient texts, and esoteric spirituality. “We” remember the non-apocalypses of the past, while “they” have forgotten.

I would argue that discrediting the Mayan Apocalypse is part of an ongoing process of creating western modernity (cue Latour). That modernity requires an “other,” and here that “other” is defined primarily by religious/spiritual belief in the Mayan apocalypse. The more “other” these Apocalypse believers are, the more clearly they reflect the modernity of non-believers. (Of course, there are also the “others” of the Maya themselves, and I’ll address that issue in my next post.)

This returns us to the difference I drew in my first post between “Transitional Apocalyptic Expectations” (TAE) and “Catastrophic Apocalyptic Expectations” (CAE).  I suspect the majority of believers are expecting something like a TAE-type event, but media attention focuses on discrediting CAE beliefs, such as a rogue planet hitting the Earth or massive floods. These would be dire catastrophes, but they will also be far easier to disprove. We will all notice if a planet does or does not hit the Earth next week, but many of us — myself included — will miss a transformation in human consciousness among the enlightened.

By providing the (very real) scientific data to discredit the apocalypse, scholars are incorporated into this project of modernity.  Much of the scholarly work on this phenomenon is fascinating and subtle, but the press picks up on two main themes.  One is scientific proof that the apocalypse will not happen, such as astronomical data that Earth is not on a collision course with another planet, Mayan epigraphy that shows the Long Count does not really end, and ethnography that suggests most Maya themselves are not worried about any of this.  The other scholarly theme the press circulates is the long history of apocalyptic beliefs in the west.  In the logic of the metanarrative of western progress, this connects contemporary Apocalypse believers to the past, nonmodernity and “otherness.”

I now find myself in an uncomfortable position, although it is an intellectually interesting corner to be backed into. I agree with my colleagues that the world will not end, that Mayan ideas have been misappropriated, and that we have a responsibility to address public concerns. At the same time, I can’t help but feel we are being drawn, either reluctantly or willingly, into a larger project that extends far beyond next week.

*   *   *

2012, the movie we love to hate

by  on December 11th, 2012

The second in a guest series about the “Mayan Apocalypse” predicted for Dec. 21, 2012.  The first post is here.

Last summer, I traveled to Philadelphia to visit the Penn Museum exhibit “Maya: the Lords of Time.” It was, as one might expect given the museum collection and the scholars involved, fantastic. I want to comment on just the beginning of the exhibit, however. On entering, one is immediately greeted by a wall crowded with TV screens, all showing different clips of predicted disasters and people talking fearfully about the end of the world. The destruction, paranoia, and cacophony create an ambiance of chaos and uncertainty. Turning the corner, these images are replaced by widely spaced Mayan artifacts and stelae. The effect is striking. One moves from media-induced insanity to serenity, from endless disturbing jump-cuts to the well-lit, quiet contemplation of beautiful art.

Among these images were scenes from Director Roland Emmerich’s blockbuster film 2012 (2009). This over-the-top disaster film is well used in that context.  Still, it is interesting how often 2012 is mentioned by academics and other debunkers — almost as often as they mention serious alternative thinkers about the Mayan calendar, such as Jose Arguelles (although the film receives less in-depth coverage than he does).

I find this interesting because 2012 is clearly not trying to convince us to stockpile canned goods or build boats to prepare for the end of the Maya Long Count, any more than Emmerich’s previous films were meant to prepare us for alien invasion (Independence Day, 1996) or the effects of global climate change (The Day After Tomorrow, 2004). Like Emmerich’s previous films, 2012 is a chance to watch the urban industrialized world burn (in that way, it has much in common with the currently popular zombie film genre). If you want to see John Cusack survive increasingly implausible crumbling urban landscapes, this film is for you.

The Maya, however, are barely mentioned in 2012. There are no Mayan characters, no one travels to Mesoamerica, there is no mention of the Long Count.  Emmerich’s goal for 2012 was, in his own words (here and here), “a modern retelling of Noah’s Ark.” In fact, he claims that the movie originally had nothing to do with the 2012 phenomenon at all.  Instead, he was convinced – reluctantly – to include the concept because of public interest in the Maya calendar.

This explains why the Maya only receive two passing mentions in 2012 — one is a brief comment that even “they” had been able to predict the end of the world, the other a short news report on a cult suicide in Tikal. The marketing aspect of the film emphasized these Maya themes (all of the film footage about the Maya is in the trailer, the movie website starts with a rotating image of the Maya calendar, and there are related extras on the DVD), but the movie itself had basically nothing to do with the Maya, the Mayan Long Count, or Dec 21.

Nevertheless, this film’s impact on public interest in Dec. 21 is measurable. Google Trends, which gives data on the number of times particular search terms are used, gives us a sense of the impact of this $200,000,000 film. I looked at a number of related terms, but have picked the ones that show the general pattern: there is a spike of interest in 2012 apocalyptic ideas when the 2012 marketing campaign starts (November 2008), a huge spike when the film is released (November 2009), and a higher baseline of interest from then until now. Since January, interest in the Mayan calendar/apocalypse has been steadily climbing (and in fact, is higher every time I check this link; it automatically updates). In other words, the 2012 movie both responded to, and reinforced, public interest in the 2012 phenomenon.

Here I return to Michael D. Gordin’s The Pseudoscience Wars (2012).  This delightful book deals with the scientific response to Velikovsky, who believed that the miracles of the Old Testament and other ancient myths documented the emergence of a comet from Jupiter, its traumatic interactions with Earth, and its eventual settling into the role of the planet Venus. (The final chapter also discusses the 2012 situation.)  Gordin’s main focus is understanding why Velikovsky — unlike others labeled “crackpots” before him — stirred the public ire of astronomers and physicists. Academics’ real concern was not Velikovsky’s ideas per se, but how much attention he received by being published by MacMillan — a major publisher of science textbooks — which implied the book had scientific legitimacy. Velikovsky’s “Worlds in Collision” was a major bestseller when it was released in 1950, and academics felt the ideas had to be addressed so that the public would not be misled.

With the Mayan Apocalypse, no major academic publisher is lending legitimacy to these theories. Books about expected events of 2012 (mainly TAE ideas) are published by specialty presses that focus on the spiritual counterculture, such as Evolver Editions, Inner Traditions/Bear & Company, Shambhala, and John Hunt Publishing. Instead, film media has become the battleground for public attention (perhaps because reading is declining?). The immense amount of money put into movies, documentaries, and TV shows about the Mayan Apocalypse is creating public interest today, and in some ways this parallels what Macmillan did for Velikovsky in the 1950s.

One example of this is the viral marketing campaign for 2012 conducted in November 2008. Columbia Pictures created webpages that were not clearly marked as advertising (these no longer appear to be available), promoting the idea that scientists really did know the world would end and were preparing. This type of advertising was not unique to this film, but in this case it reinforced already existing fears that the end really was nigh. NASA began responding to public fears about 2012 as a result of this marketing campaign, and many of the academics interested in addressing these concerns also published after this time.

Academics are caught in something of a bind here. Do we respond to public fears in the hope of debunking them, knowing that we will no doubt also increase public interest in the very ideas we wish to discredit? Should we respond in the hope of selling a few more books or receiving a few more citations, thus generating interest in the rest of what our discipline does? As anthropologists we are certainly not immune to the lure of public interest (obviously I’m not — here I am, blogging away), nor should we be. Perhaps something good can come of the non-end-of-the-world. I’ll turn to this question next time.

*   *   *

The End is Nigh. Start blogging.

by  on December 4th, 2012

Savage Minds welcomes guest blogger Clare A. Sammells.

My thanks to the editors of Savage Minds for allowing me to guest blog this month. Hopefully I will not be among the last of Savage Minds’ guests, given that the End of the World is nigh.

You hadn’t heard? On or around Dec 21, 2012, the Maya Long Count will mark the end of a 5125-year cycle. Will this be a mere calendrical turn, no more inherently eventful than the transition from Dec 31, 2012 to Jan 1, 2013? Will this be a moment of astronomical alignments, fiery conflagrations, and social upheavals? Or will there be a shift in human consciousness, an opportunity for the prepared to improve their lives and achieve enlightenment?

I am going to bet with the house: I do not think the world is going to end in a few weeks.  That way, either the world doesn’t end — another victory for predictive anthropology! — or the world does end, and nothing I write here will matter much anyway. (More seriously, I don’t think our world is destined to end with a bang).

I am not a Mayanist, an archaeologist, or an astronomer. I won’t be discussing conflicting interpretations of Maya Long Count dates, astronomical observations, or Classical-era Maya stela inscriptions. Books by David Stuart, Anthony Aveni, and Matthew Restall and Amara Solari all provide detailed arguments using those data, and analyze the current phenomenon in light of the long history of western fascinations with End Times. Articles by John Hoopes, Kevin Whitesides, and Robert Sitler, among others, address “New Age” interpretations of the Maya. Many ethnographers have considered how Maya peoples understand their complex interactions with “New Age” spiritualists and tourists, among them Judith Maxwell, Quetzil Casteneda, and Walter Little.

My own interest lies in how indigenous timekeeping is interpreted in the Andes. I conducted ethnographic research focusing on tourism in Tiwanaku, Bolivia — a pre-Incan archaeological site near Lake Titicaca, and a contemporary Aymara village. One of the first things I noticed was that every tour guide tells visitors about multiple calendars inscribed in the stones of the site, most famously in the Puerta del Sol. These calendrical interpretations are meaningful to Bolivian visitors, foreign tourists, and local Tiwanakenos for understanding the histories, ethnicities, and politics centered in this place. I took a stab at addressing some of these ideas in a recent article, where I considered how interconnected archaeological theories and political projects of the 1930s fed into what is today accepted conventional knowledge about Tiwanakota calendars. I’m now putting together a book manuscript about temporal intersections in Tiwanaku. The parallels between that situation and the Maya 2012 phenomenon led me to consider the prophecies, expectations, YouTube videos, blog posts, scholarly debunkings, and tourist travels motivated by the end of the Maya Long Count.

A survey by the National Geographic Channel suggested that 27% of those in the United States think the Maya may have predicted a catastrophe for December 21. But it is important to note that there is no agreement, even among believers, about what will happen. I tend to think of these beliefs as collecting into two broad (and often overlapping) camps.

Many believe that “something” will happen on (or around) Dec 21, 2012, but do not anticipate world destruction. I think of these beliefs as “Transitional Apocalyptic Expectations” (TAE). Writers such as José Argüelles and John Major Jenkins, for example, believe that there will be a shift in human consciousness, and tend to view the end of the 13th baktun as an opportunity for human improvement.

On the other hand, there are those who believe that the world will end abruptly, in fire, flood, cosmic radiation, or collision with other planets. I think of these beliefs as “Catastrophic Apocalyptic Expectations” (CAE). While some share my belief that the number of serious CAE-ers is small, the press has reported panics and survivalists in Russia, France, and Los Angeles. Tragically, there has been at least one suicide. And of course, there has been a major Hollywood movie (“2012”), which I’ll be discussing more in my next post.

As anthropologists, we certainly should respond to public fears. But we should also wonder why this fear, out of so many possible fears, is the one to capture public imagination. Beliefs in paranormal activities, astrology, and the like are historically common, although the specifics change over time. Michael D. Gordin’s excellent book The Pseudoscience Wars (2012) convincingly suggests that there are larger societal reasons why some fringe theories attract scholarly and public attention while others go ignored. The Mayan Apocalypse has certainly attracted massive attention, from scholarly rebuttals from anthropologists, NASA, and others, to numerous popular parodies such as GQ’s survival tips, LOLcats, and my personal favorite, an advertisement for Mystic Mayan Power Cloaks.

There seems to be a general fascination with the Mayan calendar — even among those who know relatively little about the peoples that label refers to. Some are anxiously watching the calendar count down, others are trying to reassure them, and many more are simply watching, cracking jokes, or even selling supplies. But there is something interesting about the fact that so many people in the United States and Europe are talking about it at all. I look forward to exploring these questions further with all of you.

Clare A. Sammells is Assistant Professor of Anthropology at Bucknell University. She is currently living in Madrid, where she is writing about concepts of time in Tiwanaku and conducting ethnographic research on food among Bolivian migrants.  She is not stockpiling canned goods.