Tag archive: Disaster

New Book Explores ‘Noah’s Flood’: Says Bible and Science Can Get Along (Science Daily)

ScienceDaily (Aug. 14, 2012) — David Montgomery is a geomorphologist, a geologist who studies changes to topography over time and how geological processes shape landscapes. He has seen firsthand evidence of how the forces that have shaped Earth run counter to some significant religious beliefs.

But the idea that scientific reason and religious faith are somehow at odds with each other, he said, “is, in my view, a false dichotomy.”

In a new book, “The Rocks Don’t Lie: A Geologist Investigates Noah’s Flood” (Aug. 27, 2012, W.W. Norton), Montgomery explores the long history of religious thinking — particularly among Christians — on matters of geological discovery, from the writings of St. Augustine 1,700 years ago to the rise in the mid-20th century of the most recent rendering of creationism.

“The purpose is not to tweak people of faith but to remind everyone about the long history in the faith community of respecting what we can learn from observing the world,” he said.

Many of the earliest geologists were clergy, he said. Nicolas Steno, considered the founder of modern geology, was a 17th-century Roman Catholic priest who has achieved three of the four steps to being declared a saint in the church.

“Though there are notable conflicts between religion and science — the famous case of Galileo Galilei, for example — there also is a church tradition of working to reconcile biblical stories with known scientific fact,” Montgomery said.

“What we hear today as the ‘Christian’ positions are really just one slice of a really rich pie,” he said.

For nearly two centuries there has been overwhelming geological evidence that a global flood, as depicted in the story of Noah in the biblical book of Genesis, could not have happened. Not only is there not enough water in the Earth system to account for water levels above the highest mountaintop, but uniformly rising levels would not allow the water to have the erosive capabilities attributed to Noah’s Flood, Montgomery said.

Some rock formations millions of years old show no evidence of such large-scale water erosion. Montgomery is convinced any such flood must have been, at best, a regional event, perhaps a catastrophic deluge in Mesopotamia. There are, in fact, Mesopotamian stories with details very similar to, but predating, the biblical story of Noah’s Flood.

“If your world is small enough, all floods are global,” he said.

Perhaps the greatest influence in prompting him to write “The Rocks Don’t Lie” was a 2002 expedition to the Tsangpo River on the Tibetan Plateau. In the fertile river valley he found evidence in sediment layers that a great lake had formed in the valley many centuries ago, not once but numerous times. Downstream he found evidence that a glacier on several occasions advanced far enough to block the river, creating the huge lake.

But ice makes an unstable dam, and over time the ice thinned and finally gave way, unleashing a tremendous torrent of water down the deepest gorge in the world. It was only after piecing the story together from geological evidence that Montgomery learned that local oral traditions told of exactly this kind of great flood.

“To learn that the locals knew about it and talked about it for the last thousand years really jolted my thinking. Here was evidence that a folk tale might be reality based,” he said.

He has seen evidence of huge regional floods in the scablands of Eastern Washington, carved by torrents when glacial Lake Missoula breached its ice dam in Montana and raced across the landscape, and he found Native American stories that seem to tell of this catastrophic flood.

Other flood stories dating back to the early inhabitants of the Pacific Northwest and from various islands in the Pacific Ocean, for example, likely tell of inundation by tsunamis after large earthquakes.

But he noted that in some regions of the world — in Africa, for example — there are no flood stories in the oral traditions because there the annual floods help sustain life rather than bring destruction.

Floods are not always responsible for major geological features. Hiking a trail from the floor of the Grand Canyon to its rim, Montgomery saw unmistakable evidence of the canyon being carved over millions of years by the flow of the Colorado River, not by a global flood several thousand years ago as some people still believe.

He describes that hike in detail in “The Rocks Don’t Lie.” He also explores changes in the understanding of where fossils came from, how geologists read Earth history in layers of rock, and the writings of geologists and religious authorities through the centuries.

Montgomery hopes the book might increase science literacy. He noted that a 2001 National Science Foundation survey found that more than half of American adults didn’t realize that dinosaurs were extinct long before humans came along.

But he also would like to coax readers to make sense of the world through both what they believe and through what they can see for themselves, and to keep an open mind to new ideas.

“If you think you know everything, you’ll never learn anything,” he said.

Global Warming’s Terrifying New Math (Rolling Stone)

Three simple numbers that add up to global catastrophe – and that make clear who the real enemy is

by: Bill McKibben

Illustration by Edel Rodriguez

If the pictures of those towering wildfires in Colorado haven’t convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average, the odds of which occurring by simple chance were one in 3.7 × 10⁹⁹, a number considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the “largest temperature departure from average of any season on record.” The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet’s history.

Not that our leaders seemed to notice. Last month the world’s nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn’t even attend. It was “a ghost of the glad, confident meeting 20 years ago,” the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls “once thronged by multitudes.” Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I’ve spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we’re losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world’s nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the “most important gathering since the Second World War, given what is at stake.” As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: “This is our chance. If we miss it, it could take years before we get a new and better one. If ever.”

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving “Copenhagen Accord” that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. “Copenhagen is a crime scene tonight,” an angry Greenpeace official declared, “with the guilty men and women fleeing to the airport.” Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized “the scientific view that the increase in global temperature should be below two degrees Celsius.” And in the very next paragraph, it declared that “we agree that deep cuts in global emissions are required… so as to hold the increase in global temperature below two degrees Celsius.” By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8 and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we’ve raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. “Any number much above one degree involves a gamble,” writes Kerry Emanuel of MIT, a leading authority on hurricanes, “and the odds become less and less favorable as the temperature goes up.” Thomas Lovejoy, once the World Bank’s chief biodiversity adviser, puts it like this: “If we’re seeing what we’re seeing today at 0.8 degrees Celsius, two degrees is simply too much.” NASA scientist James Hansen, the planet’s most prominent climatologist, is even blunter: “The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster.” At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: “Some countries will flat-out disappear.” When delegates from developing nations were warned that two degrees would represent a “suicide pact” for drought-stricken Africa, many of them started chanting, “One degree, one Africa.”

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it’s fair to say that it’s the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world’s carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can’t raise the temperature more than two degrees Celsius – it’s become the bottomest of bottom lines. Two degrees.

The Second Number: 565 Gigatons

Scientists estimate that humans can pour roughly 565 more gigatons of carbon dioxide into the atmosphere by midcentury and still have some reasonable hope of staying below two degrees. (“Reasonable,” in this case, means four chances in five, or somewhat worse odds than playing Russian roulette with a six-shooter.)

This idea of a global “carbon budget” emerged about a decade ago, as scientists began to calculate how much oil, coal and gas could still safely be burned. Since we’ve increased the Earth’s temperature by 0.8 degrees so far, we’re currently less than halfway to the target. But, in fact, computer models calculate that even if we stopped increasing CO2 now, the temperature would likely still rise another 0.8 degrees, as previously released carbon continues to overheat the atmosphere. That means we’re already three-quarters of the way to the two-degree target.
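The running totals in this passage reduce to one line of arithmetic each. A quick sketch using the article's own round figures (0.8 degrees realized, another 0.8 degrees already committed, a 2-degree ceiling):

```python
# The article's running totals, using its round numbers (all in degrees Celsius).
realized = 0.8    # warming to date
committed = 0.8   # additional warming already "in the pipeline"
target = 2.0      # the Copenhagen ceiling

print(realized / target)                 # 0.4 — "less than halfway" on realized warming alone
print((realized + committed) / target)   # 0.8 — realized plus committed warming
```

The exact combined ratio is four-fifths; the article's "three-quarters" is a slightly looser rounding of the same figures.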

How good are these numbers? No one is insisting that they’re exact, but few dispute that they’re generally right. The 565-gigaton figure was derived from one of the most sophisticated computer-simulation models that have been built by climate scientists around the world over the past few decades. And the number is being further confirmed by the latest climate-simulation models currently being finalized in advance of the next report by the Intergovernmental Panel on Climate Change. “Looking at them as they come in, they hardly differ at all,” says Tom Wigley, an Australian climatologist at the National Center for Atmospheric Research. “There’s maybe 40 models in the data set now, compared with 20 before. But so far the numbers are pretty much the same. We’re just fine-tuning things. I don’t think much has changed over the last decade.” William Collins, a senior climate scientist at the Lawrence Berkeley National Laboratory, agrees. “I think the results of this round of simulations will be quite similar,” he says. “We’re not getting any free lunch from additional understanding of the climate system.”

We’re not getting any free lunch from the world’s economies, either. With only a single year’s lull in 2009 at the height of the financial crisis, we’ve continued to pour record amounts of carbon into the atmosphere, year after year. In late May, the International Energy Agency published its latest figures – CO2 emissions last year rose to 31.6 gigatons, up 3.2 percent from the year before. America had a warm winter and converted more coal-fired power plants to natural gas, so its emissions fell slightly; China kept booming, so its carbon output (which recently surpassed the U.S.) rose 9.3 percent; the Japanese shut down their fleet of nukes post-Fukushima, so their emissions edged up 2.4 percent. “There have been efforts to use more renewable energy and improve energy efficiency,” said Corinne Le Quéré, who runs England’s Tyndall Centre for Climate Change Research. “But what this shows is that so far the effects have been marginal.” In fact, study after study predicts that carbon emissions will keep growing by roughly three percent a year – and at that rate, we’ll blow through our 565-gigaton allowance in 16 years, around the time today’s preschoolers will be graduating from high school. “The new data provide further evidence that the door to a two-degree trajectory is about to close,” said Fatih Birol, the IEA’s chief economist. In fact, he continued, “When I look at this data, the trend is perfectly in line with a temperature increase of about six degrees.” That’s almost 11 degrees Fahrenheit, which would create a planet straight out of science fiction.

So, new data in hand, everyone at the Rio conference renewed their ritual calls for serious international action to move us back to a two-degree trajectory. The charade will continue in November, when the next Conference of the Parties (COP) of the U.N. Framework Convention on Climate Change convenes in Qatar. This will be COP 18 – COP 1 was held in Berlin in 1995, and since then the process has accomplished essentially nothing. Even scientists, who are notoriously reluctant to speak out, are slowly overcoming their natural preference to simply provide data. “The message has been consistent for close to 30 years now,” Collins says with a wry laugh, “and we have the instrumentation and the computer power required to present the evidence in detail. If we choose to continue on our present course of action, it should be done with a full evaluation of the evidence the scientific community has presented.” He pauses, suddenly conscious of being on the record. “I should say, a fuller evaluation of the evidence.”

So far, though, such calls have had little effect. We’re in the same position we’ve been in for a quarter-century: scientific warning followed by political inaction. Among scientists speaking off the record, disgusted candor is the rule. One senior scientist told me, “You know those new cigarette packs, where governments make them put a picture of someone with a hole in their throats? Gas pumps should have something like that.”

The Third Number: 2,795 Gigatons

This number is the scariest of all – one that, for the first time, meshes the political and scientific dimensions of our dilemma. It was highlighted last summer by the Carbon Tracker Initiative, a team of London financial analysts and environmentalists who published a report in an effort to educate investors about the possible risks that climate change poses to their stock portfolios. The number describes the amount of carbon already contained in the proven coal and oil and gas reserves of the fossil-fuel companies, and the countries (think Venezuela or Kuwait) that act like fossil-fuel companies. In short, it’s the fossil fuel we’re currently planning to burn. And the key point is that this new number – 2,795 – is higher than 565. Five times higher.

The Carbon Tracker Initiative – led by James Leaton, an environmentalist who served as an adviser at the accounting giant PricewaterhouseCoopers – combed through proprietary databases to figure out how much oil, gas and coal the world’s major energy companies hold in reserve. The numbers aren’t perfect – they don’t fully reflect the recent surge in unconventional energy sources like shale gas, and they don’t accurately reflect coal reserves, which are subject to less stringent reporting requirements than oil and gas. But for the biggest companies, the figures are quite exact: If you burned everything in the inventories of Russia’s Lukoil and America’s ExxonMobil, for instance, which lead the list of oil and gas companies, each would release more than 40 gigatons of carbon dioxide into the atmosphere.

Which is exactly why this new number, 2,795 gigatons, is such a big deal. Think of two degrees Celsius as the legal drinking limit – equivalent to the 0.08 blood-alcohol level below which you might get away with driving home. The 565 gigatons is how many drinks you could have and still stay below that limit – the six beers, say, you might consume in an evening. And the 2,795 gigatons? That’s the three 12-packs the fossil-fuel industry has on the table, already opened and ready to pour.

We have five times as much oil and coal and gas on the books as climate scientists think is safe to burn. We’d have to keep 80 percent of those reserves locked away underground to avoid that fate. Before we knew those numbers, our fate had been likely. Now, barring some massive intervention, it seems certain.

Yes, this coal and gas and oil is still technically in the soil. But it’s already economically aboveground – it’s figured into share prices, companies are borrowing money against it, nations are basing their budgets on the presumed returns from their patrimony. It explains why the big fossil-fuel companies have fought so hard to prevent the regulation of carbon dioxide – those reserves are their primary asset, the holding that gives their companies their value. It’s why they’ve worked so hard these past years to figure out how to unlock the oil in Canada’s tar sands, or how to drill miles beneath the sea, or how to frack the Appalachians.

If you told Exxon or Lukoil that, in order to avoid wrecking the climate, they couldn’t pump out their reserves, the value of their companies would plummet. John Fullerton, a former managing director at JP Morgan who now runs the Capital Institute, calculates that at today’s market value, those 2,795 gigatons of carbon emissions are worth about $27 trillion. Which is to say, if you paid attention to the scientists and kept 80 percent of it underground, you’d be writing off $20 trillion in assets. The numbers aren’t exact, of course, but that carbon bubble makes the housing bubble look small by comparison. It won’t necessarily burst – we might well burn all that carbon, in which case investors will do fine. But if we do, the planet will crater. You can have a healthy fossil-fuel balance sheet, or a relatively healthy planet – but now that we know the numbers, it looks like you can’t have both. Do the math: 2,795 is five times 565. That’s how the story ends.

So far, as I said at the start, environmental efforts to tackle global warming have failed. The planet’s emissions of carbon dioxide continue to soar, especially as developing countries emulate (and supplant) the industries of the West. Even in rich countries, small reductions in emissions offer no sign of the real break with the status quo we’d need to upend the iron logic of these three numbers. Germany is one of the only big countries that has actually tried hard to change its energy mix; on one sunny Saturday in late May, that northern-latitude nation generated nearly half its power from solar panels within its borders. That’s a small miracle – and it demonstrates that we have the technology to solve our problems. But we lack the will. So far, Germany’s the exception; the rule is ever more carbon.

This record of failure means we know a lot about what strategies don’t work. Green groups, for instance, have spent a lot of time trying to change individual lifestyles: the iconic twisty light bulb has been installed by the millions, but so have a new generation of energy-sucking flatscreen TVs. Most of us are fundamentally ambivalent about going green: We like cheap flights to warm places, and we’re certainly not going to give them up if everyone else is still taking them. Since all of us are in some way the beneficiaries of cheap fossil fuel, tackling climate change has been like trying to build a movement against yourself – it’s as if the gay-rights movement had to be constructed entirely from evangelical preachers, or the abolition movement from slaveholders.

People perceive – correctly – that their individual actions will not make a decisive difference in the atmospheric concentration of CO2; by 2010, a poll found that “while recycling is widespread in America and 73 percent of those polled are paying bills online in order to save paper,” only four percent had reduced their utility use and only three percent had purchased hybrid cars. Given a hundred years, you could conceivably change lifestyles enough to matter – but time is precisely what we lack.

A more efficient method, of course, would be to work through the political system, and environmentalists have tried that, too, with the same limited success. They’ve patiently lobbied leaders, trying to convince them of our peril and assuming that politicians would heed the warnings. Sometimes it has seemed to work. Barack Obama, for instance, campaigned more aggressively about climate change than any president before him – the night he won the nomination, he told supporters that his election would mark the moment “the rise of the oceans began to slow and the planet began to heal.” And he has achieved one significant change: a steady increase in the fuel efficiency mandated for automobiles. It’s the kind of measure, adopted a quarter-century ago, that would have helped enormously. But in light of the numbers I’ve just described, it’s obviously a very small start indeed.

At this point, effective action would require actually keeping most of the carbon the fossil-fuel industry wants to burn safely in the soil, not just changing slightly the speed at which it’s burned. And there the president, apparently haunted by the still-echoing cry of “Drill, baby, drill,” has gone out of his way to frack and mine. His secretary of the interior, for instance, opened up a huge swath of the Powder River Basin in Wyoming for coal extraction: The total basin contains some 67.5 gigatons worth of carbon (or more than 10 percent of the available atmospheric space). He’s doing the same thing with Arctic and offshore drilling; in fact, as he explained on the stump in March, “You have my word that we will keep drilling everywhere we can… That’s a commitment that I make.” The next day, in a yard full of oil pipe in Cushing, Oklahoma, the president promised to work on wind and solar energy but, at the same time, to speed up fossil-fuel development: “Producing more oil and gas here at home has been, and will continue to be, a critical part of an all-of-the-above energy strategy.” That is, he’s committed to finding even more stock to add to the 2,795-gigaton inventory of unburned carbon.

Sometimes the irony is almost Borat-scale obvious: In early June, Secretary of State Hillary Clinton traveled on a Norwegian research trawler to see firsthand the growing damage from climate change. “Many of the predictions about warming in the Arctic are being surpassed by the actual data,” she said, describing the sight as “sobering.” But the discussions she traveled to Scandinavia to have with other foreign ministers were mostly about how to make sure Western nations get their share of the estimated $9 trillion in oil (that’s more than 90 billion barrels, or 37 gigatons of carbon) that will become accessible as the Arctic ice melts. Last month, the Obama administration indicated that it would give Shell permission to start drilling in sections of the Arctic.

Almost every government with deposits of hydrocarbons straddles the same divide. Canada, for instance, is a liberal democracy renowned for its internationalism – no wonder, then, that it signed on to the Kyoto treaty, promising to cut its carbon emissions substantially by 2012. But the rising price of oil suddenly made the tar sands of Alberta economically attractive – and since, as NASA climatologist James Hansen pointed out in May, they contain as much as 240 gigatons of carbon (or almost half of the available space if we take the 565 limit seriously), that meant Canada’s commitment to Kyoto was nonsense. In December, the Canadian government withdrew from the treaty before it faced fines for failing to meet its commitments.

The same kind of hypocrisy applies across the ideological board: In his speech to the Copenhagen conference, Venezuela’s Hugo Chavez quoted Rosa Luxemburg, Jean-Jacques Rousseau and “Christ the Redeemer,” insisting that “climate change is undoubtedly the most devastating environmental problem of this century.” But the next spring, in the Simon Bolivar Hall of the state-run oil company, he signed an agreement with a consortium of international players to develop the vast Orinoco tar sands as “the most significant engine for a comprehensive development of the entire territory and Venezuelan population.” The Orinoco deposits are larger than Alberta’s – taken together, they’d fill up the whole available atmospheric space.

So: the paths we have tried to tackle global warming have so far produced only gradual, halting shifts. A rapid, transformative change would require building a movement, and movements require enemies. As John F. Kennedy put it, “The civil rights movement should thank God for Bull Connor. He’s helped it as much as Abraham Lincoln.” And enemies are what climate change has lacked.

But what all these climate numbers make painfully, usefully clear is that the planet does indeed have an enemy – one far more committed to action than governments or individuals. Given this hard math, we need to view the fossil-fuel industry in a new light. It has become a rogue industry, reckless like no other force on Earth. It is Public Enemy Number One to the survival of our planetary civilization. “Lots of companies do rotten things in the course of their business – pay terrible wages, make people work in sweatshops – and we pressure them to change those practices,” says veteran anti-corporate leader Naomi Klein, who is at work on a book about the climate crisis. “But these numbers make clear that with the fossil-fuel industry, wrecking the planet is their business model. It’s what they do.”

According to the Carbon Tracker report, if Exxon burns its current reserves, it would use up more than seven percent of the available atmospheric space between us and the risk of two degrees. BP is just behind, followed by the Russian firm Gazprom, then Chevron, ConocoPhillips and Shell, each of which would fill between three and four percent. Taken together, just these six firms, of the 200 listed in the Carbon Tracker report, would use up more than a quarter of the remaining two-degree budget. Severstal, the Russian mining giant, leads the list of coal companies, followed by firms like BHP Billiton and Peabody. The numbers are simply staggering – this industry, and this industry alone, holds the power to change the physics and chemistry of our planet, and they’re planning to use it.

They’re clearly cognizant of global warming – they employ some of the world’s best scientists, after all, and they’re bidding on all those oil leases made possible by the staggering melt of Arctic ice. And yet they relentlessly search for more hydrocarbons – in early March, Exxon CEO Rex Tillerson told Wall Street analysts that the company plans to spend $37 billion a year through 2016 (about $100 million a day) searching for yet more oil and gas.

There’s not a more reckless man on the planet than Tillerson. Late last month, on the same day the Colorado fires reached their height, he told a New York audience that global warming is real, but dismissed it as an “engineering problem” that has “engineering solutions.” Such as? “Changes to weather patterns that move crop-production areas around – we’ll adapt to that.” This in a week when Kentucky farmers were reporting that corn kernels were “aborting” in record heat, threatening a spike in global food prices. “The fear factor that people want to throw out there to say, ‘We just have to stop this,’ I do not accept,” Tillerson said. Of course not – if he did accept it, he’d have to keep his reserves in the ground. Which would cost him money. It’s not an engineering problem, in other words – it’s a greed problem.

You could argue that this is simply in the nature of these companies – that having found a profitable vein, they’re compelled to keep mining it, more like efficient automatons than people with free will. But as the Supreme Court has made clear, they are people of a sort. In fact, thanks to the size of its bankroll, the fossil-fuel industry has far more free will than the rest of us. These companies don’t simply exist in a world whose hungers they fulfill – they help create the boundaries of that world.

Left to our own devices, citizens might decide to regulate carbon and stop short of the brink; according to a recent poll, nearly two-thirds of Americans would back an international agreement that cut carbon emissions 90 percent by 2050. But we aren’t left to our own devices. The Koch brothers, for instance, have a combined wealth of $50 billion, meaning they trail only Bill Gates on the list of richest Americans. They’ve made most of their money in hydrocarbons, they know any system to regulate carbon would cut those profits, and they reportedly plan to lavish as much as $200 million on this year’s elections. In 2009, for the first time, the U.S. Chamber of Commerce surpassed both the Republican and Democratic National Committees on political spending; the following year, more than 90 percent of the Chamber’s cash went to GOP candidates, many of whom deny the existence of global warming. Not long ago, the Chamber even filed a brief with the EPA urging the agency not to regulate carbon – should the world’s scientists turn out to be right and the planet heat up, the Chamber advised, “populations can acclimatize to warmer climates via a range of behavioral, physiological and technological adaptations.” As radical goes, demanding that we change our physiology seems right up there.

Environmentalists, understandably, have been loath to make the fossil-fuel industry their enemy, respecting its political power and hoping instead to convince these giants that they should turn away from coal, oil and gas and transform themselves more broadly into “energy companies.” Sometimes that strategy appeared to be working – emphasis on appeared. Around the turn of the century, for instance, BP made a brief attempt to restyle itself as “Beyond Petroleum,” adopting a logo that looked like the sun and sticking solar panels on some of its gas stations. But its investments in alternative energy were never more than a tiny fraction of its budget for hydrocarbon exploration, and after a few years, many of those were wound down as new CEOs insisted on returning to the company’s “core business.” In December, BP finally closed its solar division. Shell shut down its solar and wind efforts in 2009. The five biggest oil companies have made more than $1 trillion in profits since the millennium – there’s simply too much money to be made on oil and gas and coal to go chasing after zephyrs and sunbeams.

Much of that profit stems from a single historical accident: Alone among businesses, the fossil-fuel industry is allowed to dump its main waste, carbon dioxide, for free. Nobody else gets that break – if you own a restaurant, you have to pay someone to cart away your trash, since piling it in the street would breed rats. But the fossil-fuel industry is different, and for sound historical reasons: Until a quarter-century ago, almost no one knew that CO2 was dangerous. But now that we understand that carbon is heating the planet and acidifying the oceans, its price becomes the central issue.

If you put a price on carbon, through a direct tax or other methods, it would enlist markets in the fight against global warming. Once Exxon has to pay for the damage its carbon is doing to the atmosphere, the price of its products would rise. Consumers would get a strong signal to use less fossil fuel – every time they stopped at the pump, they’d be reminded that you don’t need a semimilitary vehicle to go to the grocery store. The economic playing field would now be a level one for nonpolluting energy sources. And you could do it all without bankrupting citizens – a so-called “fee-and-dividend” scheme would put a hefty tax on coal and gas and oil, then simply divide up the proceeds, sending everyone in the country a check each month for their share of the added costs of carbon. By switching to cleaner energy sources, most people would actually come out ahead.
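The fee-and-dividend arithmetic described above is easy to sketch. All of the figures below – the fee level, national emissions, and population – are hypothetical round numbers chosen only to illustrate the mechanism, not estimates from the article:

```python
# Illustrative sketch of a fee-and-dividend scheme.
# Every number here is a hypothetical round figure, not real data.

FEE_PER_TON = 25.0          # hypothetical carbon fee, $ per ton of CO2
NATIONAL_EMISSIONS = 5.0e9  # hypothetical annual emissions, tons of CO2
POPULATION = 310e6          # approximate U.S. population

# Total revenue collected from fossil-fuel producers.
revenue = FEE_PER_TON * NATIONAL_EMISSIONS

# The proceeds are divided equally: each resident gets a monthly check.
monthly_dividend = revenue / POPULATION / 12

def net_effect(annual_footprint_tons: float) -> float:
    """Yearly dividend received minus the fee passed through in prices.

    By construction, a household whose carbon footprint is below the
    national average comes out ahead; a heavy fossil-fuel user pays in
    more than it gets back.
    """
    paid = annual_footprint_tons * FEE_PER_TON
    received = monthly_dividend * 12
    return received - paid

average_footprint = NATIONAL_EMISSIONS / POPULATION  # ~16 tons/person
print(f"Monthly dividend per person: ${monthly_dividend:.2f}")
print(f"Net for an average footprint: ${net_effect(average_footprint):+.2f}")
print(f"Net for a low-carbon household (8 tons): ${net_effect(8):+.2f}")
```

The sketch shows why "most people would actually come out ahead": the average footprint nets out to zero by definition, so anyone using less fossil fuel than average receives more in dividends than they pay in higher prices.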

There’s only one problem: Putting a price on carbon would reduce the profitability of the fossil-fuel industry. After all, the answer to the question “How high should the price of carbon be?” is “High enough to keep those carbon reserves that would take us past two degrees safely in the ground.” The higher the price on carbon, the more of those reserves would be worthless. The fight, in the end, is about whether the industry will succeed in its fight to keep its special pollution break alive past the point of climate catastrophe, or whether, in the economists’ parlance, we’ll make them internalize those externalities.

It’s not clear, of course, that the power of the fossil-fuel industry can be broken. The U.K. analysts who wrote the Carbon Tracker report and drew attention to these numbers had a relatively modest goal – they simply wanted to remind investors that climate change poses a very real risk to the stock prices of energy companies. Say something so big finally happens (a giant hurricane swamps Manhattan, a megadrought wipes out Midwest agriculture) that even the political power of the industry is inadequate to restrain legislators, who manage to regulate carbon. Suddenly those Chevron reserves would be a lot less valuable, and the stock would tank. Given that risk, the Carbon Tracker report warned investors to lessen their exposure, hedge it with some big plays in alternative energy.

“The regular process of economic evolution is that businesses are left with stranded assets all the time,” says Nick Robins, who runs HSBC’s Climate Change Centre. “Think of film cameras, or typewriters. The question is not whether this will happen. It will. Pension systems have been hit by the dot-com and credit crunch. They’ll be hit by this.” Still, it hasn’t been easy to convince investors, who have shared in the oil industry’s record profits. “The reason you get bubbles,” sighs Leaton, “is that everyone thinks they’re the best analyst – that they’ll go to the edge of the cliff and then jump back when everyone else goes over.”

So pure self-interest probably won’t spark a transformative challenge to fossil fuel. But moral outrage just might – and that’s the real meaning of this new math. It could, plausibly, give rise to a real movement.

Once, in recent corporate history, anger forced an industry to make basic changes. That was the campaign in the 1980s demanding divestment from companies doing business in South Africa. It rose first on college campuses and then spread to municipal and state governments; 155 campuses eventually divested, and by the end of the decade, more than 80 cities, 25 states and 19 counties had taken some form of binding economic action against companies connected to the apartheid regime. “The end of apartheid stands as one of the crowning accomplishments of the past century,” as Archbishop Desmond Tutu put it, “but we would not have succeeded without the help of international pressure,” especially from “the divestment movement of the 1980s.”

The fossil-fuel industry is obviously a tougher opponent, and even if you could force the hand of particular companies, you’d still have to figure out a strategy for dealing with all the sovereign nations that, in effect, act as fossil-fuel companies. But the link for college students is even more obvious in this case. If their college’s endowment portfolio has fossil-fuel stock, then their educations are being subsidized by investments that guarantee they won’t have much of a planet on which to make use of their degree. (The same logic applies to the world’s largest investors, pension funds, which are also theoretically interested in the future – that’s when their members will “enjoy their retirement.”) “Given the severity of the climate crisis, a comparable demand that our institutions dump stock from companies that are destroying the planet would not only be appropriate but effective,” says Bob Massie, a former anti-apartheid activist who helped found the Investor Network on Climate Risk. “The message is simple: We have had enough. We must sever the ties with those who profit from climate change – now.”

Movements rarely have predictable outcomes. But any campaign that weakens the fossil-fuel industry’s political standing clearly increases the chances of retiring its special breaks. Consider President Obama’s signal achievement in the climate fight, the large increase he won in mileage requirements for cars. Scientists, environmentalists and engineers had advocated such policies for decades, but until Detroit came under severe financial pressure, it was politically powerful enough to fend them off. If people come to understand the cold, mathematical truth – that the fossil-fuel industry is systematically undermining the planet’s physical systems – it might weaken it enough to matter politically. Exxon and their ilk might drop their opposition to a fee-and-dividend solution; they might even decide to become true energy companies, this time for real.

Even if such a campaign is possible, however, we may have waited too long to start it. To make a real difference – to keep us under a temperature increase of two degrees – you’d need to change carbon pricing in Washington, and then use that victory to leverage similar shifts around the world. At this point, what happens in the U.S. is most important for how it will influence China and India, where emissions are growing fastest. (In early June, researchers concluded that China has probably under-reported its emissions by up to 20 percent.) The three numbers I’ve described are daunting – they may define an essentially impossible future. But at least they provide intellectual clarity about the greatest challenge humans have ever faced. We know how much we can burn, and we know who’s planning to burn more. Climate change operates on a geological scale and time frame, but it’s not an impersonal force of nature; the more carefully you do the math, the more thoroughly you realize that this is, at bottom, a moral issue; we have met the enemy and they is Shell.

Meanwhile the tide of numbers continues. The week after the Rio conference limped to its conclusion, Arctic sea ice hit the lowest level ever recorded for that date. Last month, on a single weekend, Tropical Storm Debby dumped more than 20 inches of rain on Florida – the earliest the season’s fourth-named cyclone has ever arrived. At the same time, the largest fire in New Mexico history burned on, and the most destructive fire in Colorado’s annals claimed 346 homes in Colorado Springs – breaking a record set the week before in Fort Collins. This month, scientists issued a new study concluding that global warming has dramatically increased the likelihood of severe heat and drought – days after a heat wave across the Plains and Midwest broke records that had stood since the Dust Bowl, threatening this year’s harvest. You want a big number? In the course of this month, a quadrillion kernels of corn need to pollinate across the grain belt, something they can’t do if temperatures remain off the charts. Just like us, our crops are adapted to the Holocene, the 11,000-year period of climatic stability we’re now leaving… in the dust.

This story is from the August 2nd, 2012 issue of Rolling Stone.

Climate models that predict more droughts win further scientific support (Washington Post)

The drought of 2012: It has been more than a half-century since a drought this extensive hit the United States, NOAA reported July 16. The effects are growing and may cost the U.S. economy $50 billion.

By Hristio Boytchev, Published: August 13

The United States will suffer a series of severe droughts in the next two decades, according to a new study published in the journal Nature Climate Change. Moreover, global warming will play an increasingly important role in their frequency and severity, says Aiguo Dai, the study’s author.

His findings bolster conclusions from climate models used by researchers around the globe that have predicted severe and widespread droughts in coming decades over many land areas. Those models had been questioned because they did not fully reflect actual drought patterns when they were applied to conditions in the past. However, using a statistical method with data about sea surface temperatures, Dai, a climate researcher at the federally funded National Center for Atmospheric Research, found that the model accurately portrayed historic climate events.

“We can now be more confident that the models are correct,” Dai said, “but unfortunately, their predictions are dire.”

In the United States, the main culprit currently is a cold cycle in the surface temperature of the eastern Pacific Ocean. It decreases precipitation, especially over the western part of the country. “We had a similar situation in the Dust Bowl era of the 1930s,” said Dai, who works at the research center’s headquarters in Boulder, Colo.

While current models cannot predict the severity of a drought in a given year, they can assess its probability. “Considering the current trend, I was not surprised by the 2012 drought,” Dai said.

The Pacific cycle is expected to last for the next one or two decades, bringing more aridity. On top of that comes climate change. “Global warming has a subtle effect on drought at the moment,” Dai said, “but by the end of the cold cycle, global warming might take over and continue to cause dryness.”

While the variations in sea temperatures primarily influence precipitation, global warming is expected to bring droughts by increasing evaporation over land. Additionally, Dai predicts more dryness in South America, Southern Europe and Africa.

“The similarity between the observed droughts and the projections from climate models here is striking,” said Peter Cox, a professor of climate system dynamics at Britain’s University of Exeter, who was not involved in Dai’s research. He added that he agrees the increasing drought suggested by the latest models is consistent with man-made climate change.

Calgary hail storm: Cloud seeding credited for sparing city from worse disaster (The Calgary Herald)

‘The storm was a monster,’ says weather modification company

BY THANDI FLETCHER, CALGARY HERALD AUGUST 14, 2012

Paul Newell captured dramatic images in the Bearspaw area of northwest Calgary just before the start of the hailstorm on Sunday, Aug. 12, 2012. Photograph by: Reader photo, Paul Newell

A ferocious storm that hammered parts of Calgary with hail stones larger than golf balls late Sunday, causing millions of dollars worth of damage, could have been much worse if cloud-seeding planes hadn’t attempted to calm it down.

“The storm was a monster,” said Terry Krauss, project director of the Alberta Severe Weather Management Society, which contracts American-based company Weather Modification Inc. to seed severe weather clouds in Alberta’s skies. The society is funded by a group of insurance companies with a goal of reducing hail damage claims.

Before the storm hit, Krauss said, the company sent all four of its cloud-seeding aircraft into the thick and swirling black clouds. The planes flew for more than 12 hours, shooting silver iodide, a chemical agent that helps limit the size of hail stones, at the top and base of the clouds, until midnight.

But despite the heavy seeding, golf-ball-sized hail stones pelted parts of Calgary late Sunday night, causing widespread damage to cars and homes.

“This one was a beast. It took everything we threw at it and still was able to wreak some havoc,” said Krauss. “I believe if we hadn’t seeded, it would have even been worse.”

Northeast Calgary was hit worst by the storm, with hail between five and six centimetres across, said Environment Canada meteorologist John Paul Craig. Other parts of the city saw toonie-sized hail from a second storm system, Craig said.

Craig said Sunday’s storm was worse than Calgary’s last major hailstorm, in July 2010, which produced four-centimetre hail stones.

“These hail stones were just a little bit bigger,” he said.

At Royal Oak Audi in the city’s northwest, broken glass from smashed windows littered the lot Monday morning. Of the 85 new and used cars on the lot, general manager Murray Dorren said not a single car was spared from the storm.

“It’s devastating — that’s probably the best word I can come up with,” he said. “It’s unbelievable that Mother Nature can do this much damage in a very short time. I think it probably took a matter of 10 minutes and there’s millions of dollars worth of damage.”

Dorren estimated the damage at about $2 million. Across the lot, the dinged-up vehicles looked like dimpled golf balls from the repetitive pounding of the sizable stones. Some windows and sunroofs were shattered, while others were pierced by the heavy hail.

“They look like bullet holes right through the windscreen,” salesman Nick Berkland said of the damage.

Insurance companies and brokers were inundated with calls all day as customers tried to file claims on their wrecked cars and homes.

Ron Biggs, claims director for Intact Insurance, said it’s too early to tell how many claims the hail event will spawn, although he said they received about two to three times their normal call volume on Monday.

Biggs said the level of damage so far appears to be similar to the July 2010 hailstorm, when Intact received about 12,000 hail damage claims.

Chief operating officer Bruce Rabik of Rogers Insurance, which insures several car dealerships in Calgary, said the damage is extensive.

“It’s certainly a bad one,” he said. “We’ve had one dealership, which they estimate 600 damaged cars. A couple other dealerships with 200 damaged cars each.”

Rabik said claims adjusters are overwhelmed with the volume of claims. He urged customers to be patient as it may take a day or two as insurance workers make their way to each home.

Shredded leaves, twigs and broken branches blanketed pathways along the Bow and Elbow rivers as city crews worked to clear them, said Calgary parks pathway lead Duane Sutherland.

“This was the worst that I’ve seen,” said Sutherland.

Once daylight broke Monday, Royal Oak resident Satya Mudlair inspected the exterior of his home, which was riddled with damage. “Lots of holes in the siding, window damage to the two bedroom windows, and the roof a little bit,” he said.

The apple tree in his backyard has also lost about half its apples, he said. Fortunately, his car was parked inside the garage and was spared any dents.

Mudlair said his insurance company told him it would take two or three weeks before the damage would be repaired. “There’s a big pile of names ahead of me,” he said.

Mudlair’s wife, Nirmalla, had just fallen asleep when she was awoken by the sound of hail stones hitting the roof.

“It was very bad. It was like, thump, thump,” she described the pelting sound. “We got scared and I kept running from room to room.”

Cloud-seeding expert Krauss said Calgary has experienced more severe weather than usual this year, although Sunday’s storm was by far the worst.

“It has been a very stormy year,” he said.

© Copyright (c) The Calgary Herald

Preventing environmental catastrophes (FAPERJ)

Vilma Homero

05/07/2012

Nelson Fernandes / UFRJ

New methods can predict where and when landslides will occur in the mountain region
When several areas of Nova Friburgo, Petrópolis and Teresópolis suffered landslides in January 2011, burying more than a thousand people under tons of mud and debris, the question left hanging was whether the disaster could have been mitigated. If the Institute of Geosciences of the Federal University of Rio de Janeiro (UFRJ) has its way, the consequences of environmental cataclysms like these will become ever smaller. To that end, its researchers are developing a series of multidisciplinary projects to build risk-analysis systems. One of them is Prever, which, supported by computational tools, joins advances in remote sensing, geoprocessing, geomorphology and geotechnics with mathematical weather-prediction modeling for the areas most susceptible to landslides, such as the mountain region.

“Although conditions in the region’s various municipalities differ considerably, they share a lack of methodologies for predicting this type of risk. The essential task now is to develop methods capable of predicting the location of these processes in space and time – that is, knowing ‘where’ and ‘when’ landslides may occur,” explains geologist Nelson Ferreira Fernandes, a professor in UFRJ’s Department of Geography and a FAPERJ Scientist of Our State grantee.

To devise real-time risk-prediction methods that account for mass movements triggered by rainfall, the researchers are building maps from successive satellite images, cross-referenced with geological and geotechnical maps. Prever combines climate-simulation models and forecasts of extreme rainfall events, developed in meteorology, with mathematical prediction models and with the geomorphological and geotechnical information that indicates the areas most susceptible to landslides. “That way we can draw up real-time risk forecasts, classifying the results according to the severity of the risk, which varies continuously in space and time,” Nelson explains.

To that end, the Departments of Geography, Geology and Meteorology of UFRJ’s Institute of Geosciences have joined forces with the Faculty of Geology of the Rio de Janeiro State University (Uerj) and the Department of Civil Engineering of the Pontifical Catholic University (PUC-Rio). By overlaying these layers of information, the resulting images can pinpoint the areas most prone to landslides. “By adding this academic knowledge to data from state agencies, such as the Disaster Analysis Unit (Nade) of the Department of Mineral Resources (DRM-RJ), which provides technical support to the Civil Defense, we will not only be constantly updating the maps now used by state government agencies and the Civil Defense, but also enabling more precise planning for decision-making.”

Divulgação / UFRJ

A simulation shows the possibility of a mass landslide in the Jacarepaguá region

The new mapping also means better quality, greater precision and more detailed imagery. “Obviously, with better instruments in hand – meaning more detailed and accurate maps – public managers will also be able to plan and act more precisely and in real time,” says Nelson. According to the researcher, these maps need constant updating to keep pace with how human occupation reshapes the topography of the various regions. “This has been happening through the cutting of slopes, the occupation of landfilled areas, and changes resulting from river drainage. All of this alters the topography and, under heavier and more prolonged rains, can make certain soils more prone to landslides or to flooding,” Nelson notes.

But disaster and environmental-risk analysis systems also encompass other lines of research. Prever works along two distinct lines of action. “One is climate, in which we detect the areas that will see increased rainfall over the long term and supply that information to planning and decision-making bodies. The other is very-short-term forecasting, known as nowcasting.” On the long-term side, professor Ana Maria Bueno Nunes, of the same university’s Department of Meteorology, has been working on the project “Implementation of a Regional Modeling System: Weather and Climate Studies,” under her coordination, which proposes a reconstruction of South America’s hydroclimate as an extension of that project.

“By joining satellite precipitation data with information from atmospheric stations, computational modeling lets us produce precipitation estimates. That way we can not only know when heavier or more prolonged rains will occur, but also look back at past maps to see which convergence of factors produced a disaster. Reconstruction is a way of studying the past to understand present scenarios that look similar. And with that, we help improve the forecast models,” says Ana. This information, initially intended for academic and scientific use, will yield increasingly detailed data on how major rains form – the kind capable of flooding certain areas. “This will allow us not only to better understand the conditions under which certain calamities occur, but also to predict when those conditions may recur. Through the project, we are also training even more specialized researchers in this field,” says the researcher, whose work is supported by a FAPERJ research grant (APQ 1).

Also a member of the project, UFRJ professor Gutemberg Borges França explains that there are three types of weather forecast: synoptic, which covers roughly 6 hours to seven days over a few thousand kilometres, such as the South American continent; mesoscale, which covers roughly 6 hours to two days over a few hundred kilometres, such as the state of Rio de Janeiro; and short-term, or nowcasting, which ranges from a few minutes up to 3 to 6 hours over a specific area of a few kilometres, such as the Rio de Janeiro metropolitan region.

If long-term forecasts are important, so are short-term ones, or nowcasting. According to Gutemberg, current numerical prediction models are still poor at short-term forecasting, which ends up relying largely on the meteorologist’s experience in interpreting information from the various available data sources: satellite images; surface and upper-air weather stations; radar and sodar (Sonic Detection and Ranging); and numerical models. “Even today, however, the meteorologist lacks objective tools to help integrate these diverse sources into a more accurate short-term forecast,” Gutemberg argues.

Rio de Janeiro already has satellite receiving stations, an upper-air (radiosonde) station that generates atmospheric profiles, surface weather stations and radar. Since 2005, the Applied Meteorology Laboratory of UFRJ’s Department of Meteorology has been developing short-term forecasting tools that use computational intelligence, aiming to improve forecasts of extreme weather events for Rio de Janeiro. “With computational intelligence, we get that information faster and more accurately,” he sums up.

© FAPERJ – All articles may be reproduced, provided the source is credited.

Interactive festival lets visitors experience environmental disaster situations (Agência Brasil)

01/6/2012 – 10h42

by Thais Leitão, Agência Brasil

Rio de Janeiro – A forest that bursts into flames, endangering its animals and vegetation; an intact glacier that suddenly begins to melt; a house hit by flooding. All of these situations, brought on by environmental imbalance, can be experienced by the public during Green Nation Fest, an interactive, sensory festival that opened today (31) at Quinta da Boa Vista, in Rio de Janeiro’s north zone, and runs until June 7.

According to Marcos Didonet, director of the non-governmental organization (NGO) Centro de Cultura, Informação e Meio Ambiente (Cima), which organizes the event, the objective is to offer visitors hands-on experiences and encourage the public to act more sustainably. Cima has been running projects in partnership with private, governmental and multilateral institutions for more than 20 years.

“The goal is to reach the broad public that is not used to engaging with environmental issues, presenting the subject in a more interesting, pleasant and practical way. To do that, our artists and scientists devised these installations, which produce sensations that will become ever more frequent if we do not change our consumption patterns and everyday behavior,” he said.

The site also has tents hosting playful, educational workshops. In one of them, set up by the State Environmental Institute (Inea), a group of 30 students from Rio’s municipal schools learned today to make wallets out of milk cartons and fabric scraps.

For 14-year-old student Ana Beatriz Leão, the idea is creative and could be used to make gifts for friends. “It’s cool because we usually just throw it in the trash, and now we know the carton can be turned into other things. The one I made, I’m going to give to a friend who I’m sure will love it,” the teenager said.

In the same tent, visitors can see other products made from reused materials, such as a small drum kit made from soda cans, children’s books made with fabric scraps, and dolls made from shoe boxes.

Among the boys, one of the favorite activities is the Bicycle Goal, in which participants pedal to generate energy for their team. For every watt generated, a goal is scored for the team of their choice. The pedaling also charges a battery that supplies energy to another of the festival’s installations.

Friends Gustavo Fonseca and Roberto Damião, both 11 and also students in Rio’s municipal schools, said the experience was “really intense.”

“It was really cool because we learned another way to generate energy and even scored a goal for Mengão,” said Roberto, who supports Flamengo.

The event, which is free to attend, also offers an International Film Showcase with 12 feature films, plus seminars, open to debate, with Brazilian and international guests on the green and creative economy. The full program is available at www.greennationfest.com.br.

* Originally published on the Agência Brasil website.


Resilient People More Satisfied With Life (Science Daily)

ScienceDaily (May 23, 2012) — When confronted with adverse situations such as the loss of a loved one, some people never fully recover from the pain. Others, the majority, pull through and experience how the intensity of negative emotions (e.g. anxiety, depression) grows dimmer with time until they adapt to the new situation. A third group is made up of individuals whose adversities have made them grow personally and whose life takes on new meaning, making them feel stronger than before.

Researchers at the Basic Psychology Unit at Universitat Autònoma de Barcelona analyzed the responses of 254 students from the Faculty of Psychology to a set of questionnaires. The purpose was to evaluate their level of satisfaction with life and to look for connections between their resilience and their capacity for emotional recovery, a component of emotional intelligence that consists of the ability to regulate one’s own emotions and those of others.

The data show that the most resilient students, 20% of those surveyed, are more satisfied with their lives and are also those who believe they have control over their emotions and their state of mind. Resilience therefore positively predicts satisfaction with one’s life.

“Some of the characteristics of being resilient can be worked on and improved, such as self-esteem and being able to regulate one’s emotions. Learning these techniques can offer people the resources needed to help them adapt and improve their quality of life”, explains Dr Joaquín T Limonero, professor of the UAB Research Group on Stress and Health at UAB and coordinator of the research.

Published recently in Behavioral Psychology, the study included the participation of UAB researcher Jordi Fernández Castro; professors of the Gimbernat School of Nursing (a UAB-affiliated centre) Joaquín Tomás-Sábado and Amor Aradilla Herrera; and psychologist and researcher of Egarsat, M. José Gómez-Romero.

How Twitter Is Used to Share Information After a Disaster (Science Daily)

ScienceDaily (May 22, 2012) — A study from North Carolina State University shows how people used Twitter following the 2011 nuclear disaster in Japan, highlighting challenges for using the social media tool to share information. The study also indicates that social media haven’t changed what we communicate so much as how quickly we can disseminate it.

“I wanted to see if Twitter was an effective tool for sharing meaningful information about nuclear risk in the wake of the disaster at the Fukushima Daiichi nuclear plant,” says Dr. Andrew Binder, an assistant professor of communication at NC State and author of a paper describing the work. “I knew people would be sharing information, but I wanted to see whether it was anecdotal or substantive, and whether users were providing analysis and placing information in context.

“In the bigger picture, I wanted to see whether social media is changing the way we communicate, or if we are communicating the same way using different tools.”

Binder searched for Twitter posts, or “tweets,” originating in the United States that specifically referenced “Fukushima Daiichi” – which is the name of the nuclear plant – rather than searching for the term “Fukushima.” This allowed him to target tweets about the plant instead of general references to the tsunami and overarching disaster in the region. Using that as a base, Binder then selected every 20th tweet on each day over the two weeks following the onset of the Fukushima disaster – from March 11 to March 25, 2011 – to create a representative sample of these tweets.
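Binder's procedure (filter for tweets containing the exact phrase, then keep every 20th matching tweet per day) is a systematic sample, and can be sketched as follows. This is an illustrative reconstruction, not the study's actual code; the tweet data structure and its `date` and `text` fields are hypothetical, since the paper does not specify a format.

```python
from collections import defaultdict

def build_sample(tweets, phrase="Fukushima Daiichi", step=20):
    """Systematic sample: keep every `step`-th matching tweet per day.

    `tweets` is assumed to be an iterable of dicts with hypothetical
    'date' and 'text' fields (an assumption for illustration).
    """
    # Group tweets that mention the exact phrase by calendar day,
    # excluding general "Fukushima" mentions about the wider disaster.
    by_day = defaultdict(list)
    for tweet in tweets:
        if phrase in tweet["text"]:
            by_day[tweet["date"]].append(tweet)

    # Within each day, take every `step`-th tweet (the 1st, 21st, 41st, ...).
    sample = []
    for day in sorted(by_day):
        sample.extend(by_day[day][::step])
    return sample
```

Sampling every 20th tweet per day, rather than a fixed number per day, keeps the sample proportional to each day's tweet volume, which is what makes it representative of how interest rose and fell over the two weeks.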

Fifteen percent of the tweets in the sample contained some mention of risk-related terms, such as hazard or exposure, while 17.7 percent of the tweets included language that helped place the events at Fukushima Daiichi and their potential causes or consequences in context. For example, one tweet read “Most of the 100s of workers at Fukushima Daiichi live close to the plant so it’s their families and houses at risk.” Overall, 54 percent of the tweets included hyperlinks to external websites, of which 62.7 percent linked to traditional news sources.

“I found that, initially, tweets that mentioned risk were unlikely to include analysis or information on context,” Binder says. “Similarly, tweets that attempted to help understand events at Fukushima Daiichi rarely mentioned risk. By the time people began tweeting about risk issues in a meaningful way, the initial high level of interest had begun to wane significantly.”

Binder also found that people were more likely to include links to websites as time went on. And, as time passed, a higher percentage of those links were to traditional news sites.

“This highlights a significant problem,” Binder says. “People are clearly looking to news outlets for insight and analysis into disasters such as this one. But news organizations have fewer and fewer reporters who specialize in covering science and technology issues – and those are the very reporters who would be able to provide insight and analysis on these events.”

The study also seems to imply that social media have not significantly changed the content of our communications. “This case, at least, indicates that Twitter is allowing people to share news quickly and easily,” Binder says. “But the news they are sharing is not much different from that available to someone reading a print newspaper – they’re simply getting it sooner.”

Are You Prepared for Zombies? (American Anthropological Association)

by Joslyn O.

Today’s guest blog post is by cultural anthropologist and AAA member Chad Huddleston. He is an assistant professor at St. Louis University in the Department of Sociology, Anthropology and Criminal Justice.

Recently, a host of new shows, such as Doomsday Preppers on NatGeo and Doomsday Bunkers on Discovery Channel, has focused on people with a wide array of concerns about possible events that may threaten their lives. Both of these shows focus on what are called ‘preppers.’ While people who engaged in these behaviors in the past might have been called ‘survivalists,’ many ‘preppers’ have distanced themselves from that term because of its cultural baggage: stereotypical anti-government, gun-loving, racist extremists most often associated with the fundamentalist (politically and religiously) right side of the spectrum.

I’ve been doing fieldwork with preppers for the past two years, focusing on a group called Zombie Squad. It is ‘the nation’s premier non-stationary cadaver suppression task force,’ as well as a grassroots 501(c)(3) charity organization. Zombie Squad’s story is that while the zombie removal business is generally slow, there is no reason to be unprepared. So, while it waits for the “zombpocalypse,” it focuses its time on disaster preparedness education for its membership and the community.

The group’s position is that being prepared for zombies means that you are prepared for anything, especially those events that are much more likely than a zombie uprising – tornadoes, an interruption in services, ice storms, flooding, fires, and earthquakes.

For many in this group, Hurricane Katrina was the event that solidified their resolve to prep. They saw what we all saw – a natural disaster in which services were not available for most, leading to violence, death and chaos. Their argument is that the more prepared the public is before a disaster occurs, the fewer resources they will require from first responders and the agencies that come after them.

In fact, instead of being a victim of natural disaster, you can be an active responder yourself, if you are prepared. Prepare they do. Members are active in gaining knowledge of all sorts – first aid, communications, tactical training, self-defense, first responder disaster training, as well as many outdoor survival skills, like making fire, building shelters, hunting and filtering water.

This education begins with the individual and is reinforced through the online forum they maintain (which has just under 30,000 active members from all over the world), at monthly local meetings all over the country, and at annual national gatherings in southern Missouri, where they socialize, learn survival skills and practice sharpshooting.

Sound like those survivalists of the past? Emphatically no. Zombie Squad’s message is one of public education and awareness, very successful charity drives for a wide array of organizations, and inclusion of all ethnicities, genders, religions and politics. Yet the group is adamant about leaving politics and religion out of discussions of the group and of prepping. You will not find exclusive language on their forum or in their media. That is not to say that individuals in the group do not have opinions on one side or the other of these issues, but those issues are not to be discussed within the community of Zombie Squad.

Considering that the ‘future doom’ scenarios and the fears pushed on the shows mentioned above usually involve protecting yourself first from the disaster and then from the other people who survived it, Zombie Squad is a refreshing twist on the ‘prepper’ discourse. After all, if a natural disaster were to befall your region, whom would you rather have knocking at your door: ‘raiders’ or your neighborhood Zombie Squad member?

And the answer is no: they don’t really believe in zombies.

US police sentenced for Katrina killings (Al Jazeera)

The brother of Lance Madison (C) was shot dead on September 4, 2005, at the Danziger Bridge in New Orleans [Reuters]

Five ex-police officers given prison terms for roles in shootings and cover-up in days after Hurricane Katrina in 2005.

Last Modified: 05 Apr 2012 01:03

Five former New Orleans police officers have been sentenced to prison terms ranging from six to 65 years for their roles in deadly shootings of unarmed residents in the chaotic days after Hurricane Katrina.

The presiding judge lashed out at prosecutors for two hours on Wednesday over their handling of the case, in which police shot six people at a bridge on September 4, 2005, killing two, less than a week after Katrina made landfall.

To make the shootings appear justified, officers conspired to plant a gun, fabricate witnesses and falsify reports. The case became the centerpiece of the US Justice Department’s push to clean up the troubled New Orleans Police Department.

Kenneth Bowen, Robert Gisevius, Anthony Villavaso and Robert Faulcon were convicted of federal firearms charges that carried mandatory minimum prison sentences of at least 35 years. Retired officer Arthur Kaufman, who was assigned to investigate the shootings, was convicted of helping orchestrate the cover-up.

Faulcon, who was convicted on charges in both fatal shootings, faces the stiffest sentence of 65 years. Bowen and Gisevius each face 40 years, while Villavaso was sentenced to 38. Kaufman received the lightest sentence at six years.

Community ‘disservice’

Afterward, US District Judge Kurt Engelhardt accused prosecutors of cutting overly lenient plea deals with five other officers who cooperated with the civil rights investigation. The former officers pleaded guilty to helping cover up the shooting and are already serving prison terms ranging from three to eight years.

“These through-the-looking-glass plea deals that tied the hands of this court … are an affront to the court and a disservice to the community,” Engelhardt said.

The judge also questioned the credibility of the officers who pleaded guilty and testified against those who went to trial.

In particular, the judge criticized prosecutors for seeking a 20-year prison sentence for Kaufman while Michael Lohman, the highest-ranking officer at the scene of the shooting, received four years under his plea deal for participating in the cover-up.

‘Unbearable’ pain

Engelhardt heard several hours of arguments and testimony earlier on Wednesday from prosecutors, defense attorneys, relatives of shooting victims and the officers. Ronald Madison and 17-year-old James Brissette died in the shootings.

“This has been a long and painful six-and-a-half years,” said Lance Madison, whose 40-year-old, mentally disabled brother, Ronald, was killed at the bridge. “The people of New Orleans and my family are ready for justice.”

Madison individually addressed each defendant, including Faulcon, who shot his brother: “When I look at you, my pain becomes unbearable. You took the life of an angel and basically ripped my heart out.”

Madison also said he was horrified by Kaufman’s actions in the cover-up: “You tried to frame me, a man you knew was innocent, and send me to prison for the rest of my life.”

Lance Madison was arrested on attempted murder charges after police falsely accused him of shooting at the officers on the bridge. He was jailed for three weeks before a judge freed him.

None of the officers addressed the court before they were sentenced.

Chaotic aftermath

Katrina struck on August 29, 2005, leading to the collapse of levees and flooding an estimated 80 per cent of the city. New Orleans was plunged into chaos as residents who hadn’t evacuated were driven from their homes to whatever high ground they could find.

Officers who worked in the city at the time but were not charged in the bridge case told Engelhardt on Wednesday of the lawlessness that followed the flood, saying they had feared for their lives.

On the morning of September 4, one group of residents was crossing the Danziger Bridge in the city’s Gentilly area in search of food and supplies when police arrived.

The officers had received calls that shots were being fired. Gunfire reports were common after Katrina.

Faulcon was convicted of fatally shooting Madison, but the jury decided the killing didn’t amount to murder. He, Gisevius, Bowen and Villavaso were convicted in Brissette’s killing, but jurors didn’t hold any of them individually responsible for causing his death.

All five officers were convicted of participating in a cover-up.

Global Warming May Worsen Effects of El Niño, La Niña Events (Climate Central)

Published: October 12th, 2011

By Michael D. Lemonick

Does this mean Texas is toast?

As just about everyone knows, El Niño is a periodic unusual warming of the surface water in the eastern and central tropical Pacific Ocean. Actually, that’s pretty much a lie. Most people don’t know the definition of El Niño or its mirror image, La Niña, and truthfully, most people don’t much care.

What you do care about if you’re a Texan suffering through the worst one-year drought on record, or a New Yorker who had to dig out from massive snowstorms last winter (tied in part to La Niña), or a Californian who has ever had to deal with the torrential rains that trigger catastrophic mudslides (linked to El Niño), is that these natural climate cycles can elevate the odds of natural disasters where you live.

We’re now entering the second year of the La Niña part of the cycle. La Niña is one key reason why the Southwest was so dry last winter and through the spring and summer, and since La Niña is projected to continue through the coming winter, Texas and nearby states aren’t likely to get much relief.

Precipitation outlook for winter 2011-12, showing the likelihood of below average precipitation in Texas and other drought-stricken states.

But Niñas and Niños (the broader cycle, for you weather/climate geeks, is known as the “El Niño-Southern Oscillation,” or “ENSO”) don’t just operate in isolation. They’re part of the broader climate system, which means that climate change could theoretically change how they operate — make them develop more frequently, for example, or less frequently, or be more or less pronounced. Climate change could also intensify the effects of El Niño and La Niña events.

Climate scientists have been wrestling with the first question for a while now, and they still don’t really have a definitive answer. Some climate models have suggested that global warming has already begun to cause subtle changes in ENSO cycles, and that the changes will become more pronounced later this century. But a new study, published in the Journal of Climate, doesn’t find much evidence for that.

But on the second question, the new study is a lot more definitive. “Due to a warmer and moister atmosphere,” said co-author Baylor Fox-Kemper, of the University of Colorado in a press release, “the impacts of El Niño are changing even though El Niño itself doesn’t change.”

That’s because global warming has begun to change the playing field on which El Niño and La Niña operate, just as it’s changing the background conditions that give rise to our everyday weather. The Texas drought is a prime example. Its most likely cause is reduced rainfall from La Niña-related weather patterns. But however dry Texas and Oklahoma might have been otherwise, the killer heat wave that plagued the region this past summer — the sort of heat wave global warming is already making more commonplace — baked much of the remaining moisture out of both the soil and vegetation. No wonder large parts of the Lone Star State have gone up in smoke.

A map of sea surface temperature anomalies, showing a swath of cooler than average waters in the central and eastern tropical Pacific Ocean – a telltale sign of La Niña conditions.

When the next El Niño occurs in a year or two, it will probably bring heavy rains to places like Southern California, whose unstable hillsides tend to slide when soggy. Except now, thanks to global warming, the typical El Niño-related storms that roll in off the Pacific may well be turbocharged, since a warmer atmosphere can hold more water. This is the reason, say many climate scientists, that downpours have become heavier in recent decades across broad geographical areas.

La Niña, plus the added moisture in the air from global warming, has also been partially implicated in the massive snowstorms that struck the Northeast and Mid-Atlantic states during the last two winters. Those could get worse as well, suggests the new analysis. “What we see,” says Fox-Kemper, “is that certain atmospheric patterns, such as the blocking high pressure south of Alaska typical of La Niña winters, strengthen…so, the cooling of North America expected in a La Niña winter would be stronger in future climates.” So to pre-answer the question that will inevitably be asked next winter: no, more snow does NOT contradict the idea that the planet is warming. Quite the contrary.

Finally, for those who really do want to know what El Niño and La Niña actually are, as opposed to what they do, you can go to NOAA’s El Niño page. But be warned: there will be a quiz, and the word “thermocline” will appear.

Radiation Leaks at Nuclear Plant in Rio (Correio Braziliense)

JC e-mail 4367, October 19, 2011.

There were three leaks inside the Nuclear Fuel Factory (Fábrica de Combustível Nuclear), owned by the federal government, in Resende (RJ). Two of them involved chemical substances; the other, highly radioactive enriched uranium. The company admits “failures” but rules out harm to employees and the environment.

A radioactive product leaked at a nuclear plant in Resende (RJ). The company, owned by the federal government, confirms the incident and acknowledges equipment “failures,” but rules out harm to employees and the environment.

Engineers and workplace-safety technicians detected three leaks inside the Nuclear Fuel Factory (FCN) in Resende (RJ), two of them involving chemical substances and one involving enriched uranium (UO2), a highly radioactive material. The engineers and technicians reported the leaks to their superiors in internal e-mails. Correio obtained copies of those e-mails.

The uranium powder leaked from a piece of equipment called a homogenizer and fell onto the floor of the room. The episode was recorded on July 14, 2009. In January 2010, the factory's alert alarm was triggered by a leak of the liquefied gas used in the furnace that burns off excess gases from the production of uranium pellets. And in July of this year, an engineer suspected an ammonia leak and reported it to managers.

The three incidents posed no risk to workers, the environment, or the plant's operation, according to the factory's management (the plant belongs to the federal government) and the president of the National Nuclear Energy Commission (Cnen), the agency responsible for overseeing radioactive activities in Brazil. “The uranium stayed in a confined, hermetically sealed room; it did not reach the environment,” says Samuel Fayad Filho, FCN's director of Nuclear Fuel Production. He acknowledges equipment “failures” and says that “all procedures were followed” in response to the problems detected. “There is no leak of radioactive material in Resende,” he asserts.

Correio consulted specialists about what the information circulating internally at the FCN means. For nuclear engineer Aquilino Senra, “it is clear that there was a failure.” “UO2 powder was not supposed to escape from that press,” says the engineer, deputy director of the Alberto Luiz Coimbra Institute for Graduate Studies and Research in Engineering (Coppe) at the Federal University of Rio de Janeiro. “The leak of UO2 from the press and the presence of the substance on the floor are a clear abnormality.”

Regarding the liquefied gas leak, Aquilino says that “leaked gas is not a good thing.” “Detectors exist for that, but the point is why the gas leaked.” Correio also heard from a technician linked to the Presidency of the Republic, speaking on condition of anonymity: “It does not seem like a serious problem to me, since the Presidency was not notified,” he says.

Functions – The FCN is a complex of factories responsible for assembling fuel elements, manufacturing uranium powder and pellets, and carrying out a small share of uranium enrichment. The ore is mined in Caetité (BA). Enrichment is done almost entirely outside the country, but part of it already takes place at the FCN. Besides that small share of enrichment, the factory produces the pellets used to generate nuclear power at the Angra 1 and Angra 2 plants in Angra dos Reis (RJ).

Today, the FCN is responsible for enriching 10 percent of the uranium needed for Angra 1 and 5 percent for Angra 2, according to Samuel Fayad. The FCN is part of the state-owned Indústrias Nucleares do Brasil (INB), under the Ministry of Science, Technology and Innovation (MCT).

The uranium powder leak was reported by a workplace-safety technician to senior coordinators. Cnen confirmed the alert to Correio. “The event is irrelevant in safety terms. The powder was identified in a controlled area, inside an environment with containment for radioactive material, and did not affect workers at the unit or the environment,” the agency maintains, through its press office.

Crisis – The nuclear power sector is experiencing a conflict and a crisis within the federal government. The president of the National Nuclear Energy Commission (Cnen), Angelo Padilha, took office on July 7, after the Minister of Science and Technology, Aloizio Mercadante, dismissed Odair Dias Gonçalves. Odair lost his post after revelations that the Angra 2 plant had operated for 10 years without a definitive license and that Brazil had begun importing uranium because of stalled licenses. So far, the Nuclear Energy Regulatory Agency is only a proposal, owing to conflicts within the sector. The agency would take over the functions of regulation and oversight from Cnen, the main shareholder of Indústrias Nucleares do Brasil.

Secitece Holds First “Ceará Faz Ciência” Forum (Funcap)

BY ADMIN, 13/10/2011

With the Secitece Communications Office

The event will be held on October 17 and 18 in the auditorium of the Planetarium at the Dragão do Mar Center of Art and Culture.

On October 17 and 18, the Department of Science, Technology and Higher Education (Secitece) will hold the “I Fórum Ceará Faz Ciência,” on the theme “Climate change, natural disasters and risk prevention.” The initiative is part of the state program for National Science and Technology Week.

The Secretary of Science and Technology, René Barreira, will open the event on the 17th at 5 p.m. in the auditorium of the Rubens de Azevedo Planetarium. On the occasion, tribute will be paid to the Ceará researcher Expedito Parente, known as the father of biodiesel, who died in September.

On the 18th, from 9 a.m., activities resume with the following talks: “A giant wave on the Brazilian coast: is it possible?”, by Prof. Francisco Brandão, head of the Seismology Laboratory of the State Civil Defense Coordination Office, and “The four seasons of the year in Ceará: notice their effects on physiology and the environment,” given by Dermeval Carneiro, professor of physics and astronomy, president of the Brazilian Society of Friends of Astronomy and director of the Rubens de Azevedo Planetarium – Dragão do Mar.

In the afternoon, from 2:30 p.m., there will be the talk “Natural disasters: how to prevent and act in risk situations,” by Lieutenant Colonel Leandro Silva Nogueira, executive secretary of the State Civil Defense Coordination Office. Closing the forum, at 4:30 p.m., Sonia Barreto Perdigão, an agricultural engineer in the Department of Water Resources and the Environment at the Ceará Foundation for Meteorology and Water Resources (Funceme), will give a talk on “Climate Change and Desertification in Ceará.”

Those interested in taking part in the I Fórum “Ceará Faz Ciência,” to be held on October 17 and 18 at the Dragão do Mar in Fortaleza, should pre-register. The registration form is available on the Secitece website. Participation is free.

Event details
I Fórum Ceará Faz Ciência
Date: October 17 and 18, 2011
Venue: Auditorium of the Rubens de Azevedo Planetarium
Information: (85) 3101-6466
Registration is free.