Tag archive: science

Global Warming’s Terrifying New Math (Rolling Stone)

Three simple numbers that add up to global catastrophe – and that make clear who the real enemy is

by: Bill McKibben

Illustration by Edel Rodriguez

If the pictures of those towering wildfires in Colorado haven’t convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average. The odds of that happening by simple chance are about one in 3.7 × 10⁹⁹, a number considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the “largest temperature departure from average of any season on record.” The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet’s history.

Not that our leaders seemed to notice. Last month the world’s nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn’t even attend. It was “a ghost of the glad, confident meeting 20 years ago,” the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls “once thronged by multitudes.” Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I’ve spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we’re losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world’s nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the “most important gathering since the Second World War, given what is at stake.” As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: “This is our chance. If we miss it, it could take years before we get a new and better one. If ever.”

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving “Copenhagen Accord” that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. “Copenhagen is a crime scene tonight,” an angry Greenpeace official declared, “with the guilty men and women fleeing to the airport.” Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized “the scientific view that the increase in global temperature should be below two degrees Celsius.” And in the very next paragraph, it declared that “we agree that deep cuts in global emissions are required… so as to hold the increase in global temperature below two degrees Celsius.” By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8, and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we’ve raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. “Any number much above one degree involves a gamble,” writes Kerry Emanuel of MIT, a leading authority on hurricanes, “and the odds become less and less favorable as the temperature goes up.” Thomas Lovejoy, once the World Bank’s chief biodiversity adviser, puts it like this: “If we’re seeing what we’re seeing today at 0.8 degrees Celsius, two degrees is simply too much.” NASA scientist James Hansen, the planet’s most prominent climatologist, is even blunter: “The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster.” At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: “Some countries will flat-out disappear.” When delegates from developing nations were warned that two degrees would represent a “suicide pact” for drought-stricken Africa, many of them started chanting, “One degree, one Africa.”

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it’s fair to say that it’s the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world’s carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can’t raise the temperature more than two degrees Celsius – it’s become the bottomest of bottom lines. Two degrees.

The Second Number: 565 Gigatons

Scientists estimate that humans can pour roughly 565 more gigatons of carbon dioxide into the atmosphere by midcentury and still have some reasonable hope of staying below two degrees. (“Reasonable,” in this case, means four chances in five, or somewhat worse odds than playing Russian roulette with a six-shooter.)

This idea of a global “carbon budget” emerged about a decade ago, as scientists began to calculate how much oil, coal and gas could still safely be burned. Since we’ve increased the Earth’s temperature by 0.8 degrees so far, we’re currently less than halfway to the target. But, in fact, computer models calculate that even if we stopped increasing CO2 now, the temperature would likely still rise another 0.8 degrees, as previously released carbon continues to overheat the atmosphere. That means we’re already three-quarters of the way to the two-degree target.

How good are these numbers? No one is insisting that they’re exact, but few dispute that they’re generally right. The 565-gigaton figure was derived from one of the most sophisticated computer-simulation models that have been built by climate scientists around the world over the past few decades. And the number is being further confirmed by the latest climate-simulation models currently being finalized in advance of the next report by the Intergovernmental Panel on Climate Change. “Looking at them as they come in, they hardly differ at all,” says Tom Wigley, an Australian climatologist at the National Center for Atmospheric Research. “There’s maybe 40 models in the data set now, compared with 20 before. But so far the numbers are pretty much the same. We’re just fine-tuning things. I don’t think much has changed over the last decade.” William Collins, a senior climate scientist at the Lawrence Berkeley National Laboratory, agrees. “I think the results of this round of simulations will be quite similar,” he says. “We’re not getting any free lunch from additional understanding of the climate system.”

We’re not getting any free lunch from the world’s economies, either. With only a single year’s lull in 2009 at the height of the financial crisis, we’ve continued to pour record amounts of carbon into the atmosphere, year after year. In late May, the International Energy Agency published its latest figures – CO2 emissions last year rose to 31.6 gigatons, up 3.2 percent from the year before. America had a warm winter and converted more coal-fired power plants to natural gas, so its emissions fell slightly; China kept booming, so its carbon output (which recently surpassed that of the U.S.) rose 9.3 percent; the Japanese shut down their fleet of nukes post-Fukushima, so their emissions edged up 2.4 percent. “There have been efforts to use more renewable energy and improve energy efficiency,” said Corinne Le Quéré, who runs England’s Tyndall Centre for Climate Change Research. “But what this shows is that so far the effects have been marginal.” In fact, study after study predicts that carbon emissions will keep growing by roughly three percent a year – and at that rate, we’ll blow through our 565-gigaton allowance in 16 years, around the time today’s preschoolers will be graduating from high school. “The new data provide further evidence that the door to a two-degree trajectory is about to close,” said Fatih Birol, the IEA’s chief economist. In fact, he continued, “When I look at this data, the trend is perfectly in line with a temperature increase of about six degrees.” That’s almost 11 degrees Fahrenheit, which would create a planet straight out of science fiction.
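That 16-year figure is easy to check with a back-of-the-envelope calculation. The sketch below assumes emissions start at the 31.6 gigatons the IEA reported and compound smoothly at three percent a year; the compounding assumption is a simplification of ours, not the IEA’s own projection.

```python
# Back-of-the-envelope: years until cumulative CO2 emissions exhaust a
# 565-gigaton budget, starting from 31.6 Gt/year and growing 3% annually.
# (Starting value and growth rate come from the article; treating the growth
# as smooth annual compounding is our simplifying assumption.)
budget_gt = 565.0
emissions_gt = 31.6
growth_rate = 0.03

cumulative_gt = 0.0
years = 0
while cumulative_gt < budget_gt:
    cumulative_gt += emissions_gt
    emissions_gt *= 1 + growth_rate
    years += 1

# Prints 15 years under these exact assumptions, the same ballpark as the
# article's "16 years."
print(years, round(cumulative_gt, 1))
```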

So, new data in hand, everyone at the Rio conference renewed their ritual calls for serious international action to move us back to a two-degree trajectory. The charade will continue in November, when the next Conference of the Parties (COP) of the U.N. Framework Convention on Climate Change convenes in Qatar. This will be COP 18 – COP 1 was held in Berlin in 1995, and since then the process has accomplished essentially nothing. Even scientists, who are notoriously reluctant to speak out, are slowly overcoming their natural preference to simply provide data. “The message has been consistent for close to 30 years now,” Collins says with a wry laugh, “and we have the instrumentation and the computer power required to present the evidence in detail. If we choose to continue on our present course of action, it should be done with a full evaluation of the evidence the scientific community has presented.” He pauses, suddenly conscious of being on the record. “I should say, a fuller evaluation of the evidence.”

So far, though, such calls have had little effect. We’re in the same position we’ve been in for a quarter-century: scientific warning followed by political inaction. Among scientists speaking off the record, disgusted candor is the rule. One senior scientist told me, “You know those new cigarette packs, where governments make them put a picture of someone with a hole in their throats? Gas pumps should have something like that.”

The Third Number: 2,795 Gigatons

This number is the scariest of all – one that, for the first time, meshes the political and scientific dimensions of our dilemma. It was highlighted last summer by the Carbon Tracker Initiative, a team of London financial analysts and environmentalists who published a report in an effort to educate investors about the possible risks that climate change poses to their stock portfolios. The number describes the amount of carbon already contained in the proven coal and oil and gas reserves of the fossil-fuel companies, and the countries (think Venezuela or Kuwait) that act like fossil-fuel companies. In short, it’s the fossil fuel we’re currently planning to burn. And the key point is that this new number – 2,795 – is higher than 565. Five times higher.

The Carbon Tracker Initiative – led by James Leaton, an environmentalist who served as an adviser at the accounting giant PricewaterhouseCoopers – combed through proprietary databases to figure out how much oil, gas and coal the world’s major energy companies hold in reserve. The numbers aren’t perfect – they don’t fully reflect the recent surge in unconventional energy sources like shale gas, and they don’t accurately reflect coal reserves, which are subject to less stringent reporting requirements than oil and gas. But for the biggest companies, the figures are quite exact: If you burned everything in the inventories of Russia’s Lukoil and America’s ExxonMobil, for instance, which lead the list of oil and gas companies, each would release more than 40 gigatons of carbon dioxide into the atmosphere.

Which is exactly why this new number, 2,795 gigatons, is such a big deal. Think of two degrees Celsius as the legal drinking limit – equivalent to the 0.08 blood-alcohol level below which you might get away with driving home. The 565 gigatons is how many drinks you could have and still stay below that limit – the six beers, say, you might consume in an evening. And the 2,795 gigatons? That’s the three 12-packs the fossil-fuel industry has on the table, already opened and ready to pour.

We have five times as much oil and coal and gas on the books as climate scientists think is safe to burn. We’d have to keep 80 percent of those reserves locked away underground to avoid that fate. Before we knew those numbers, our fate had been likely. Now, barring some massive intervention, it seems certain.

Yes, this coal and gas and oil is still technically in the soil. But it’s already economically aboveground – it’s figured into share prices, companies are borrowing money against it, nations are basing their budgets on the presumed returns from their patrimony. It explains why the big fossil-fuel companies have fought so hard to prevent the regulation of carbon dioxide – those reserves are their primary asset, the holding that gives their companies their value. It’s why they’ve worked so hard these past years to figure out how to unlock the oil in Canada’s tar sands, or how to drill miles beneath the sea, or how to frack the Appalachians.

If you told Exxon or Lukoil that, in order to avoid wrecking the climate, they couldn’t pump out their reserves, the value of their companies would plummet. John Fullerton, a former managing director at JP Morgan who now runs the Capital Institute, calculates that at today’s market value, those 2,795 gigatons of carbon emissions are worth about $27 trillion. Which is to say, if you paid attention to the scientists and kept 80 percent of it underground, you’d be writing off $20 trillion in assets. The numbers aren’t exact, of course, but that carbon bubble makes the housing bubble look small by comparison. It won’t necessarily burst – we might well burn all that carbon, in which case investors will do fine. But if we do, the planet will crater. You can have a healthy fossil-fuel balance sheet, or a relatively healthy planet – but now that we know the numbers, it looks like you can’t have both. Do the math: 2,795 is five times 565. That’s how the story ends.

So far, as I said at the start, environmental efforts to tackle global warming have failed. The planet’s emissions of carbon dioxide continue to soar, especially as developing countries emulate (and supplant) the industries of the West. Even in rich countries, small reductions in emissions offer no sign of the real break with the status quo we’d need to upend the iron logic of these three numbers. Germany is one of the only big countries that has actually tried hard to change its energy mix; on one sunny Saturday in late May, that northern-latitude nation generated nearly half its power from solar panels within its borders. That’s a small miracle – and it demonstrates that we have the technology to solve our problems. But we lack the will. So far, Germany’s the exception; the rule is ever more carbon.

This record of failure means we know a lot about what strategies don’t work. Green groups, for instance, have spent a lot of time trying to change individual lifestyles: the iconic twisty light bulb has been installed by the millions, but so have a new generation of energy-sucking flatscreen TVs. Most of us are fundamentally ambivalent about going green: We like cheap flights to warm places, and we’re certainly not going to give them up if everyone else is still taking them. Since all of us are in some way the beneficiaries of cheap fossil fuel, tackling climate change has been like trying to build a movement against yourself – it’s as if the gay-rights movement had to be constructed entirely from evangelical preachers, or the abolition movement from slaveholders.

People perceive – correctly – that their individual actions will not make a decisive difference in the atmospheric concentration of CO2; by 2010, a poll found that “while recycling is widespread in America and 73 percent of those polled are paying bills online in order to save paper,” only four percent had reduced their utility use and only three percent had purchased hybrid cars. Given a hundred years, you could conceivably change lifestyles enough to matter – but time is precisely what we lack.

A more efficient method, of course, would be to work through the political system, and environmentalists have tried that, too, with the same limited success. They’ve patiently lobbied leaders, trying to convince them of our peril and assuming that politicians would heed the warnings. Sometimes it has seemed to work. Barack Obama, for instance, campaigned more aggressively about climate change than any president before him – the night he won the nomination, he told supporters that his election would mark the moment “the rise of the oceans began to slow and the planet began to heal.” And he has achieved one significant change: a steady increase in the fuel efficiency mandated for automobiles. It’s the kind of measure, adopted a quarter-century ago, that would have helped enormously. But in light of the numbers I’ve just described, it’s obviously a very small start indeed.

At this point, effective action would require actually keeping most of the carbon the fossil-fuel industry wants to burn safely in the soil, not just changing slightly the speed at which it’s burned. And there the president, apparently haunted by the still-echoing cry of “Drill, baby, drill,” has gone out of his way to frack and mine. His secretary of interior, for instance, opened up a huge swath of the Powder River Basin in Wyoming for coal extraction: The total basin contains some 67.5 gigatons worth of carbon (or more than 10 percent of the available atmospheric space). He’s doing the same thing with Arctic and offshore drilling; in fact, as he explained on the stump in March, “You have my word that we will keep drilling everywhere we can… That’s a commitment that I make.” The next day, in a yard full of oil pipe in Cushing, Oklahoma, the president promised to work on wind and solar energy but, at the same time, to speed up fossil-fuel development: “Producing more oil and gas here at home has been, and will continue to be, a critical part of an all-of-the-above energy strategy.” That is, he’s committed to finding even more stock to add to the 2,795-gigaton inventory of unburned carbon.

Sometimes the irony is almost Borat-scale obvious: In early June, Secretary of State Hillary Clinton traveled on a Norwegian research trawler to see firsthand the growing damage from climate change. “Many of the predictions about warming in the Arctic are being surpassed by the actual data,” she said, describing the sight as “sobering.” But the discussions she traveled to Scandinavia to have with other foreign ministers were mostly about how to make sure Western nations get their share of the estimated $9 trillion in oil (that’s more than 90 billion barrels, or 37 gigatons of carbon) that will become accessible as the Arctic ice melts. Last month, the Obama administration indicated that it would give Shell permission to start drilling in sections of the Arctic.

Almost every government with deposits of hydrocarbons straddles the same divide. Canada, for instance, is a liberal democracy renowned for its internationalism – no wonder, then, that it signed on to the Kyoto treaty, promising to cut its carbon emissions substantially by 2012. But the rising price of oil suddenly made the tar sands of Alberta economically attractive – and since, as NASA climatologist James Hansen pointed out in May, they contain as much as 240 gigatons of carbon (or almost half of the available space if we take the 565 limit seriously), that meant Canada’s commitment to Kyoto was nonsense. In December, the Canadian government withdrew from the treaty before it faced fines for failing to meet its commitments.

The same kind of hypocrisy applies across the ideological board: In his speech to the Copenhagen conference, Venezuela’s Hugo Chavez quoted Rosa Luxemburg, Jean-Jacques Rousseau and “Christ the Redeemer,” insisting that “climate change is undoubtedly the most devastating environmental problem of this century.” But the next spring, in the Simon Bolivar Hall of the state-run oil company, he signed an agreement with a consortium of international players to develop the vast Orinoco tar sands as “the most significant engine for a comprehensive development of the entire territory and Venezuelan population.” The Orinoco deposits are larger than Alberta’s – taken together, they’d fill up the whole available atmospheric space.

So: the paths we have tried to tackle global warming have so far produced only gradual, halting shifts. A rapid, transformative change would require building a movement, and movements require enemies. As John F. Kennedy put it, “The civil rights movement should thank God for Bull Connor. He’s helped it as much as Abraham Lincoln.” And enemies are what climate change has lacked.

But what all these climate numbers make painfully, usefully clear is that the planet does indeed have an enemy – one far more committed to action than governments or individuals. Given this hard math, we need to view the fossil-fuel industry in a new light. It has become a rogue industry, reckless like no other force on Earth. It is Public Enemy Number One to the survival of our planetary civilization. “Lots of companies do rotten things in the course of their business – pay terrible wages, make people work in sweatshops – and we pressure them to change those practices,” says veteran anti-corporate leader Naomi Klein, who is at work on a book about the climate crisis. “But these numbers make clear that with the fossil-fuel industry, wrecking the planet is their business model. It’s what they do.”

According to the Carbon Tracker report, if Exxon burns its current reserves, it would use up more than seven percent of the available atmospheric space between us and the risk of two degrees. BP is just behind, followed by the Russian firm Gazprom, then Chevron, ConocoPhillips and Shell, each of which would fill between three and four percent. Taken together, just these six firms, of the 200 listed in the Carbon Tracker report, would use up more than a quarter of the remaining two-degree budget. Severstal, the Russian mining giant, leads the list of coal companies, followed by firms like BHP Billiton and Peabody. The numbers are simply staggering – this industry, and this industry alone, holds the power to change the physics and chemistry of our planet, and they’re planning to use it.

They’re clearly cognizant of global warming – they employ some of the world’s best scientists, after all, and they’re bidding on all those oil leases made possible by the staggering melt of Arctic ice. And yet they relentlessly search for more hydrocarbons – in early March, Exxon CEO Rex Tillerson told Wall Street analysts that the company plans to spend $37 billion a year through 2016 (about $100 million a day) searching for yet more oil and gas.

There’s not a more reckless man on the planet than Tillerson. Late last month, on the same day the Colorado fires reached their height, he told a New York audience that global warming is real, but dismissed it as an “engineering problem” that has “engineering solutions.” Such as? “Changes to weather patterns that move crop-production areas around – we’ll adapt to that.” This in a week when Kentucky farmers were reporting that corn kernels were “aborting” in record heat, threatening a spike in global food prices. “The fear factor that people want to throw out there to say, ‘We just have to stop this,’ I do not accept,” Tillerson said. Of course not – if he did accept it, he’d have to keep his reserves in the ground. Which would cost him money. It’s not an engineering problem, in other words – it’s a greed problem.

You could argue that this is simply in the nature of these companies – that having found a profitable vein, they’re compelled to keep mining it, more like efficient automatons than people with free will. But as the Supreme Court has made clear, they are people of a sort. In fact, thanks to the size of its bankroll, the fossil-fuel industry has far more free will than the rest of us. These companies don’t simply exist in a world whose hungers they fulfill – they help create the boundaries of that world.

Left to our own devices, citizens might decide to regulate carbon and stop short of the brink; according to a recent poll, nearly two-thirds of Americans would back an international agreement that cut carbon emissions 90 percent by 2050. But we aren’t left to our own devices. The Koch brothers, for instance, have a combined wealth of $50 billion, meaning they trail only Bill Gates on the list of richest Americans. They’ve made most of their money in hydrocarbons, they know any system to regulate carbon would cut those profits, and they reportedly plan to lavish as much as $200 million on this year’s elections. In 2009, for the first time, the U.S. Chamber of Commerce surpassed both the Republican and Democratic National Committees on political spending; the following year, more than 90 percent of the Chamber’s cash went to GOP candidates, many of whom deny the existence of global warming. Not long ago, the Chamber even filed a brief with the EPA urging the agency not to regulate carbon – should the world’s scientists turn out to be right and the planet heats up, the Chamber advised, “populations can acclimatize to warmer climates via a range of behavioral, physiological and technological adaptations.” As radical goes, demanding that we change our physiology seems right up there.

Environmentalists, understandably, have been loath to make the fossil-fuel industry their enemy, respecting its political power and hoping instead to convince these giants that they should turn away from coal, oil and gas and transform themselves more broadly into “energy companies.” Sometimes that strategy appeared to be working – emphasis on appeared. Around the turn of the century, for instance, BP made a brief attempt to restyle itself as “Beyond Petroleum,” adapting a logo that looked like the sun and sticking solar panels on some of its gas stations. But its investments in alternative energy were never more than a tiny fraction of its budget for hydrocarbon exploration, and after a few years, many of those were wound down as new CEOs insisted on returning to the company’s “core business.” In December, BP finally closed its solar division. Shell shut down its solar and wind efforts in 2009. The five biggest oil companies have made more than $1 trillion in profits since the millennium – there’s simply too much money to be made on oil and gas and coal to go chasing after zephyrs and sunbeams.

Much of that profit stems from a single historical accident: Alone among businesses, the fossil-fuel industry is allowed to dump its main waste, carbon dioxide, for free. Nobody else gets that break – if you own a restaurant, you have to pay someone to cart away your trash, since piling it in the street would breed rats. But the fossil-fuel industry is different, and for sound historical reasons: Until a quarter-century ago, almost no one knew that CO2 was dangerous. But now that we understand that carbon is heating the planet and acidifying the oceans, its price becomes the central issue.

If you put a price on carbon, through a direct tax or other methods, it would enlist markets in the fight against global warming. Once Exxon has to pay for the damage its carbon is doing to the atmosphere, the price of its products would rise. Consumers would get a strong signal to use less fossil fuel – every time they stopped at the pump, they’d be reminded that you don’t need a semimilitary vehicle to go to the grocery store. The economic playing field would now be a level one for nonpolluting energy sources. And you could do it all without bankrupting citizens – a so-called “fee-and-dividend” scheme would put a hefty tax on coal and gas and oil, then simply divide up the proceeds, sending everyone in the country a check each month for their share of the added costs of carbon. By switching to cleaner energy sources, most people would actually come out ahead.
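To make the fee-and-dividend arithmetic concrete, here is a toy sketch with invented numbers: a hypothetical $50-per-ton fee, three hypothetical households, proceeds split equally. It only illustrates why households that burn less carbon than average come out ahead; it does not model any actual proposal.

```python
# Toy fee-and-dividend arithmetic (all figures are invented for illustration).
FEE_PER_TON = 50.0  # hypothetical carbon fee, dollars per ton of CO2

# Hypothetical annual household carbon footprints, in tons of CO2.
households = {"low-carbon": 8.0, "typical": 16.0, "high-carbon": 32.0}

total_fees = FEE_PER_TON * sum(households.values())
dividend = total_fees / len(households)  # equal per-household check

for name, tons in households.items():
    fee_paid = FEE_PER_TON * tons   # passed through in fuel and energy prices
    net = dividend - fee_paid       # positive: the household comes out ahead
    print(f"{name}: pays {fee_paid:.0f}, receives {dividend:.0f}, net {net:+.0f}")
```

In this made-up example two of the three households end up with more money than they paid in, which is the mechanism behind the claim that most people would come out ahead.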

There’s only one problem: Putting a price on carbon would reduce the profitability of the fossil-fuel industry. After all, the answer to the question “How high should the price of carbon be?” is “High enough to keep those carbon reserves that would take us past two degrees safely in the ground.” The higher the price on carbon, the more of those reserves would be worthless. The fight, in the end, is about whether the industry will succeed in its fight to keep its special pollution break alive past the point of climate catastrophe, or whether, in the economists’ parlance, we’ll make them internalize those externalities.

It’s not clear, of course, that the power of the fossil-fuel industry can be broken. The U.K. analysts who wrote the Carbon Tracker report and drew attention to these numbers had a relatively modest goal – they simply wanted to remind investors that climate change poses a very real risk to the stock prices of energy companies. Say something so big finally happens (a giant hurricane swamps Manhattan, a megadrought wipes out Midwest agriculture) that even the political power of the industry is inadequate to restrain legislators, who manage to regulate carbon. Suddenly those Chevron reserves would be a lot less valuable, and the stock would tank. Given that risk, the Carbon Tracker report warned investors to lessen their exposure, hedge it with some big plays in alternative energy.

“The regular process of economic evolution is that businesses are left with stranded assets all the time,” says Nick Robins, who runs HSBC’s Climate Change Centre. “Think of film cameras, or typewriters. The question is not whether this will happen. It will. Pension systems have been hit by the dot-com and credit crunch. They’ll be hit by this.” Still, it hasn’t been easy to convince investors, who have shared in the oil industry’s record profits. “The reason you get bubbles,” sighs Leaton, “is that everyone thinks they’re the best analyst – that they’ll go to the edge of the cliff and then jump back when everyone else goes over.”

So pure self-interest probably won’t spark a transformative challenge to fossil fuel. But moral outrage just might – and that’s the real meaning of this new math. It could, plausibly, give rise to a real movement.

Once, in recent corporate history, anger forced an industry to make basic changes. That was the campaign in the 1980s demanding divestment from companies doing business in South Africa. It rose first on college campuses and then spread to municipal and state governments; 155 campuses eventually divested, and by the end of the decade, more than 80 cities, 25 states and 19 counties had taken some form of binding economic action against companies connected to the apartheid regime. “The end of apartheid stands as one of the crowning accomplishments of the past century,” as Archbishop Desmond Tutu put it, “but we would not have succeeded without the help of international pressure,” especially from “the divestment movement of the 1980s.”

The fossil-fuel industry is obviously a tougher opponent, and even if you could force the hand of particular companies, you’d still have to figure out a strategy for dealing with all the sovereign nations that, in effect, act as fossil-fuel companies. But the link for college students is even more obvious in this case. If their college’s endowment portfolio has fossil-fuel stock, then their educations are being subsidized by investments that guarantee they won’t have much of a planet on which to make use of their degree. (The same logic applies to the world’s largest investors, pension funds, which are also theoretically interested in the future – that’s when their members will “enjoy their retirement.”) “Given the severity of the climate crisis, a comparable demand that our institutions dump stock from companies that are destroying the planet would not only be appropriate but effective,” says Bob Massie, a former anti-apartheid activist who helped found the Investor Network on Climate Risk. “The message is simple: We have had enough. We must sever the ties with those who profit from climate change – now.”

Movements rarely have predictable outcomes. But any campaign that weakens the fossil-fuel industry’s political standing clearly increases the chances of retiring its special breaks. Consider President Obama’s signal achievement in the climate fight, the large increase he won in mileage requirements for cars. Scientists, environmentalists and engineers had advocated such policies for decades, but until Detroit came under severe financial pressure, it was politically powerful enough to fend them off. If people come to understand the cold, mathematical truth – that the fossil-fuel industry is systematically undermining the planet’s physical systems – it might weaken it enough to matter politically. Exxon and their ilk might drop their opposition to a fee-and-dividend solution; they might even decide to become true energy companies, this time for real.

Even if such a campaign is possible, however, we may have waited too long to start it. To make a real difference – to keep us under a temperature increase of two degrees – you’d need to change carbon pricing in Washington, and then use that victory to leverage similar shifts around the world. At this point, what happens in the U.S. is most important for how it will influence China and India, where emissions are growing fastest. (In early June, researchers concluded that China has probably under-reported its emissions by up to 20 percent.) The three numbers I’ve described are daunting – they may define an essentially impossible future. But at least they provide intellectual clarity about the greatest challenge humans have ever faced. We know how much we can burn, and we know who’s planning to burn more. Climate change operates on a geological scale and time frame, but it’s not an impersonal force of nature; the more carefully you do the math, the more thoroughly you realize that this is, at bottom, a moral issue; we have met the enemy and they is Shell.

Meanwhile the tide of numbers continues. The week after the Rio conference limped to its conclusion, Arctic sea ice hit the lowest level ever recorded for that date. Last month, on a single weekend, Tropical Storm Debby dumped more than 20 inches of rain on Florida – the earliest the season’s fourth-named cyclone has ever arrived. At the same time, the largest fire in New Mexico history burned on, and the most destructive fire in Colorado’s annals claimed 346 homes in Colorado Springs – breaking a record set the week before in Fort Collins. This month, scientists issued a new study concluding that global warming has dramatically increased the likelihood of severe heat and drought – days after a heat wave across the Plains and Midwest broke records that had stood since the Dust Bowl, threatening this year’s harvest. You want a big number? In the course of this month, a quadrillion kernels of corn need to pollinate across the grain belt, something they can’t do if temperatures remain off the charts. Just like us, our crops are adapted to the Holocene, the 11,000-year period of climatic stability we’re now leaving… in the dust.

This story is from the August 2nd, 2012 issue of Rolling Stone.

Computer program mimics human evolution (Fapesp)

Software developed at USP in São Carlos creates and selects programs that generate decision trees, tools capable of making predictions. The research won awards in the United States at the largest evolutionary-computation conference (Wikimedia)

16/08/2012

By Karina Toledo

Agência FAPESP – Decision trees are computational tools that give machines the ability to make predictions based on the analysis of historical data. The technique can, for example, support medical diagnosis or the risk analysis of financial investments.

But the best prediction requires the best decision-tree-generating program. To reach that goal, researchers at the Instituto de Ciências Matemáticas e de Computação (ICMC) of the Universidade de São Paulo (USP), in São Carlos, drew inspiration from Charles Darwin’s theory of evolution.

“We developed an evolutionary algorithm, that is, one that mimics the process of human evolution to generate solutions,” said Rodrigo Coelho Barros, a doctoral student at the ICMC’s Laboratório de Computação Bioinspirada (BioCom) and a FAPESP fellow.

Evolutionary computation, Barros explained, is one of several bio-inspired techniques, that is, techniques that look to nature for solutions to computational problems. “It is remarkable how nature finds solutions to extremely complicated problems. There is no doubt that we need to learn from it,” Barros said.

According to Barros, the software developed in his doctoral research can automatically create decision-tree-generating programs. To do so, it performs random crossovers between the code of existing programs, producing “children.”

“These ‘children’ may occasionally undergo mutations and evolve. After a while, the evolved decision-tree-generating programs are expected to get better and better, and our algorithm selects the best of them all,” Barros said.

But while natural selection in the human species takes hundreds or even thousands of years, in computing the process lasts only a few hours, depending on the problem to be solved. “We set one hundred generations as the limit of the evolutionary process,” Barros said.
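For readers who want to see the shape of the idea, here is a minimal, hypothetical sketch of an evolutionary loop of this kind: a population of candidate decision-tree-induction “recipes” is crossed over, mutated and selected for one hundred generations. The design choices and the surrogate fitness function below are illustrative assumptions only; they are not the actual system described in the paper, which evaluates candidates by the accuracy of the trees they induce on real datasets.

```python
# Minimal sketch of a hyper-heuristic evolutionary loop (illustrative only).
import random

# Each candidate algorithm is a set of design choices ("genes").
CHOICES = {
    "split_criterion": ["gini", "entropy", "gain_ratio"],
    "max_depth": [3, 5, 10, 20],
    "min_samples_leaf": [1, 5, 10],
    "pruning": ["none", "reduced_error", "pessimistic"],
}

def random_individual():
    return {gene: random.choice(opts) for gene, opts in CHOICES.items()}

def crossover(parent_a, parent_b):
    # Uniform crossover: each gene comes from one of the two parents.
    return {g: random.choice([parent_a[g], parent_b[g]]) for g in CHOICES}

def mutate(ind, rate=0.1):
    # Occasionally replace a gene with a random alternative.
    return {g: (random.choice(CHOICES[g]) if random.random() < rate else v)
            for g, v in ind.items()}

def fitness(ind):
    # Placeholder score. In the real system this would be the cross-validated
    # accuracy of the decision trees the candidate algorithm induces on data;
    # here we score an arbitrary preference so the loop runs end to end.
    score = {"gini": 0.2, "entropy": 0.3, "gain_ratio": 0.4}[ind["split_criterion"]]
    score += 0.01 * ind["max_depth"]
    score -= 0.005 * ind["min_samples_leaf"]
    score += 0.1 if ind["pruning"] != "none" else 0.0
    return score

def evolve(pop_size=50, generations=100):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):          # "one hundred generations" as the limit
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)   # the best evolved "recipe"

if __name__ == "__main__":
    print(evolve())
```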

Artificial intelligence

In computer science, a heuristic is the capacity of a system to innovate and develop techniques to reach a given goal.

The software developed by Barros falls within the area of hyper-heuristics, a recent topic in evolutionary computation whose goal is the automatic generation of heuristics tailored to a given application or set of applications.

“It is a preliminary step toward the great goal of artificial intelligence: creating machines capable of developing solutions to problems without being explicitly programmed to do so,” Barros explained.

The work gave rise to the paper “A Hyper-Heuristic Evolutionary Algorithm for Automatically Designing Decision-Tree Algorithms,” which won awards in three categories at the Genetic and Evolutionary Computation Conference (GECCO), the world’s largest evolutionary-computation event, held in July in Philadelphia, United States.

In addition to Barros, the paper’s authors are André Carlos Ponce de Leon Ferreira de Carvalho, the research advisor at the ICMC, Márcio Porto Basgalupp of the Universidade Federal de São Paulo (Unifesp), and Alex Freitas of the University of Kent, in the United Kingdom, who took on co-supervision of the research.

The authors were invited to submit the article to the Evolutionary Computation Journal, published by the Massachusetts Institute of Technology (MIT). “The work will still go through review, but since it was submitted by invitation it has a good chance of being accepted,” Barros said.

The research, which is expected to be completed only in 2013, also gave rise to an article published by invitation in the Journal of the Brazilian Computer Society after being chosen as the best paper at the 2011 Encontro Nacional de Inteligência Artificial.

Another article, presented at the 11th International Conference on Intelligent Systems Design and Applications, held in Spain in 2011, led to an invitation to publish in the journal Neurocomputing.

14 Wacky “Facts” Kids Will Learn in Louisiana’s Voucher Schools (Mother Jones)

—By Deanna Pan | Tue Aug. 7, 2012 3:00 AM PDT

Separation of church and what? Currier & Ives/Library of Congress

Thanks to a new law privatizing public education in Louisiana, Bible-based curriculum can now indoctrinate young, pliant minds with the good news of the Lord—all on the state taxpayers’ dime.

Under Gov. Bobby Jindal’s voucher program, considered the most sweeping in the country, Louisiana is poised to spend tens of millions of dollars to help poor and middle-class students from the state’s notoriously terrible public schools receive a private education. While the governor’s plan sounds great in the glittery parlance of the state’s PR machine, the program is rife with accountability problems that actually haven’t been solved by the new standards the Louisiana Department of Education adopted two weeks ago.

For one, of the 119 (mostly Christian) participating schools, Zack Kopplin, a gutsy college sophomore who’s taken to Change.org to stonewall the program, has identified at least 19 that teach or champion creationist nonscience and will rake in nearly $4 million in public funding from the initial round of voucher designations.

Many of these schools, Kopplin notes, rely on Pensacola-based A Beka Book curriculum or Bob Jones University Press textbooks to teach their pupils Bible-based “facts,” such as the existence of Nessie the Loch Ness Monster and all sorts of pseudoscience that researcher Rachel Tabachnick and writer Thomas Vinciguerra have thankfully pored over so the rest of the world doesn’t have to.

Here are some of my favorite lessons:

1. Dinosaurs and humans probably hung out: “Bible-believing Christians cannot accept any evolutionary interpretation. Dinosaurs and humans were definitely on the earth at the same time and may have even lived side by side within the past few thousand years.”—Life Science, 3rd ed., Bob Jones University Press, 2007

Much like tough cop Katie Coltrane and Teddy the T-rex in the direct-to-video hit Theodore Rex. Screenshot: YouTube

2. Dragons were totally real: “[Is] it possible that a fire-breathing animal really existed? Today some scientists are saying yes. They have found large chambers in certain dinosaur skulls…The large skull chambers could have contained special chemical-producing glands. When the animal forced the chemicals out of its mouth or nose, these substances may have combined and produced fire and smoke.”—Life Science, 3rd ed., Bob Jones University Press, 2007

3. “God used the Trail of Tears to bring many Indians to Christ.”—America: Land That I Love, Teacher ed., A Beka Book, 1994

4. Africa needs religion: “Africa is a continent with many needs. It is still in need of the gospel…Only about ten percent of Africans can read and write. In some areas the mission schools have been shut down by Communists who have taken over the government.”—Old World History and Geography in Christian Perspective, 3rd ed., A Beka Book, 2004

The literacy rate in Africa is “only about 10 percent”…give or take a few dozen percentage points. residentevil_stars2001/Flickr

5. Slave masters were nice guys: “A few slave holders were undeniably cruel. Examples of slaves beaten to death were not common, neither were they unknown. The majority of slave holders treated their slaves well.”—United States History for Christian Schools, 2nd ed., Bob Jones University Press, 1991

Doesn’t everyone look happy?! Edward Williams Clay/Library of Congress

6. The KKK was A-OK: “[The Ku Klux] Klan in some areas of the country tried to be a means of reform, fighting the decline in morality and using the symbol of the cross. Klan targets were bootleggers, wife-beaters, and immoral movies. In some communities it achieved a certain respectability as it worked with politicians.”—United States History for Christian Schools, 3rd ed., Bob Jones University Press, 2001

Just your friendly neighborhood Imperial Wizard. Unknown/Library of Congress

7. The Great Depression wasn’t as bad as the liberals made it sound: “Perhaps the best known work of propaganda to come from the Depression was John Steinbeck’s The Grapes of Wrath…Other forms of propaganda included rumors of mortgage foreclosures, mass evictions, and hunger riots and exaggerated statistics representing the number of unemployed and homeless people in America.”—United States History: Heritage of Freedom, 2nd ed., A Beka Book, 1996

Definitely Photoshopped. U.S. National Archives and Records Administration/Wikipedia

8. SCOTUS enslaved fetuses: “Ignoring 3,500 years of Judeo-Christian civilization, religion, morality, and law, the Burger Court held that an unborn child was not a living person but rather the “property” of the mother (much like slaves were considered property in the 1857 case of Dred Scott v. Sandford).”—American Government in Christian Perspective, 2nd ed., A Beka Book, 1997

9. The Red Scare isn’t over yet: “It is no wonder that Satan hates the family and has hurled his venom against it in the form of Communism.”— American Government in Christian Perspective, 2nd ed., A Beka Book, 1997

Meanwhile, God sneezes glitter snot in the form of Capitalism. Catechetical Guild/Wikipedia

10. Mark Twain and Emily Dickinson were a couple of hacks: “[Mark] Twain’s outlook was both self-centered and ultimately hopeless…Twain’s skepticism was clearly not the honest questioning of a seeker of truth but the deliberate defiance of a confessed rebel.”—Elements of Literature for Christian Schools, Bob Jones University, 2001

“Several of [Emily Dickinson’s] poems show a presumptuous attitude concerning her eternal destiny and a veiled disrespect for authority in general. Throughout her life she viewed salvation as a gamble, not a certainty. Although she did view the Bible as a source of poetic inspiration, she never accepted it as an inerrant guide to life.”—Elements of Literature for Christian Schools, Bob Jones University, 2001

To say nothing of her poetry’s Syntax and Punctuation—how odious it is. Todd-Bingham picture collection, 1837-1966 (inclusive)/Manuscripts & Archives, Yale University

11. Abstract algebra is too dang complicated: “Unlike the ‘modern math’ theorists, who believe that mathematics is a creation of man and thus arbitrary and relative, A Beka Book teaches that the laws of mathematics are a creation of God and thus absolute…A Beka Book provides attractive, legible, and workable traditional mathematics texts that are not burdened with modern theories such as set theory.”—ABeka.com

MATHS: Y U SO HARD? Screenshot: MittRomney.com

12. Gay people “have no more claims to special rights than child molesters or rapists.”—Teacher’s Resource Guide to Current Events for Christian Schools, 1998-1999, Bob Jones University Press, 1998

13. “Global environmentalists have said and written enough to leave no doubt that their goal is to destroy the prosperous economies of the world’s richest nations.”—Economics: Work and Prosperity in Christian Perspective, 2nd ed., A Beka Book, 1999

Plotting economic apocalypse, BRB. Lynn Freeny, Department of Energy/Flickr

14. Globalization is a precursor to rapture: “But instead of this world unification ushering in an age of prosperity and peace, as most globalists believe it will, it will be a time of unimaginable human suffering as recorded in God’s Word. The Anti-christ will tightly regulate who may buy and sell.”—Economics: Work and Prosperity in Christian Perspective, 2nd ed., A Beka Book, 1999

Swapping insider-trading secrets is the devil’s favorite pastime. Luca Signorelli/Wikipedia

Whew! Seems extreme. But perhaps we shouldn’t be too surprised. Gov. Jindal, you remember, once tried to perform an exorcism on a college gal pal.

For this anthropologist, the idea of the “self” must give way to that of the network (Valor)

By Carla Rodrigues | For Valor, from Rio

August 7, 2012

Awarded for his actor-network theory, the French thinker Bruno Latour discusses the relationship between humans and non-humans (Divulgação)

He defines himself as a philosophical anthropologist working on sociology. In practice, the Frenchman Bruno Latour, 65, does what he calls an “anthropology of modernity,” turning his gaze to the discourses and practices of that period, above all the scientific ones.

That research produced one of his most famous books, “Jamais Fomos Modernos – Ensaios de Antropologia Simétrica” (“We Have Never Been Modern”), released in Brazil in 1994 (Editora 34).

Latour, who is in Brazil for the third time, gives a free talk on Thursday in São Paulo, at Fronteiras do Pensamento, and has just taken part in the international symposium “A Vida Secreta dos Objetos: Novos Cenários da Comunicação” (“The Secret Life of Objects: New Scenarios of Communication”), held in São Paulo, Rio and Salvador, which ended yesterday.

In his view, it is here that the environmental debate of the 21st century will be fought. Now committed to the ecological cause, Latour is known, and has been awarded, for his actor-network theory, a way of thinking about the relationship between humans and non-humans.

Scientific director of research at the Institut d’Études Politiques de Paris, and part of a generation of French thinkers educated in the postwar years, Latour is often accused of being a relativist, a criticism he rebuts with ease. “I do not know a single actor involved in science who is not a relativist,” he says.

Valor: Do you believe Brazil occupies a special place on the world stage at this moment, when Europe is going through a crisis?

Bruno Latour: Brazil has been part of my life since childhood, because I had three sisters who lived in the country, for different reasons. I believe the ecological question of the 21st century will be decided here. There are things that can be improved in Europe from an environmental point of view, but the real stage for this game will be Brazil, because it is already too late for Asia and Africa. The question is whether Brazilian intellectuals and politicians will be able to go beyond the foundations of modernity. But the great ecological question will unfold here.

Valor: Your actor-network theory refers to human and non-human beings. Is it a critique of humanism? What has the humanist legacy given us that deserves such criticism?

Latour: Humanism is a limited way of thinking about the group of humans, whom I see as dependent on many other beings that are not human. A definition that isolates the human from the beings that fabricate it – religious divinities as much as the things humans live with, such as trees, but also the aluminum used to make this cutlery – is a narrow view. The humanist perspective was legitimate in a certain era, if we are speaking of the humanism that ran from the mid-19th century to the mid-20th, before ecologists drew our attention to the environmental problem. But today it no longer makes any sense to speak of humanism. This kind of humanism does not have the elements needed to absorb today’s great political questions. You cannot, for example, build a theory conscious of the climate problem out of Kant’s moral thought. We need to think about the composition in which human and non-human beings relate to one another. Humanism is an outdated version of the political problems that concern us. Today the point is to be fully humanist, that is, to include all the beings that are necessary for human existence.

Valor: One of the postulates of actor-network theory is that when a person acts, someone else is acting along with them. Could you explain how that works?

Latour: Humans are surrounded by many other beings, and the idea that a person acts autonomously, with his or her own goals, does not work in economics, in religion, in psychology or in any other situation. So the question actor-network theory asks is: which other beings are active at the moment someone acts? The anthropology and sociology I try to develop is devoted to investigating those beings. I can put the question the other way around: how, despite the evidence of all the many beings that take part in an action, do we go on thinking as if the only actor were the human being endowed with a psychology, self-aware, calculating, autonomous, responsible? Anthropology in Brazil is particularly well placed to understand that there is no such “self,” no individual, autonomous subject acting in the world, which is a very narrow view. I am in close contact with Brazilian anthropologists such as Eduardo Viveiros de Castro (UFRJ).

Valor: You came to Brazil to take part in a symposium on new communication technologies. What is the great affinity between your actor-network theory and theories of communication?

Latour: They are close because actor-network theory is essentially a theory of the multiplicity of mediations, and these researchers are interested in discussing the domain of media and mediations. Those who take an interest in mediation – in a positive way, not negatively – find in actor-network theory concepts and methods to work with.

Valor: Why are journalists always mentioned among the important participants in actor-network theory?

Latour: The formatting of information plays a very important role in the public space, where the political space is located. I do not know many studies of journalism done from the standpoint of actor-network theory, because such research is usually done from a critical standpoint, and actor-network theory is not a critique. Very often journalists are simply accused of distorting an ideal of truth that, if there were no mediation, would reach the public through a transparent and direct transmission. Scientists, politicians and economists like to say that if there were no journalists, information would be more transparent, more direct, less compromised.

Valor: A teoria ator-rede se transformou em muitas outras coisas – cada um dos pesquisadores do grupo original seguiu por um lado, e houve uma diáspora. O senhor ainda se reconhece como um teórico da ator-rede?

Latour: O grupo original nunca foi muito unido, mas se reuniu em um momento em que a sociologia percebeu que havia negligenciado a técnica, a ciência, e os seres não-humanos. Foi uma tomada de consciência das ciências sociais de que o século XX nos legou uma série de questões – como a da dominação e a da exploração -, mas sempre com uma visão sociocentrada. A teoria ator-rede vem a ser a evidência de que é preciso se interessar pela vida secreta dos objetos.

Valor: Let me put to you again a question that appears in the book "A Esperança de Pandora" (Pandora's Hope, Edusc): where does the opposition between the camp of reason and the camp of force come from?

Latour: I traced a genealogy of that opposition, which goes back to the false dispute between the sophists and the philosophers and has organised debate in Western countries ever since. I set out to suspend that separation and to ask what the force of rational devices is. That is how I began my anthropology of science. And there is a second question: what are the reasons behind political, religious and economic relations of force? The distinction between force and reason belongs to a set of old dichotomies that are no longer able to orient us when we speak about the scientific question. In that dichotomy, reason is supposed to unify the discussion. But if reason ever had that power, it no longer does, and we need to find other intellectual tools to orient ourselves in this dispute. That is what I call the cartography of controversies. This is now a major question for democracy.

Valor: Is saying that science is social a way of relativising scientific results?

Latour: That is a misunderstanding of what the word social means. Obviously, to say that facts are social is not the same as saying that this fork is a social fabrication – that would make no sense. What I say is that this fork is the result of an industrial process involving legislation, companies, industries, which is something entirely different. Science is part of a collective – I am deliberately avoiding the word social – of the world. There are those who believe that science, particularly the natural sciences, is absolute. But they are the religious devotees of science, not its practitioners. I do not know a single working scientist who is not a relativist or, better put, a relationist, because they know that to know is to establish relations within a frame of reference. The absolutists' criticism of relativists is frequent, but it is not a productive discussion. The discussion that interests me is: how do we establish relations between frames of reference, cultures, modes of existence, forms of life? From that point of view, I know of no one who criticises relativism.

Valor: Your book "Jamais Fomos Modernos" (We Have Never Been Modern) can be summed up as a critique of modernity. Do you maintain the same criticisms with regard to the postmoderns?

Latour: Yes. The postmoderns had the sensitivity to perceive that there was something complicated about modernity, but it is the same movement. There is simply a return to some of the problems modernity had not dealt with, not a return to the roots of modernity.

Carla Rodrigues, a professor at the Universidade Federal Fluminense (UFF) and the Pontifícia Universidade Católica do Rio (PUC-Rio), holds a doctorate in philosophy and is a CNPq researcher.

© 2000 – 2012. All rights reserved to Valor Econômico S.A. See our Terms of Use at http://www.valor.com.br/termos-de-uso. This material may not be published, rewritten, redistributed or broadcast without authorisation from Valor Econômico.

Post Normal Science: Deadlines (Climate Etc.)

Posted on August 3, 2012

by Steven Mosher

Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.

Ravetz describes a post normal situation by the following criteria:

  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required

The difference between Kuhnian normal science, or the behavior of those doing science under normal conditions, and post normal science is best illustrated by example. We can use the recent discovery of the Higgs boson as an example. Facts were uncertain–they always are to a degree; no values were in conflict; the stakes were not high; and immediate action was not required. What we see in that situation is those doing science acting as we expect them to, according to our vague ideal of science. Because facts are uncertain, they listen to various conflicting theories. They try to put those theories to a test. They face a shared uncertainty and in good faith accept the questions and doubts of others interested in the same field. Their participation in politics is limited to asking for money. Because values are not in conflict, no theorist takes the time to investigate his opponent’s views on evolution or smoking or taxation. Because the field of personal values is never in play, personal attacks are minimized. Personal pride may be at stake, but values rarely are. The stakes for humanity in the discovery of the Higgs are low: at least no one argues that our future depends upon the outcome. No scientist straps himself to the collider and demands that it be shut down. And finally, immediate action is not required; under no theory is the settling of the uncertainty so important as to rush the result. In normal science, according to Kuhn, we can view the behavior of those doing science as puzzle solving. The details of a paradigm are filled out slowly and deliberately.

The situation in climate science is close to the polar opposite of this. That does not mean, and should not be construed as, a criticism of climate science or its claims. The simple point is this: in a PNS situation, the behavior of those doing science changes. To be sure, much of their behavior remains the same. They formulate theories; they collect data; and they test their theories against the data. They don’t stop doing what we notionally describe as science. But, as foreshadowed above in the description of how high-energy particle physicists behave, one can see how that behavior changes in a PNS situation. There is uncertainty, but the good faith that exists in normal science, the faith that other people are asking questions because they actually want the answer, is gone. Asking questions, raising doubts, asking to see proof becomes suspect in and of itself. And those doing science are faced with a question that science cannot answer: does this person really want the answer, or are they a merchant of doubt? Such a question never gets asked in normal science. Normal science doesn’t ask this question because science cannot answer it.

Because values are in conflict, the behavior of those doing science changes. In normal science no one would care if Higgs were a Christian or an atheist. No one would care if he voted liberal or conservative; but because two different value systems are in conflict in climate science, the behavior of those doing science changes. They investigate each other. They question motives. They form tribes. And because the stakes are high, the behavior of those doing science changes as well. They protest; they take money from lobby groups on both sides; and, worst of all, they perform horrendous raps on YouTube. In short, they become human, while those around them canonize or demonize them, and their findings become iconized or branded as hoaxes.

This brings us to the last aspect of a PNS situation: immediate action is required. This is perhaps the most contentious aspect of PNS; in fact, I would argue it is the defining characteristic. In PNS situations it is almost always the case that one side sees the need for action, given the truth of their theory, while the doubters must of necessity see no need for immediate action. They must see no need for immediate action because their values are at risk and because the stakes are high. Another way to put this is as follows: when you are in a PNS situation, all sides must deny it. Those demanding immediate action deny it by claiming more certainty* than is present; those refusing immediate action do so by increasing their demands for certainty. This leads to a centralization and valorization of the topic of uncertainty, and epistemology becomes a topic of discussion for those doing science. That is decidedly not normal science.

The demand for immediate action, however, is broader than simply a demand that society change. In a PNS situation the behavior of those doing science changes. One of the clearest signs that you are in PNS is the change in behavior around deadlines. Normal science has no deadline. In normal science, the puzzle is solved when it is solved. In normal science there may be a deadline to shut down the collider for maintenance, but nobody rushes the report to keep the collider running longer than it should, and if a good result is found, the schedules can be changed to accommodate the science. Broadly speaking, science drives the schedule; the schedule doesn’t drive the science.

The climategate mails are instructive here. As one reads through the mails it’s clear that the behavior of those doing science is not what one would call disinterested, patient puzzle solving. Human beings acting in a situation where values are in conflict and stakes are high will engage in behavior that they might not otherwise. Those changes are most evident in situations surrounding deadlines. The point here is not to rehash The Crutape Letters but rather to look again at one incident (there are others, notably around congressional hearings) where deadlines came into play. The deadline in question was the deadline for submitting papers for consideration. As covered in The Crutape Letters and in The Hockey Stick Illusion, the actions taken by those doing science around the “Jesus paper” are instructive. In fact, were I to rewrite The Crutape Letters I would do it from the perspective of PNS, focusing on how the behavior of those doing science deviated from the ideals of openness, transparency and letting truth come in its own good time.

Climategate is about FOIA. There were two critical paths for FOIA: one sought data, the other sought the emails of scientists. Not quite normal. Not normal in that data is usually shared; not normal in that we normally respect the privacy of those doing science. But this is PNS, and all bets are off. Values and practices from other fields, such as business and government, are imported into the culture of science: data hoarding is defended using IP and confidentiality agreements; demanding private mail is defended using values imported from the conduct of public business. In short, one sign that a science is post normal is the attempt to import values and procedures from related disciplines. Put another way, PNS poses the question of governance: who runs science, and how should they run it?

The “Jesus paper” in a nutshell can be explained as follows. McIntyre and McKitrick had a paper published at the beginning of 2005. That paper needed to be rebutted in order to make Briffa’s job of writing chapter 6 easier. However, there was a deadline in play: papers had to be accepted by a date certain. At one point Stephen Schneider suggested the creation of a new category, a novelty, “provisionally accepted”, so that the “Jesus paper” could make the deadline. McIntyre covers the issue here. One need not re-adjudicate whether or not the IPCC rules were broken. And further, these rules have nothing whatsoever to do with the truth of the claims in that paper. This is not about the truth of the science. What is important is the importation of the concept of a deadline into the search for truth. What is important is that the behavior of those doing science changes. Truth suddenly cares about a date. Immediate action is required. In this case immediate action is taken to see to it that the paper makes it into the chapter. Normal science takes no notice of deadlines. In PNS, deadlines matter.

Last week we saw another example of deadlines and high stakes changing the behavior of those doing science. The backstory here explains it. It appears to me that the behavior of those involved changed from what I have known it to be. It changed because they perceived that immediate action was required: a deadline had to be met. Again, as with the Jesus paper, the facts surrounding the release do not go to the truth of the claims. In normal science, a rushed claim might very well get the same treatment as an unrushed one: it will be evaluated on its merits. In PNS, either the rush to meet an IPCC deadline, as in the case of the Jesus paper, or the rush to be ready for Congress, as in the Watts case, is enough for some to doubt the science. What has been testified to in Congress by Christy, a co-author, may very well be true. But in this high-stakes arena, where facts are uncertain and values are in conflict, the behavior of those doing science can and does change. Not all their behavior changes. They still observe and test and report. But the manner in which they do that changes. Results are rushed and data is held in secret. Deadlines change everything. Normal science doesn’t operate this way; when it does, quality can suffer. And yet the demand for more certainty than is needed, and the bad-faith game of delaying action by asking questions, preclude a naïve return to science without deadlines.

The solution that Ravetz suggests is extended peer review and a recognition of the importance of quality. In truth, the way out of a PNS situation is not that simple. The first step out of a PNS situation is recognizing that one is in the situation to begin with. Today, few people embroiled in this debate would admit that the situation has changed how they would normally behave. An admission that this isn’t working is a cultural crisis for science. No one has the standing to describe how one should conduct science in a PNS situation. No one has the standing to chart the path out of a PNS situation. The best we can do is describe what we see. Today, I observe that deadlines change the behavior of those doing science. We see that in climategate; we see that in the events of the past week. That doesn’t entail anything about the truth of science performed under pressure. But it should make us pause and consider whether truth will be found any faster by rushing the results and hiding the data.

*I circulated a copy of this to Michael Tobis to get his reaction. MT took issue with this characterization. MT, I believe, originated the argument that our uncertainty is a reason for action. It is true that while certainty about the science has been the dominant piece of the rhetoric, there has been a second thread of rhetoric that bases action in the uncertainty about sensitivity. I would call this certainty shifting. While the uncertainty about the facts of sensitivity is accepted in this line of argument, the certainty is shifted to certainty about values and certainty about impacts. In short, the argument becomes that while we are uncertain about sensitivity, the certainty we have about large impacts and trans-generational obligations necessitates action.

Bill would ban the use of animals in research that causes suffering (Agência Câmara)

JC e-mail 4551, 31 July 2012.

The Chamber of Deputies is considering Bill 2905/11, by Deputy Roberto De Lucena (PV-SP), which would prohibit the use of animals in research whenever they are subjected to any kind of physical or psychological suffering.

The ban would apply to studies related to the production of cosmetics, perfumes, personal hygiene products, household cleaning products, laundry products, office supplies and sunscreens, as well as vitamins and supplements.

Currently the Environmental Crimes Law (9,605/98), which sets penalties for activities that harm the environment, criminalises only painful or cruel experiments on live animals, even for teaching or scientific purposes, when alternative methods exist.

Under the bill, anyone who fails to comply would be subject to the penalties laid down in the environmental crimes law. A person who causes animal suffering during research could face three months to one year in prison, plus a fine.

Universal Declaration – The bill's author notes that the Universal Declaration of Animal Rights, proclaimed by the United Nations Educational, Scientific and Cultural Organization (Unesco) in 1978, holds that experiments causing physical or psychological suffering violate animal rights and that alternative methods should be developed and systematically implemented.

"Ideally we would have techniques to replace the use of animals in all teaching and research. The cure for many diseases depends on medical research that uses animals and cannot yet be carried out by alternative methods. But what about research related, for example, to the production of cosmetics? Cosmetics are not essential to human life and health. In that case there is no justification whatsoever for tolerating the suffering of thousands of animals," the congressman said.

Progress of the bill – The proposal is proceeding jointly with Bill 4548/98 and eight other proposals, which are ready to be voted on by the full House.

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.
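
To make the curve-fitting idea concrete, here is a minimal sketch in Python of comparing candidate drivers against an observed land-temperature series. It illustrates the general approach only and is not the Berkeley Earth code; the file names, column names and the logarithmic form of the CO2 fit are assumptions for illustration.

```python
# Minimal sketch: compare how well different candidate drivers (CO2 vs. a
# simple polynomial trend) track an observed land-temperature series.
# The CSV files and their columns are hypothetical.
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

temps = pd.read_csv("land_temperature_annual.csv")  # columns: year, anomaly_c
co2 = pd.read_csv("co2_annual.csv")                 # columns: year, ppm
data = temps.merge(co2, on="year")

def co2_model(ppm, a, b):
    # Greenhouse forcing grows roughly with the logarithm of CO2 concentration;
    # 278 ppm is used here as a nominal pre-industrial baseline.
    return a + b * np.log(ppm / 278.0)

def poly_model(year, a, b, c):
    t = year - data["year"].min()
    return a + b * t + c * t ** 2

def rss(observed, predicted):
    # Residual sum of squares: smaller means a closer match.
    return float(np.sum((observed - predicted) ** 2))

p_co2, _ = curve_fit(co2_model, data["ppm"], data["anomaly_c"])
p_poly, _ = curve_fit(poly_model, data["year"], data["anomaly_c"])

print("RSS, CO2 fit       :", rss(data["anomaly_c"], co2_model(data["ppm"], *p_co2)))
print("RSS, polynomial fit:", rss(data["anomaly_c"], poly_model(data["year"], *p_poly)))
```

In this spirit, "the best match" simply means the candidate curve with the smallest residuals once its free parameters have been fitted.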

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.
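
A back-of-the-envelope sketch of why faster emissions growth compresses the timetable: assume, purely for illustration, that warming scales roughly with cumulative emissions (an assumption of this sketch, not a claim from the op-ed). The same cumulative total is then reached much sooner when annual emissions grow; the growth rates below are hypothetical round numbers.

```python
# Toy calculation: how many years does it take to emit a fixed cumulative
# amount if annual emissions grow, versus staying constant?
# All numbers are illustrative, not data from the op-ed.
def years_to_reach(cumulative_target, first_year_emissions, growth_rate):
    """Count the years until cumulative emissions reach the target."""
    total, emissions, years = 0.0, first_year_emissions, 0
    while total < cumulative_target:
        total += emissions
        emissions *= 1.0 + growth_rate
        years += 1
    return years

budget = 50 * 1.0  # 50 years' worth of emissions at a constant 1 unit per year
print(years_to_reach(budget, 1.0, 0.00))  # 50 years with no growth
print(years_to_reach(budget, 1.0, 0.05))  # about 26 years at 5% annual growth
print(years_to_reach(budget, 1.0, 0.10))  # about 19 years at 10% annual growth
```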

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.


The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis is now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Science and culture: what do they have in common? (Jornal da Ciência)

JC e-mail 4549, 27 July 2012.

The question was the theme of the round table "Divulgação da Ciência e da Cultura" (Communicating Science and Culture), held at the 64th Annual Meeting of the Brazilian Society for the Advancement of Science (SBPC), which ends today (27th) in São Luís.

For Ildeu de Castro Moreira, director of Popularisation and Diffusion of Science and Technology at the Ministry of Science, Technology and Innovation (MCTI) and an SBPC council member, the debate on the relationship between science and art matters because they are two fundamental facets of human culture. "What science, art and culture have in common is the creativity inherent in the human being," he said, explaining that art and science are human, social activities grounded in creativity and curiosity.

A physicist and science populariser, Ildeu spoke about the "scientific imaginary present in the minds of artists" and explained that science also has aesthetic concerns and shares much with art. For him, there is beauty in scientific theories. "Mathematical equations and physical formulas are beautiful. They may seem dull in the classroom, but with the help of an artist's eye it is possible to show that beauty. We have to learn to see the beauty of science, just as we have to learn how to look at much of contemporary art," he said.

For Ildeu, the connections between science and art matter because they help science communication reach the public more easily. In his talk he showed artistic works that speak of science, with examples from poetry, music, samba-school themes, popular sayings and cordel literature.

Children – In her presentation at the round table, Luisa Medeiros Massarani, a journalist and head of the Museu da Vida at Fiocruz in Rio de Janeiro, spoke about science communication initiatives aimed at children. "Experience has shown great receptiveness among children, greater than among adults and adolescents, mainly because of children's curiosity; they are regarded as 'natural scientists'," she explained.

Luisa spoke about the growth of science museums in Brazil, of which there are now around 200, although they are still concentrated in a few regions. "Museums have an incredible appeal for children, and they are also important for the communicator, who can see the child's reaction on the spot," she said. Although children make up a large share of museum audiences, Luisa said it is necessary to design spaces specifically for them, from smaller furniture to age-appropriate interactive activities.

She argued that children should be treated as important social actors in the science communication process. "Communicating science to children is not about talking at them unilaterally; the child has to be an important actor and protagonist in the process," she explained, adding that the experience of a science fair or a museum visit stays in a child's memory and can shape their education, as well as provoke and awaken an interest in science.

The head of the Museu da Vida cited exhibitions, books and publications aimed at children, and stressed the importance of evaluating these experiences with the children afterwards in order to know which path to follow.

Ildeu took the opportunity to suggest that artists take part more actively in SBPC meetings, not only in a parallel event such as SBPC Cultural but as members of panels and debates with the scientists. The idea is to take advantage of the meeting's audience, which reaches 15,000 to 20,000 people, to talk about this relationship.

(Jornal da Ciência)

Stop bullying the ‘soft’ sciences (L.A.Times)

OP-ED

The social sciences are just that — sciences.

By Timothy D. Wilson

July 12, 2012

A student is seen at the UC Irvine archive doing research for her sociology dissertation. (Los Angeles Times / July 9, 2009)

Once, during a meeting at my university, a biologist mentioned that he was the only faculty member present from a science department. When I corrected him, noting that I was from the Department of Psychology, he waved his hand dismissively, as if I were a Little Leaguer telling a member of the New York Yankees that I too played baseball.

There has long been snobbery in the sciences, with the “hard” ones (physics, chemistry, biology) considering themselves to be more legitimate than the “soft” ones (psychology, sociology). It is thus no surprise that many members of the general public feel the same way. But of late, skepticism about the rigors of social science has reached absurd heights.

The U.S. House of Representatives recently voted to eliminate funding for political science research through the National Science Foundation. In the wake of that action, an opinion writer for the Washington Post suggested that the House didn’t go far enough. The NSF should not fund any research in the social sciences, wrote Charles Lane, because “unlike hypotheses in the hard sciences, hypotheses about society usually can’t be proven or disproven by experimentation.”

Lane’s comments echoed ones by Gary Gutting in the Opinionator blog of the New York Times. “While the physical sciences produce many detailed and precise predictions,” wrote Gutting, “the social sciences do not. The reason is that such predictions almost always require randomized controlled experiments, which are seldom possible when people are involved.”

This is news to me and the many other social scientists who have spent their careers doing carefully controlled experiments on human behavior, inside and outside the laboratory. What makes the criticism so galling is that those who voice it, or members of their families, have undoubtedly benefited from research in the disciplines they dismiss.

Most of us know someone who has suffered from depression and sought psychotherapy. He or she probably benefited from therapies such as cognitive behavioral therapy that have been shown to work in randomized clinical trials.

Problems such as child abuse and teenage pregnancy take a huge toll on society. Interventions developed by research psychologists, tested with the experimental method, have been found to lower the incidence of child abuse and reduce the rate of teenage pregnancies.

Ever hear of stereotype threat? It is the double jeopardy that people face when they are at risk of confirming a negative stereotype of their group. When African American students take a difficult test, for example, they are concerned not only about how well they will do but also about the possibility that performing poorly will reflect badly on their entire group. This added worry has been shown time and again, in carefully controlled experiments, to lower academic performance. But fortunately, experiments have also shown promising ways to reduce this threat. One intervention, for example, conducted in a middle school, reduced the achievement gap by 40%.

If you know someone who was unlucky enough to be arrested for a crime he didn’t commit, he may have benefited from social psychological experiments that have resulted in fairer lineups and interrogations, making it less likely that innocent people are convicted.

An often-overlooked advantage of the experimental method is that it can demonstrate what doesn’t work. Consider three popular programs that research psychologists have debunked: Critical Incident Stress Debriefing, used to prevent post-traumatic stress disorders in first responders and others who have witnessed horrific events; the D.A.R.E. anti-drug program, used in many schools throughout America; and Scared Straight programs designed to prevent at-risk teens from engaging in criminal behavior.

All three of these programs have been shown, with well-designed experimental studies, to be ineffective or, in some cases, to make matters worse. And as a result, the programs have become less popular or have changed their methods. By discovering what doesn’t work, social scientists have saved the public billions of dollars.

To be fair to the critics, social scientists have not always taken advantage of the experimental method as much as they could. Too often, for example, educational programs have been implemented widely without being adequately tested. But increasingly, educational researchers are employing better methodologies. For example, in a recent study, researchers randomly assigned teachers to a program called My Teaching Partner, which is designed to improve teaching skills, or to a control group. Students taught by the teachers who participated in the program did significantly better on achievement tests than did students taught by teachers in the control group.

Are the social sciences perfect? Of course not. Human behavior is complex, and it is not possible to conduct experiments to test all aspects of what people do or why. There are entire disciplines devoted to the experimental study of human behavior, however, in tightly controlled, ethically acceptable ways. Many people benefit from the results, including those who, in their ignorance, believe that science is limited to the study of molecules.

Timothy D. Wilson is a professor of psychology at the University of Virginia and the author of “Redirect: The Surprising New Science of Psychological Change.”

Scientific particles collide with social media to benefit of all (Irish Times)

The Irish Times – Thursday, July 12, 2012

Large Hadron Collider at Cern: the research body now has 590,000 followers on Twitter

MARIE BORAN

IN 2008 CERN switched on the Large Hadron Collider (LHC) in Geneva – around the same time it sent out its first tweet. Although the first outing of the LHC didn’t go according to plan, the Twitter account gained 10,000 followers within the first day, according to James Gillies, head of communications at Cern.

Speaking at the Euroscience Open Forum in Dublin this week, Gillies explained the role social media plays in engaging the public with the particle physics research its laboratory does. The Twitter account now has 590,000 followers and Cern broke important news via it in March 2010 by joyously declaring: “Experiment have seen collisions.”

“Why do we communicate at Cern? If you talk to the scientists who work there they will tell you it’s a good thing to do and they all want to do it,” Gillies said, adding that Cern is publicly funded so engaging with the people who pay the bills is important.

When the existence of the Higgs particle was announced last week, it wasn’t an exclusive press event. Live video was streamed across the web, questions were taken not only from journalists but also from Twitter followers, and Cern used this as a chance to announce jobs via Facebook.

While Cern appears to be the social media darling of the science world, other research institutes and scientists are still weighing up the pros and cons of platforms like Facebook, Twitter or YouTube.

There is a certain stigma attached to social networking sites, not just because much of the content is perceived as banal, but also because too much tweeting could be damaging to your image as a scientist.

Bora Zivkovic is blogs editor at Scientific American, organiser of the fast-growing science conference ScienceOnline and speaker at the social media panel this Saturday at the Euroscience Open Forum. He says the adoption of social media by scientists is slow but growing.

“Academics are quite risk-averse and are shy about trying new things that have a perceived potential to remove the edge they may have in the academic hierarchy, either through lost time or lost reputation.”

Zivkovic talks about fear of the “Sagan effect”, named after the late Carl Sagan. A talented astronomer and astrophysicist, he was loved by the public but snubbed by the science community.

“Many still see social media as self-promotion, which is still in some scientific circles viewed as a negative thing to do. The situation is reminiscent of the very slow adoption of email by researchers back in the early 1990s.

“Once the scientists figure out how to include social media in their daily workflow, realise it does not take away from their time but actually makes them more effective in reaching their academic goals, and realise that the ‘Sagan effect’ on reputation is a thing of the past, they will readily incorporate social media into their normal work.”

Many researchers still rely heavily on specialist mailing lists. The broadcast capability on social media is far greater and bespoke, claims Dr Matthew Rowe, research associate at the Knowledge Media Institute with the Open University.

“If I was to email people about some recent work I would presume that it would be marked as spam. However, if I was to announce the release of some work through social media, then a debate and conversation could evolve surrounding the topic; I have seen this happen many times on Facebook.”

Conversations on social media sites are often seen as trivial – for scientists, the end goal is “publish or perish”. Results must be published in a reputable academic journal and preferably cited by those in their area.

Twitter, it seems, can help. A 2011 paper from researcher Gunther Eysenbach found a correlation between Twitter activity and highly cited articles. The microblogging site may help citation rate or serve as a measure of how “citable” your paper may be.
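
For readers curious how such a correlation is typically measured, here is a toy sketch using a rank correlation. The data file and column names are hypothetical, and this is a generic illustration rather than Eysenbach's actual analysis.

```python
# Toy sketch: does early Twitter activity correlate with later citation counts?
# The CSV and its columns are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

papers = pd.read_csv("papers.csv")  # columns: tweets_first_week, citations_after_2y
rho, p = spearmanr(papers["tweets_first_week"], papers["citations_after_2y"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```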

In addition, a 2010 survey on Twitter found one-third of academics said they use it for sharing information with peers, communicating with students or as a real-time news source.

For some the argument for social media is the potential for connecting with volunteers and providing valuable data from the citizen scientist. Yolanda Melero Cavero’s MinkApp has connected locals with an effort to control the mink population in Scotland.

“The most interesting thing about MinkApp, for me, was the fact that the scientist was able to get 600 volunteers for her ecological study. Social media has the grassroots potential to engage with willing volunteers,” says Nancy Salmon, researcher at the department of occupational therapy at the University of Limerick.

Rowe offers some sage social media advice for academics: keep on topic and keep your language jargon-free.

But there’s always room for humour, as demonstrated by the Higgs boson jokes on Twitter and Facebook last week. As astronomer Phil Plait tweeted: “I’ve got 99.9999% problems, but a Higgs ain’t one.”

Hunter-gatherers, Westerners use same amount of energy, contrary to theory (PLoS)

Lindsay Morton
Public Library of Science

25-Jul-2012

Results contradict previously held idea that rising obesity is due to lowered energy expenditure

Modern lifestyles are generally quite different from those of our hunter-gatherer ancestors, a fact that some claim is the cause of the current rise in global obesity. But new results published July 25 in the open-access journal PLoS ONE find no difference between the energy expenditure of modern hunter-gatherers and Westerners, casting doubt on this theory.

The research team behind the study, led by Herman Pontzer of Hunter College in New York City, along with David Raichlen of the University of Arizona and Brian M. Wood of Stanford, measured daily energy expenditure (calories per day) among the Hadza, a population of traditional hunter-gatherers living in the open savannah of northern Tanzania. Despite spending their days trekking long distances to forage for wild plants and game, the Hadza burned no more calories each day than adults in the U.S. and Europe. The team ran several analyses accounting for the effects of body weight, body fat percentage, age, and gender. In all analyses, daily energy expenditure among the Hadza hunter-gatherers was indistinguishable from that of Westerners. The study was the first to measure energy expenditure in hunter-gatherers directly; previous studies had relied entirely on estimates.

These findings upend the long-held assumption that our hunter-gatherer ancestors expended more energy than modern populations, and challenge the view that obesity in Western populations results from decreased energy expenditure. Instead, the similarity in daily energy expenditure across a broad range of lifestyles suggests that habitual metabolic rates are relatively constant among human populations. This in turn supports the view that the current rise in obesity is due to increased food consumption, not decreased energy expenditure.
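
As a rough illustration of the kind of adjusted comparison described above (not the authors' actual analysis), one can regress energy expenditure on body size, body composition, age and sex, plus a group indicator, and ask whether the group term is distinguishable from zero. The data file and column names below are hypothetical.

```python
# Sketch of an adjusted group comparison of total energy expenditure (TEE),
# controlling for body mass, body fat percentage, age and sex.
# The CSV and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tee_by_group.csv")  # columns: tee_kcal, mass_kg, fat_pct, age, sex, group
df["log_tee"] = np.log(df["tee_kcal"])
df["log_mass"] = np.log(df["mass_kg"])

# A group coefficient near zero would mean expenditure is indistinguishable
# between the groups once body size and composition are accounted for.
model = smf.ols("log_tee ~ log_mass + fat_pct + age + C(sex) + C(group)", data=df).fit()
print(model.summary())
```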

The authors emphasize that physical exercise is nonetheless important for maintaining good health. In fact, the Hadza spend a greater percentage of their daily energy budget on physical activity than Westerners do, which may contribute to the health and vitality evident among older Hadza. Still, the similarity in daily energy expenditure between Hadza hunter-gatherers and Westerners suggests that we have more to learn about human physiology and health, particularly in non-Western settings.

“These results highlight the complexity of energy expenditure. It’s not simply a function of physical activity,” says Pontzer. “Our metabolic rates may be more a reflection of our shared evolutionary past than our diverse modern lifestyles.”

Citation: Pontzer H, Raichlen DA, Wood BM, Mabulla AZP, Racette SB, et al. (2012) Hunter-Gatherer Energetics and Human Obesity. PLoS ONE 7(7): e40503. doi:10.1371/journal.pone.0040503

European Commission backs calls for open access to scientific research (The Guardian)

Move follows announcement by UK government that it wants all taxpayer-funded research to be free to view by 2014

Reuters/guardian.co.uk, Tuesday 17 July 2012 14.41 BST

Neelie Kroes, European Commission vice-president for digital agenda, said: ‘Taxpayers should not have to pay twice for scientific research.’ Photograph: Georges Gobet/AFP/Getty Images

The European Commission, which controls one of the world’s largest science budgets, has backed calls for free access to publicly funded research in a move that could force a major change in the business model for publishers such as Reed Elsevier.

“Taxpayers should not have to pay twice for scientific research and they need seamless access to raw data,” said Neelie Kroes, European Commission vice-president for digital agenda.

The EC said on Tuesday that open access will be a “general principle” applied to grants awarded through the €80bn Horizon 2020 programme for research and innovation.

From 2014 all articles produced with funding from Horizon 2020 will have to be accessible and the goal is for 60% of European publicly funded research to be available by 2016.

The news follows the announcement by the British government that it wants all taxpayer-funded research to be free to view by 2014. David Willetts, the universities and science minister, told the Guardian: “If the taxpayer has paid for this research to happen, that work shouldn’t be put behind a paywall before a British citizen can read it.”

The most prestigious academic journals, such as Nature, Science and Cell, earn the bulk of their revenues through subscriptions from readers.

They have lucrative deals with university libraries, worth about £150m to £200m a year in the UK, to give access to the same scientists who produce and review, usually without payment, the research they publish.

Open-access journals, such as the Public Library of Science, are often internet-based and charge researchers a fee for publication, allowing free access for anyone after publication.

The open-access market has been growing rapidly over the past decade but still only accounts for about 3% of the £5.1bn global market for scholarly journals.

The subscription model has come under attack from some scientists, who argue that publishing companies are making fat profits on the back of taxpayer-funded research.

Elsevier publishes more than 2,000 journals with a staff of about 7,000. It made a profit last year of £768m on revenues of £2.1bn, giving a margin of about 37%.

Publishers argue that quality does not come cheap and their subscription charges reflect the need to maintain large editorial departments and databases of published research.

Máire Geoghegan-Quinn, European commissioner for research, innovation and science, swept this argument aside. “We must give taxpayers more bang for their buck,” she said in a statement. “Open access to scientific papers and data is an important means of achieving this.”

The commission’s move follows recent news that the European medicines regulator will open its data vaults to allow independent researchers to scrutinise results from drug companies’ trials.

“The EU’s decision to adopt a similar policy to that of the UK will mean that the transition time from subscription-based to open-access publishing will be substantially reduced,” Professor Adam Tickell, who was involved in a recent UK government-commissioned report on the issue, told Reuters.

Tickell, of the University of Birmingham, predicted a rapid and substantial reduction in the cost of subscriptions, adding: “With the support of the EU, UK government and major charities, such as the Wellcome Trust, open access to research findings will soon be a reality.”

A Century Of Weather Control (POP SCI)

Posted 7.19.12 at 6:20 pm – http://www.popsci.com

 

Keeping Pilots Updated, November 1930

It’s 1930 and, for obvious reasons, pilots want regular reports on the weather. What to do? Congress’s solution was to give the U.S. Weather Bureau cash to send them what they needed. It was a lot of cash, too: $1.4 million, or “more than one third the sum it spends annually for all of its work.”

About 13,000 miles of airway were monitored for activity, and reports were regularly sent via the now quaintly named “teletype”–an early fax machine, basically, that let a typed message be reproduced. Pilots were then radioed with the information.

From the article “Weather Man Makes the Air Safe.”

 

Battling Hail, July 1947

We weren’t shy about laying on the drama in this piece on hail–it was causing millions in damage across the country and we were sick of it. Our writer says, “The war against hail has been declared.” (Remember: this was only two years after World War II, which was a little more serious. Maybe our patriotism just wouldn’t wane.)

The idea was to scatter silver iodide as a form of “cloud seeding”–turning the moisture to snow before it hails. It’s a process that’s still toyed with today.

From the article “The War Against Hail.”

 

Hunting for a Tornado “Cure,” March 1958

1957 was a record-breaking year for tornadoes, and PopSci was forecasting even rougher skies for 1958. As described by an official tornado watcher: “They’re coming so fast and thick … that we’ve lost count.”

To try to stop it, researchers wanted to learn more. Meteorologists asked for $5 million more a year from Congress to be able to study tornadoes whirling through the Midwest’s Tornado Alley, then, hopefully, learn what they needed to do to stop them.

From the article “What We’re Learning About Tornadoes.”

 

Spotting Clouds With Nimbus, November 1963

Weather satellites were a boon to both forecasters and anyone affected by extreme weather. The powerful Hurricane Esther was discovered two days before anything else spotted it, leaving space engineers “justifiably proud.” The next satellite in line was the Nimbus, which Popular Science devoted multiple pages to covering, highlighting its ability to photograph cloud cover 24 hours a day and give us better insight into extreme weather.

Spoiler: the results really did turn out great, with Nimbus satellites paving the way for modern GPS devices.

From the article “The Weather Eye That Never Blinks.”

 

Saving Money Globally With Forecasts, November 1970

Optimism for weather satellites seemed to be reaching a high by the ’70s, with Popular Science recounting all the disasters predicted–how they “saved countless lives through early hurricane warnings”–and now even saying they’d save your vacation.

What they were hoping for then was an accurate five-day forecast for the world, which they predicted would save billions and make early warnings even better.

From the article “How New Weather Satellites Will Give You More Reliable Forecasts.”

 

Extreme Weather Alerts on the Radio, July 1979

Those weather alerts that come on your television during a storm–or at least one radio version of those–were documented by Popular Science in 1979. But rather than being something that anyone could tune in to, they were specialized radios you had to purchase, which seems like a less-than-great solution to the problem. But at this point the government had plans to set up weather monitoring stations near 90 percent of the country’s population, opening the door for people to find out fast what the weather situation was.

From the article “Weather-Alert Radios–They Could Save Your Life.”

 

Stopping “Bolts From the Blue,” May 1990

Here Popular Science let loose a whooper for anyone with a fear of extreme weather: lightning kills a lot more people every year than you think, and sometimes a lightning bolt will come and hit you even when there’s not a storm. So-called “bolts from the blue” were a part of the story on better predicting lightning, a phenomenon more manic than most types of weather. Improved sensors played a major part in better preparing people before a storm.

From the article “Predicting Deadly Lightning.”

 

Infrared Views of Weather, August 1983

Early access to computers let weather scientists get a 3-D, radar-based view of weather across the country. The system culled information from multiple sources and placed it in one viewable display. (The man pictured looks slightly bored for how revolutionary it is.) The system was an attempt to take global information and make it into “real-time local predictions.”

From the article “Nowcasting: New Weather Computers Pinpoint Deadly Storms.”

 

Modernizing the National Weather Service, August 1997

A year’s worth of weather detection for every American was coming at the price of “a Big Mac, fries, and a Coke,” the deputy director of the National Weather Service said in 1997. The computer age better tied together the individual parts of weather forecasting for the NWS, leaving a unified whole that could grab complicated meteorological information and interpret it in just a few seconds.

From the article “Weather’s New Outlook.”

 

Modeling Weather With Computers, September 2001

Computer simulations, we wrote, would help us predict future storms more accurately. But it took (at the time) the largest supercomputer around to give us the kinds of models we wanted. Judging by the image, we might’ve already made significant progress on the weather modeling front.

Anarchists attack science (Nature)

Armed extremists are targeting nuclear and nanotechnology workers.

Leigh Phillips
28 May 2012

Investigations of the shooting of nuclear-engineering head Roberto Adinolfi have confirmed the involvement of an eco-anarchist group. P. RATTINI/AFP/GETTY

A loose coalition of eco-anarchist groups is increasingly launching violent attacks on scientists.

A group calling itself the Olga Cell of the Informal Anarchist Federation International Revolutionary Front has claimed responsibility for the non-fatal shooting of a nuclear-engineering executive on 7 May in Genoa, Italy. The same group sent a letter bomb to a Swiss pro-nuclear lobby group in 2011; attempted to bomb IBM’s nanotechnology laboratory in Switzerland in 2010; and has ties with a group responsible for at least four bomb attacks on nanotechnology facilities in Mexico. Security authorities say that such eco-anarchist groups are forging stronger links.

On 11 May, the cell sent a four-page letter to the Italian newspaper Corriere della Sera claiming responsibility for the shooting of Roberto Adinolfi, the chief executive of Ansaldo Nucleare, the nuclear-engineering subsidiary of aerospace and defence giant Finmeccanica. Believed by authorities to be genuine, the letter is riddled with anti-science rhetoric. The group targeted Adinolfi because he is a “sorcerer of the atom”, it wrote. “Adinolfi knows well that it is only a matter of time before a European Fukushima kills on our continent.”

“Science in centuries past promised us a golden age, but it is pushing us towards self-destruction and total slavery,” the letter continues. “With this action of ours, we return to you a tiny part of the suffering that you, man of science, are pouring into this world.” The group also threatened to carry out further attacks.

The Italian Ministry of the Interior has subsequently beefed up security at thousands of potential political, industrial and scientific targets. The measures include assigning bodyguards to 550 individuals.

The Olga Cell, named after an imprisoned Greek anarchist, is part of the Informal Anarchist Federation, which, in April 2011, claimed responsibility for sending a parcel bomb that exploded at the offices of the Swiss nuclear lobby group, Swissnuclear, in Olten. A letter found in the remains of the bomb demanded the release of three individuals who had been detained for plotting an attack on IBM’s flagship nanotechnology facility in Zurich earlier that year. In a situation report published this month, the Swiss Federal Intelligence Service explicitly linked the federation to the IBM attack.

The Informal Anarchist Federation argues that technology, and indeed civilization, is responsible for the world’s ills, and that scientists are the handmaidens of capitalism. “Finmeccanica means bio- and nanotechnology. Finmeccanica means death and suffering, new frontiers of Italian capitalism,” the letter reads.

Gathering momentum
The cell says that it is uniting with eco-anarchist groups in other countries, including Mexico, Chile, Greece and the United Kingdom. Mexico has already seen similar attacks: in August 2011, a group called Individuals Tending Towards Savagery sent a parcel bomb that wounded two nanotechnology researchers at the Monterrey Institute of Technology. One received burns to his legs and a perforated eardrum and the other had his lung pierced by shrapnel (G. Herrera Corral Nature 476, 373; 2011). The package contained enough explosive to collapse part of the building, according to police, but failed to detonate properly.

Earlier that year, the same group sent two bombs to the nanotechnology facility at the Polytechnic University of the Valley of Mexico. One was intercepted before anyone could be harmed, but the second detonated, injuring a security guard. It is not clear how closely the group is tied to the Informal Anarchist Federation, but in online forums the two bodies offer “direct support” for each other’s activities and talk of a “blossoming” of a more organized eco-anarchist movement.

In the wake of the Mexican bombings, the Monterrey Institute installed metal detectors, began to use police sniffer dogs and started random inspections of vehicles and packages. After a letter bomb addressed to a nanotechnology researcher at the Polytechnic University of Pachuca in Hidalgo exploded in December last year, the institute installed a perimeter fence and scanners, and campuses across the state heightened security measures.

Italian police investigating the shooting say that they are concerned about the rise in violent action by anarchist groups amid Europe’s economic crisis. On 23 May, for example, members of the Informal Anarchist Federation attacked railway signals in Bristol, UK, causing severe transport delays. An online message from the group said that the targets had been chosen to disrupt employees of the Ministry of Defence and defence-technology businesses in the area, including Raytheon and QinetiQ.

The Swiss report also noted signs of “an increasing degree of international networking between perpetrators”. The level of risk to scientists depends on their field of work, says Simon Johner, a spokesman for the Swiss Federal Intelligence Service. “We are not able to tell them what to do. We can only make them aware of the dangers. It’s up to institutions to take preventative actions.” The agency is working with police forces, businesses and research communities to assess and tackle the threat.

“These people do not represent mainstream opinion. But I am still pretty frightened by this violence,” says Michael Hagmann, a biochemist and head of corporate communications for the Swiss Federal Laboratories for Materials Science and Technology near Zurich, a public-sector partner of the IBM facility that also does nanotechnology research.

“Just a few weeks after the attempted bombing, we were due to have a large conference on nanotechnology and we were really quite nervous” about going ahead with it, Hagmann says. “But we concluded that the public discussion was more important and didn’t want to scare people by having 20 police guarding us. It would have sent the wrong message.”

Nature 485, 561 (31 May 2012) doi:10.1038/485561a

*   *   *

Published online 22 August 2011 | Nature 476, 373 (2011) | doi:10.1038/476373a

Column: World View

Stand up against the anti-technology terrorists

Home-made bombs are being sent to physicists in Mexico. Colleagues around the world should ensure their own security, urges Gerardo Herrera Corral.

Gerardo Herrera Corral

My elder brother, Armando Herrera Corral, was this month sent a tube of dynamite by terrorists who oppose his scientific research. The home-made bomb, which was in a shoe-box-sized package labelled as an award for his personal attention, exploded when he pulled at the adhesive tape wrapped around it. My brother, director of the technology park at the Monterrey Institute of Technology in Mexico, was standing at the time, and suffered burns to his legs and a perforated eardrum. More severely injured by the blast was his friend and colleague Alejandro Aceves López, whom my brother had gone to see in his office to share a cup of coffee and open the award. Aceves López was sitting down when my brother opened the package; he took the brunt of the explosion in his chest, and shrapnel pierced one of his lungs.

Both scientists are now recovering from their injuries, but they were extremely fortunate to survive. The bomb failed to go off properly, and only a fraction of the 20-centimetre-long cylinder of dynamite ignited. The police estimate that the package contained enough explosive to take down part of the building, had it worked as intended.

The next day, I, too, was sent a suspicious package. I have been advised by the police not to offer details of why the package was judged of concern, but it arrived by an unusual procedure, and on a Sunday. It tested positive for explosives, and was taken away by the bomb squad, which declared a false alarm after finding that the parcel contained only books. My first reaction was to leave the country. Now, I am confused as to how I should respond.

As an academic scientist, why was my brother singled out in this way? He does not work in a field that is usually considered high-risk for terrorist activity, such as medical research on animals. He works in computer science, and Aceves López is an expert in robotics. I am a high-energy physicist and coordinate the Mexican contribution to research using the Large Hadron Collider at CERN, Europe’s particle-physics laboratory; I have worked in the field for 15 years.

An extremist anarchist group known as Individuals Tending to Savagery (ITS) has claimed responsibility for the attack on my brother. This is confirmed by a partially burned note found by the authorities at the bomb site, signed by the ITS and with a message along the lines of: “If this does not get to the newspapers we will produce more explosions. Wounding or killing teachers and students does not matter to us.”

In statements posted on the Internet, the ITS expresses particular hostility towards nanotechnology and computer scientists. It claims that nanotechnology will lead to the downfall of mankind, and predicts that the world will become dominated by self-aware artificial-intelligence technology. Scientists who work to advance such technology, it says, are seeking to advance control over people by ‘the system’. The group praises Theodore Kaczynski, the Unabomber, whose anti-technology crusade in the United States in 1978–95 killed three people and injured many others.

The group’s rhetoric is absurd, but I urge colleagues around the world to take the threat that it poses to researchers seriously. Information gathered by Mexican federal authorities and Interpol link it to actions in countries including Spain, France and Chile. In April this year, the ITS sent a bomb — similar to the one posted to my brother — to the head of the Nanotechnology Engineering Division at the Polytechnic University of Mexico Valley in Tultitlan, although that device did not explode. In May, the university received a second parcel bomb, with a message reading: “This is not a joke: last month we targeted Oscar Camacho, today the institution, tomorrow who knows? Open fire on nanotechnology and those who support it!”

“I believe that terror should not succeed in establishing fear and imposing conduct.”

The scientific community must be made aware of such organizations, and of their capacity for destruction. Nanotechnology-research institutes and departments, companies and professional associations must beef up their security procedures, particularly on how they receive and accept parcels and letters.

I would like to stand up and speak in this way because I believe that terror should not succeed in establishing fear and imposing conduct that takes us far from the freedom we enjoy. I would like the police to take these events seriously; they are becoming a real threat to society. I would also like to express my solidarity with the Monterrey Institute of Technology — the institution that gave me both financial support to pursue my undergraduate studies and high-level academic training.

To oppose technology is not an unacceptable way to think. We may well debate the desirability of further technical development in our society. Yet radical groups such as the ITS overlook a crucial detail: it is not technology that is the problem, but how we use it. After Alfred Nobel invented dynamite he became a rich man, because it found use in mining, quarrying, construction and demolition. But people can also decide to put dynamite into a parcel and address it to somebody with the intention of killing them.

Gerardo Herrera Corral is a physicist at the Research and Advanced Studies Centre of the National Polytechnic Institute of Mexico in Mexico City.

Disorderly Conduct: Probing the Role of Disorder in Quantum Coherence (Science Daily)

ScienceDaily (July 19, 2012) — A new experiment conducted at the Joint Quantum Institute (JQI)* examines the relationship between quantum coherence, an important aspect of certain materials kept at low temperature, and the imperfections in those materials. These findings should be useful in forging a better understanding of disorder, and in turn in developing better quantum-based devices, such as superconducting magnets.

Figure 1 (top): Two thin planes of cold atoms are held in an optical lattice by an array of laser beams. Still another laser beam, passed through a diffusing material, adds an element of disorder to the atoms in the form of a speckle pattern. Figure 2 (bottom): Interference patterns resulting when the two planes of atoms are allowed to collide. In (b) the amount of disorder is just right and the pattern is crisp. In (c) too much disorder has begun to wash out the pattern. In (a) the pattern is complicated by the presence of vortices among the atoms, vortices which are hard to see in this image taken from the side. (Credit: Matthew Beeler)

Most things in nature are imperfect at some level. Fortunately, imperfections — a departure, say, from an orderly array of atoms in a crystalline solid — are often advantageous. For example, copper wire, which carries so much of the world’s electricity, conducts much better if at least some impurity atoms are present.

In other words, a pinch of disorder is good. But there can be too much of this good thing. The issue of disorder is so important in condensed matter physics, and so difficult to understand directly, that some scientists have for some years been trying to simulate, with thin vapors of cold atoms, the behavior of electrons flowing through solids trillions of times more dense. With their ability to control the local forces over these atoms, physicists hope to shed light on the more complicated case of solids.

That’s where the JQI experiment comes in. Specifically, Steve Rolston and his colleagues have set up an optical lattice of rubidium atoms held at a temperature close to absolute zero. In such a lattice, atoms are held in orderly proximity not by natural inter-atomic forces but by the forces exerted by an array of laser beams. These atoms, moreover, constitute a Bose-Einstein condensate (BEC), a special condition in which they all belong to a single quantum state.

This is appropriate since the atoms are meant to be a proxy for the electrons flowing through a solid superconductor. In some so-called high-temperature superconductors (HTSC), the electrons move in planes of copper and oxygen atoms. These HTSC materials work, however, only if a fillip of impurity atoms, such as barium or yttrium, is present. Theorists have not adequately explained why this bit of disorder in the underlying material should be necessary for attaining superconductivity.

The JQI experiment has tried to supply palpable data that can illuminate the issue of disorder. In solids, atoms are a fraction of a nanometer (billionth of a meter) apart. At JQI the atoms are about a micron (a millionth of a meter) apart. Actually, the JQI atom swarm consists of a 2-dimensional disk. “Disorder” in this disk consists not of impurity atoms but of “speckle.” When a laser beam strikes a rough surface, such as a cinderblock wall, it is scattered in a haphazard pattern. This visible speckle effect is what is used to slightly disorganize the otherwise perfect arrangement of Rb atoms in the JQI sample.

In superconductors, the slight disorder in the form of impurities ensures a very orderly “coherence” of the supercurrent. That is, the electrons moving through the solid flow as a single coordinated train of waves and retain their cohesiveness even in the midst of impurity atoms.

In the rubidium vapor, analogously, the slight disorder supplied by the speckle laser ensures that the Rb atoms retain their coordinated participation in the unified (BEC) quantum wave structure. But only up to a point. If too much disorder is added — if the speckle is too large — then the quantum coherence can go away. Probing this transition numerically was the object of the JQI experiment. The setup is illustrated in figure 1.

And how do you know when you’ve gone too far with the disorder? How do you know that quantum coherence has been lost? By making coherence visible.

The JQI scientists cleverly pry their disk-shaped gas of atoms into two parallel sheets, like two thin crepes, one on top of the other. Thereafter, if all the laser beams are turned off, the two planes collide like miniature galaxies. If the atoms are in a coherent condition, their collision results in a crisp interference pattern, showing up on a video screen as a series of high-contrast dark and light stripes.

If, however, the imposed disorder is too high, resulting in a loss of coherence among the atoms, then the interference pattern is washed out. Figure 2 shows this effect at work. Frames b and c respectively show what happens when the degree of disorder is just right and when it is too much.

“Disorder figures in about half of all condensed matter physics,” says Steve Rolston. “What we’re doing is mimicking the movement of electrons in 3-dimensional solids using cold atoms in a 2-dimensional gas. Since there don’t seem to be any theoretical predictions to help us understand what we’re seeing we’ve moved into new experimental territory.”

Where does the JQI work go next? Well, in figure 2a you can see that the interference pattern is still visible but somewhat garbled. That arises from the fact that for this amount of disorder several vortices — miniature whirlpools of atoms — have sprouted within the gas. Exactly such vortices among electrons emerge in superconductivity, limiting their ability to maintain a coherent state.

The new results are published in the New Journal of Physics: “Disorder-driven loss of phase coherence in a quasi-2D cold atom system,” by M C Beeler, M E W Reed, T Hong, and S L Rolston.

Another of the JQI scientists, Matthew Beeler, underscores the importance of understanding the transition from the coherent state to the incoherent state owing to the fluctuations introduced by disorder: “This paper is the first direct observation of disorder causing these phase fluctuations. To the extent that our system of cold atoms is like a HTSC superconductor, this is a direct connection between disorder and a mechanism which drives the system from superconductor to insulator.”

Society’s Response to Climate Change Is Critical (Science Daily)

ScienceDaily (July 18, 2012) — Lancaster University (UK) scientists have proposed a new way of considering society’s reactions to global warming by linking societal actions to temperature change.

Using this framework to analyse climate change policies aimed at avoiding dangerous climate change, they suggest that society will have to become fifty times more responsive to global temperature change than it has been since 1990.

The researchers, Dr Andy Jarvis, Dr David Leedal and Professor Nick Hewitt from the Lancaster Environment Centre, also show that if global energy use continues to grow as it has done historically, society would have to up its decarbonization efforts from its historic (160 year) value of 0.6% per year to 13% per year.
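
To get a feel for the gap between those two rates, here is a tiny back-of-the-envelope sketch in Python (our illustration, not the authors’ calculation; it simply treats each quoted rate as a compound annual reduction in the carbon intensity of energy, over an arbitrary horizon):

import math

historic_rate = 0.006   # 0.6% per year, the 160-year historical value cited above
required_rate = 0.13    # 13% per year, the rate the researchers say would be needed
years = 38              # e.g. 2012 to 2050, an illustrative horizon only

for label, rate in [("historic 0.6%/yr", historic_rate), ("required 13%/yr", required_rate)]:
    remaining = (1.0 - rate) ** years
    print(f"{label}: carbon intensity falls to {remaining:.1%} of today's level after {years} years")

Compounded over a few decades, the historical rate leaves carbon intensity close to where it started, while the required rate would cut it to a tiny fraction, which is the scale of the change the authors are describing.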

Dr Andy Jarvis said: “In order to avoid dangerous climate change, society will have to become much more responsive to the risks and damages that growth in global greenhouse gas emissions impose.”

The research, published in Nature Climate Change on 15 July has found that the global growth of new renewable sources of energy since 1990 constitutes a climate-society feedback of a quarter percent per year in the growth rate of CO2 emissions per degree temperature rise.

Professor Nick Hewitt said “If left unmanaged, the climate damages that we experience will motivate society to act to a greater or lesser degree. This could either amplify the growth in greenhouse gas emissions as we repair these damages or dampen them through loss of economic performance. Both are unpredictable and potentially dangerous.”

Dummies guide to the latest “Hockey Stick” controversy (Real Climate)

http://www.realclimate.org

 — gavin @ 18 February 2005

by Gavin Schmidt and Caspar Amman

Due to popular demand, we have put together a ‘dummies guide’ which tries to describe what the actual issues are in the latest controversy, in language even our parents might understand. A pdf version is also available. More technical descriptions of the issues can be seen here and here.

This guide is in two parts, the first deals with the background to the technical issues raised by McIntyre and McKitrick (2005) (MM05), while the second part discusses the application of this to the original Mann, Bradley and Hughes (1998) (MBH98) reconstruction. The wider climate science context is discussed here, and the relationship to other recent reconstructions (the ‘Hockey Team’) can be seen here.

NB. All the data that were used in MBH98 are freely available for download at ftp://holocene.evsc.virginia.edu/pub/sdr/temp/nature/MANNETAL98/ (and also as supplementary data at Nature) along with a thorough description of the algorithm.
Part I: Technical issues:

1) What is principal component analysis (PCA)?

This is a mathematical technique that is used (among other things) to summarize the data found in a large number of noisy records so that the essential aspects can be more easily seen. The most common patterns in the data are captured in a number of ‘principal components’ which describe some percentage of the variation in the original records. Usually only a limited number of components (‘PC’s) have any statistical significance, and these can be used instead of the larger data set to give basically the same description.
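
As a rough illustration of the idea, here is a minimal Python sketch (our own, not code from any of the papers discussed here; the number of records, the noise level and the shared signal are all made-up assumptions) showing how PCA compresses many noisy series that share a common pattern into a few components:

import numpy as np

rng = np.random.default_rng(0)
n_records, n_years = 70, 600                           # e.g. 70 proxy series, 600 years (illustrative)
common_signal = np.cumsum(rng.normal(size=n_years))    # one pattern shared by every record
data = np.array([common_signal + rng.normal(scale=5.0, size=n_years)
                 for _ in range(n_records)])

# Center each record, then take the singular value decomposition:
# the rows of vt are the principal components (patterns in time).
centered = data - data.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)                        # fraction of variance per PC

print("variance explained by the first 3 PCs:", explained[:3])
pc1 = vt[0]                                            # the dominant shared pattern

Here the first PC should recover the shared signal and explain most of the variance, while the remaining PCs mostly describe noise.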

2) What do these individual components represent?

Often the first few components represent something recognisable and physically meaningful (at least in climate data applications). If a large part of the data set has a trend, then the mean trend may show up as one of the most important PCs. Similarly, if there is a seasonal cycle in the data, that will generally be represented by a PC. However, remember that PCs are just mathematical constructs. By themselves they say nothing about the physics of the situation. Thus, in many circumstances, physically meaningful timeseries are ‘distributed’ over a number of PCs, each of which individually does not appear to mean much. Different methodologies or conventions can make a big difference in which pattern comes out on top. If the aim of the PCA analysis is to determine the most important pattern, then it is important to know how robust that pattern is to the methodology. However, if the idea is to more simply summarize the larger data set, the individual ordering of the PCs is less important, and it is more crucial to make sure that as many significant PCs as possible are included.

3) How do you know whether a PC has significant information?

This determination is usually based on a ‘Monte Carlo’ simulation (so-called because of the random nature of the calculations). For instance, if you take 1000 sets of random data (that have the same statistical properties as the data set in question), and you perform the PCA analysis 1000 times, there will be 1000 examples of the first PC. Each of these will explain a different amount of the variation (or variance) in the original data. When ranked in order of explained variance, the tenth one down then defines the 99% confidence level: i.e. if your real PC explains more of the variance than 99% of the random PCs, then you can say that this is significant at the 99% level. This can be done for each PC in turn. (This technique was introduced by Preisendorfer et al. (1981), and is called the Preisendorfer N-rule).

The figure to the right gives two examples of this. Here each PC is plotted against the amount of fractional variance it explains. The blue line is the result from the random data, while the blue dots are the PC results for the real data. It is clear that at least the first two are significantly separated from the random noise line. In the other case, there are 5 (maybe 6) red crosses that appear to be distinguishable from the red line random noise. Note also that the first (‘most important’) PC does not always explain the same amount of the original data.
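
A minimal sketch of that Monte Carlo test follows (again our own illustration, not the code used in any of the studies; it reuses the records-by-time array data from the sketch above, and for simplicity the random surrogates are white noise, whereas a real test would match the statistical properties of the records):

import numpy as np

def explained_variance(matrix):
    # Fraction of variance explained by each PC of a records-by-time array.
    centered = matrix - matrix.mean(axis=1, keepdims=True)
    s = np.linalg.svd(centered, compute_uv=False)
    return s**2 / np.sum(s**2)

def noise_threshold(shape, n_trials=1000, quantile=0.99, seed=1):
    # 99th percentile of explained variance, per PC rank, for purely random data.
    rng = np.random.default_rng(seed)
    draws = np.array([explained_variance(rng.normal(size=shape))
                      for _ in range(n_trials)])
    return np.quantile(draws, quantile, axis=0)

real = explained_variance(data)
threshold = noise_threshold(data.shape)
n_significant = int(np.sum(real > threshold))
print("PCs significant at the 99% level:", n_significant)

With the synthetic data above, only the first PC should clear the noise threshold; with real proxy networks the count depends on the data and on the surrogates used.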

4) What do different conventions for PC analysis represent?

Some different conventions exist regarding how the original data should be normalized. For instance, the data can be normalized to have an average of zero over the whole record, or over a selected sub-interval. The variance of the data is associated with departures from whatever mean was selected. So the pattern of data that shows the biggest departure from the mean will dominate the calculated PCs. If there is an a priori reason to be interested in departures from a particular mean, then this is a way to make sure that those patterns move up in the PC ordering. Changing conventions means that the explained variance of each PC can be different, the ordering can be different, and the number of significant PCs can be different.
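
A small sketch of how the centering convention can change the outcome (illustrative only; the choice of the last 100 steps as the “calibration window” is arbitrary, and the data array is reused from the first sketch):

import numpy as np

def pcs_with_convention(matrix, window=None):
    # window=None centers on the full-record mean; otherwise on a sub-interval mean.
    if window is None:
        mean = matrix.mean(axis=1, keepdims=True)
    else:
        mean = matrix[:, window].mean(axis=1, keepdims=True)
    centered = matrix - mean
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return vt, s**2 / np.sum(s**2)

pcs_full, var_full = pcs_with_convention(data)                           # whole-record mean
pcs_cal, var_cal = pcs_with_convention(data, window=slice(-100, None))   # sub-interval mean
print("leading PC, full-record centering, explains:", var_full[0])
print("leading PC, calibration-window centering, explains:", var_cal[0])

Records that depart strongly from the chosen mean dominate the leading PCs, so the ordering and explained variances shift with the convention.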

5) How can you tell whether you have included enough PCs?

This is rather easy to tell. If your answer depends on the number of PCs included, then you haven’t included enough. Put another way, if the answer you get is the same as if you had used all the data without doing any PC analysis at all, then you are probably ok. However, the reason why the PC summaries are used in the first place in paleo-reconstructions is that using the full proxy set often runs into the danger of ‘overfitting’ during the calibration period (the time period when the proxy data are trained to match the instrumental record). This can lead to a decrease in predictive skill outside of that window, which is the actual target of the reconstruction. So in summary, PC selection is a trade off: on one hand, the goal is to capture as much variability of the data as represented by the different PCs as possible (particularly if the explained variance is small), while on the other hand, you don’t want to include PCs that are not really contributing any more significant information.
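
One way to make that check concrete (again just an illustrative sketch reusing the arrays above) is to rebuild the data from the first k components and watch when a derived quantity, here the mean series across all records, stops changing as k grows:

import numpy as np

centered = data - data.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
full_mean = centered.mean(axis=0)          # the answer obtained from all the data

for k in (1, 2, 5, 10):
    approx = (u[:, :k] * s[:k]) @ vt[:k]   # rank-k reconstruction of the records
    err = np.abs(approx.mean(axis=0) - full_mean).max()
    print(f"k={k:2d}  max difference from using all the data: {err:.4f}")

Once the difference stops shrinking appreciably with additional components, the retained PCs are capturing essentially the same information as the full data set.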

Part II: Application to the MBH98 ‘Hockey Stick’

1) Where is PCA used in the MBH methodology?

When incorporating many tree ring networks into the multi-proxy framework, it is easier to use a few leading PCs rather than 70 or so individual tree ring chronologies from a particular region. The trees are often very closely located and so it makes sense to summarize the general information they all contain in relation to the large-scale patterns of variability. The relevant signal for the climate reconstruction is the signal that the trees have in common, not each individual series. In MBH98, the North American tree ring series were treated like this. There are a number of other places in the overall methodology where some form of PCA was used, but they are not relevant to this particular controversy.

2) What is the point of contention in MM05?

MM05 contend that the particular PC convention used in MBH98 in dealing with the N. American tree rings selects for the ‘hockey stick’ shape and that the final reconstruction result is simply an artifact of this convention.

3) What convention was used in MBH98?

MBH98 were particularly interested in whether the tree ring data showed significant differences from the 20th century calibration period, and therefore normalized the data so that the mean over this period was zero. As discussed above, this will emphasize records that have the biggest differences from that period (either positive or negative). Since the underlying data have a ‘hockey stick’-like shape, it is therefore not surprising that the most important PC found using this convention resembles the ‘hockey stick’. There are actually two significant PCs found using this convention, and both were incorporated into the full reconstruction.

PC1 vs PC4

4) Does using a different convention change the answer?

As discussed above, a different convention (MM05 suggest one that has zero mean over the whole record) will change the ordering, significance and number of important PCs. In this case, the number of significant PCs increases to 5 (maybe 6) from 2 originally. This is the difference between the blue points (MBH98 convention) and the red crosses (MM05 convention) in the first figure. Also PC1 in the MBH98 convention moves down to PC4 in the MM05 convention. This is illustrated in the figure on the right, the red curve is the original PC1 and the blue curve is MM05 PC4 (adjusted to have same variance and mean). But as we stated above, the underlying data has a hockey stick structure, and so in either case the ‘hockey stick’-like PC explains a significant part of the variance. Therefore, using the MM05 convention, more PCs need to be included to capture the significant information contained in the tree ring network.

This figure shows the difference in the final result whether you use the original convention with 2 PCs (blue) or the MM05 convention with 5 PCs (red). The MM05-based reconstruction is slightly less skillful when judged over the 19th century validation period, but is otherwise very similar. In fact, any calibration convention will lead to approximately the same answer as long as the PC decomposition is done properly and one determines how many PCs are needed to retain the primary information in the original data.

different conventions
5) What happens if you just use all the data and skip the whole PCA step?

This is a key point. If the PCs being used were inadequate in characterizing the underlying data, then the answer you get using all of the data will be significantly different. If, on the other hand, enough PCs were used, the answer should be essentially unchanged. This is shown in the figure below. The reconstruction using all the data is in yellow (the green line is the same thing but with the ‘St-Anne River’ tree ring chronology taken out). The blue line is the original reconstruction, and as you can see the correspondence between them is high. The validation is slightly worse, illustrating the trade-off mentioned above, i.e. when using all of the data, over-fitting during the calibration period (due to the increased number of degrees of freedom) leads to a slight loss of predictability in the validation step.

No PCA comparison

6) So how do MM05 conclude that this small detail changes the answer?

MM05 claim that the reconstruction using only the first 2 PCs with their convention is significantly different from MBH98. Since PCs 3, 4 and 5 (at least) are also significant, they are leaving out good data. It is mathematically wrong to retain the same number of PCs if the convention of standardization is changed. In this case, it causes a loss of information that is very easily demonstrated: first, by showing that any such results do not resemble the results from using all the data, and second, by checking the validation of the reconstruction for the 19th century. The MM version of the reconstruction can be matched by simply removing the N. American tree ring data along with the ‘St Anne River’ Northern treeline series from the reconstruction (shown in yellow below). Compare this curve with the ones shown above.

No N. American tree rings

As you might expect, throwing out data also worsens the validation statistics, as can be seen by eye when comparing the reconstructions over the 19th century validation interval. Compare the green line in the figure below to the instrumental data in red. To their credit, MM05 acknowledge that their alternate 15th century reconstruction has no skill.

validation period

7) Basically then the MM05 criticism is simply about whether selected N. American tree rings should have been included, not that there was a mathematical flaw?

Yes. Their argument since the beginning has essentially not been about methodological issues at all, but about ‘source data’ issues. Particular concerns with the “bristlecone pine” data were addressed in the followup paper MBH99 but the fact remains that including these data improves the statistical validation over the 19th Century period and they therefore should be included.

Hockey Team

8) So does this all matter?

No. If you use the MM05 convention and include all the significant PCs, you get the same answer. If you don’t use any PCA at all, you get the same answer. If you use a completely different methodology (i.e. Rutherford et al, 2005), you get basically the same answer. Only if you remove significant portions of the data do you get a different (and worse) answer.

9) Was MBH98 the final word on the climate of last millennium?

Not at all. There has been significant progress on many aspects of climate reconstructions since MBH98. Firstly, there are more and better quality proxy data available. There are new methodologies such as described in Rutherford et al (2005) or Moberg et al (2005) that address recognised problems with incomplete data series and the challenge of incorporating lower resolution data into the mix. Progress is likely to continue on all these fronts. As of now, all of the ‘Hockey Team’ reconstructions (shown left) agree that the late 20th century is anomalous in the context of the last millennium, and possibly the last two millennia.

The climate of the climate change debate is changing (The Guardian)

Quantifying how greenhouse gases contribute to extreme weather is a crucial step in calculating the cost of human influence

Myles Allen

guardian.co.uk, Wednesday 11 July 2012 12.08 BST

Climate change could trap hundreds of millions in disaster areas, report claims

This week, climate change researchers were able to attribute recent examples of extreme weather to the effects of human activity on the planet’s climate systems for the first time. Photograph: Rizwan Tabassum/AFP/Getty Images

The climate may have changed this week. Not the physical climate, but the climate of the climate change debate. Tuesday marked the publication of a series of papers examining the factors behind extreme weather events in 2011. Nothing remarkable about that, you might think, except, if all goes well, this will be the first of a regular, annual assessment quantifying how external drivers of climate contribute to damaging weather.

Some of these drivers, like volcanoes, are things we can do nothing about. But others, like rising levels of greenhouse gases, we can. And quantifying how greenhouse gases contribute to extreme weather is a crucial step in pinning down the real cost of human influence on climate. While most people think of climate change in terms of shrinking ice-sheets and slowly rising sea levels, it is weather events that actually do harm.

This week also saw a workshop in Oxford for climate change negotiators from developing countries. Again, nothing remarkable about that except, for the first time, the issue of “loss and damage” was top of the agenda. For years negotiations have been over emission reductions and sharing the costs of adaptation. Now the debate is turning to: who is going to pay for damage done?

It is a good time to ask, since the costs that can unambiguously be attributed to human-induced climate change are still relatively small. Although Munich Re estimates that weather events in 2011 cost more than $100bn and claimed many thousands of lives, only a few of these events were clearly made more likely by human influence. Others may have been made less likely, but occurred anyway – chance remains the single dominant factor in when and where a weather event occurs. For the vast majority of events, we simply don’t yet know either way.

Connecting climate change and specific weather events is only one link in the causal chain between greenhouse gas emissions and actual harm. But it is a crucial link. If, as planned, the assessment of 2011 becomes routine, we should be able to compare actual weather-related damage, in both good years and bad, with the damage that might have been in a world without human influence on climate. This puts us well on our way to a global inventory of climate change impacts. And as soon as that is available, the question of compensation will not be far behind.

The presumption in climate change negotiations is that “countries with historically high emissions” would be first in line to foot the bill for loss and damage. There may be some logic to this, but if you are an African (or Texan) farmer hit by greenhouse-exacerbated drought, is the European or American taxpayer necessarily the right place to look for compensation? As any good lawyer knows, there is no point in suing a man with empty pockets.

The only institution in the world that could deal with the cost of climate change without missing a beat is the fossil fuel industry: BP took a $30bn charge for Deepwater Horizon, very possibly more than the total cost of climate change damages last year, and was back in profit within months. Of the $5 trillion per year we currently spend on fossil energy, a small fraction would take care of all the loss and damage attributable to climate change for the foreseeable future several times over.

Such a pay-as-you-go liability regime would not address the impacts of today’s emissions on the 22nd century. Governments cannot wash their hands of this issue entirely. But we have been so preoccupied with the climate of the 22nd century that we have curiously neglected to look after the interests of those being affected by climate change today.

So rather than haggling over emission caps and carbon taxes, why not start with a simple statement of principle: standard product liability applies to anyone who sells or uses fossil fuels, including liability for any third-party side-effects. There is no need at present to say what these side-effects might be – indeed, the scientific community does not yet know. But we are getting there.

A new boson in sight (FAPESP)

CERN physicists have discovered a new particle that appears to be the Higgs boson

MARCOS PIVETTA | Online edition, 19:46, July 4, 2012

Proton collisions in which four high-energy electrons are observed (green lines and red towers). The event shows characteristics expected from the decay of a Higgs boson, but is also consistent with standard-model background processes

from Lindau, Germany*

The world’s largest laboratory may have found the particle that gives mass to all other particles, the long-sought Higgs boson. It was the missing piece needed to complete a scientific puzzle called the standard model, the theoretical framework formulated over recent decades to explain the particles and forces present in the visible matter of the Universe. After analyzing trillions of proton collisions produced in 2011 and in part of this year at the Large Hadron Collider (LHC), physicists from the two largest independently run experiments at the European Organization for Nuclear Research (CERN) announced on Wednesday (the 4th), near Geneva, Switzerland, the discovery of a new particle that has almost all the characteristics of the Higgs boson, although they cannot yet say for certain whether it is specifically that boson or some other kind.

“We observe in our data clear signs of a new particle in the mass region around 126 GeV (giga-electron-volts),” said physicist Fabiola Gianotti, spokesperson for the ATLAS experiment. “But we need a little more time to prepare the results for publication.” The information coming from the other CERN experiment, CMS, is practically identical. “The results are preliminary, but the signals we see around the 125 GeV mass region are dramatic. It really is a new particle. We know it must be a boson, and it is the heaviest boson we have found,” said the CMS experiment’s spokesperson, physicist Joe Incandela. If it really does have a mass of 125 or 126 GeV, the new particle is about as heavy as an atom of the chemical element iodine.

In both experiments, the statistical confidence of the analyses reached the level scientists call 5 sigma. At that level, the chance of error is one in three million. In other words, with this degree of certainty it is possible to speak of a discovery; what is not yet known in detail is the nature of the particle found. “It is incredible that this discovery has happened during my lifetime,” commented Peter Higgs, the British theoretical physicist who, 50 years ago, alongside other scientists, predicted the existence of this type of boson. Later this month, a paper with the LHC data is due to be submitted to a scientific journal. By the end of the year, when the accelerator will be shut down for maintenance for at least a year and a half, more data should be produced by the two experiments.

“I’ve been laughing all day”
In Lindau, a small town in southern Germany on the shore of Lake Constance at the border with Austria and Switzerland, where the 62nd Nobel Laureate Meeting is taking place this week, researchers celebrated the news from the CERN experiments. Since this year’s meeting was devoted to physics, there was no shortage of laureates of science’s highest honor on hand to comment on the feat. “We don’t know if it is the (Higgs) boson, but it is a boson,” said theoretical physicist David J. Gross of the University of California, winner of the 2004 Nobel for the discovery of asymptotic freedom. “I’ve been laughing all day.” Experimental physicist Carlo Rubbia, former director-general of CERN and winner of the 1984 Nobel for work that led to the identification of two types of bosons (the W and Z), took the same line. “We are looking at a milestone,” he said.

Perhaps with slightly less enthusiasm, but still acknowledging the enormous importance of the CERN finding, two other Nobel laureates gave their opinion on the news of the day. “It is something we have been waiting for for years,” said Dutch theoretical physicist Martinus Veltman, who received the prize in 1999. “The standard model has gained another degree of validity.” For American cosmologist George Smoot, winner of the 2006 Nobel for the discovery of the cosmic background radiation (a relic of the Big Bang, the primordial explosion that created the Universe), it should still take another two or three years for scientists to really know what kind of new particle has actually been discovered. If the new particle is not the Higgs boson, Smoot said, it would be “wonderful if it were something related to dark matter,” a mysterious component that, alongside visible matter and the even less understood dark energy, would be one of the pillars of the Universe.

It is not possible to measure particles with the properties of the Higgs boson directly, but their existence, however fleeting, would leave traces, and those traces can be detected in a particle accelerator as powerful as the LHC. Unstable and short-lived, Higgs bosons survive a minuscule fraction of a second before decaying into lighter particles, which in turn also decay and give rise to even lighter particles. The standard model predicts that, depending on its mass, the Higgs boson should decay through different channels, that is, into distinct combinations of lighter particles, such as two photons or four leptons. In the CERN experiments, in which about 6,000 physicists took part, nearly unequivocal evidence was found of the decay modes that would be the typical signature of the Higgs boson.

*The journalist Marcos Pivetta traveled to Lindau at the invitation of the DAAD (German Academic Exchange Service)

Preventing environmental catastrophes (FAPERJ)

Vilma Homero

05/07/2012

Nelson Fernandes / UFRJ

New methods can predict where and when landslides will occur in the mountain region

When several areas of Nova Friburgo, Petrópolis and Teresópolis suffered landslides in January 2011, burying more than a thousand people under tons of mud and debris, the question left hanging was whether the disaster could have been mitigated. If the Institute of Geosciences of the Federal University of Rio de Janeiro (UFRJ) has its way, the consequences of environmental cataclysms like these will become ever smaller. To that end, researchers are developing a series of multidisciplinary projects to make risk-analysis systems viable. One of them is Prever, which, supported by computer programs, combines advances in remote sensing, geoprocessing, geomorphology and geotechnics with mathematical weather-prediction modeling for the areas most susceptible to landslides, such as the mountain region. “Although the circumstances of the various municipalities in that region are quite different, what they have in common is a lack of methodologies aimed at predicting this type of risk. The essential thing now is to develop methods capable of predicting the spatial and temporal location of these processes; that is, knowing ‘where’ and ‘when’ these landslides may occur,” explains geologist Nelson Ferreira Fernandes, professor in UFRJ’s Department of Geography and a FAPERJ Scientist of Our State.

To develop real-time risk-prediction methods that include mass movements triggered in response to rainfall, the researchers are building a mapping based on successive satellite images, which are cross-referenced with geological and geotechnical maps. “Prever combines climate-simulation models and forecasts of extreme rainfall events, developed in meteorology, with mathematical prediction models, plus the information produced by geomorphology and geotechnics, which indicates the areas most susceptible to landslides. In this way we can draw up real-time risk forecasts, classifying the results according to the severity of that risk, which varies continuously in space and time,” explains Nelson.

To this end, the Departments of Geography, Geology and Meteorology of UFRJ’s Institute of Geosciences are joining forces with the Faculty of Geology of the Rio de Janeiro State University (Uerj) and the Department of Civil Engineering of the Pontifical Catholic University (PUC-Rio). By overlaying the information, the resulting images can point to the areas most sensitive to landslides. “By adding this academic knowledge to data from state agencies, such as the Disaster Analysis Unit (Nade) of the Department of Mineral Resources (DRM-RJ), which provides technical support to the Civil Defense, we will not only be constantly updating the maps used today by state government agencies and the Civil Defense, but also enabling more precise planning for decision-making.”

Press image / UFRJ

A simulation shows the possibility of a mass landslide in the Jacarepaguá region

This new mapping also means better quality, greater precision and more detailed images. “Obviously, with better instruments in hand, meaning more detailed and accurate maps, public managers will also be able to plan and act more accurately and in real time,” says Nelson. According to the researcher, these maps need constant updating to keep pace with how human occupation interferes with the topography of the various regions. “This has been happening through the cutting of slopes, the occupation of landfilled areas, or changes resulting from the drainage of rivers. All of this alters the topography and, in the case of heavier and more prolonged rains, can make certain soils more prone to landslides or to flooding,” Nelson explains.

But disaster and environmental-risk analysis systems also encompass other lines of research. Prever works along two distinct lines of action. “One of them is climate, in which we detect the areas where there will be a long-term increase in rainfall and provide information to decision-making and planning bodies. The other is very short-term forecasting, so-called nowcasting.” On the long-term side, Professor Ana Maria Bueno Nunes, of the same university’s Department of Meteorology, has been working on the project “Implementation of a Regional Modeling System: Weather and Climate Studies,” under her coordination, with the goal of reconstructing the hydroclimate of South America as an extension of that project.

“By combining satellite precipitation data with information from atmospheric stations, it is possible, through computational modeling, to produce precipitation estimates. We can thus not only know when heavier or more prolonged rains will occur, but also look at past maps to see which convergence of factors produced a disaster. The reconstruction is a way of studying the past to understand present scenarios that look similar. And with that we help to improve the forecast models,” says Ana. This information, which will initially serve academic and scientific use, will yield ever more detailed data on how the heavy rains form that are capable of causing floods in certain areas. “This will make it possible not only to better understand the conditions under which certain calamities happen, but also to predict when those conditions may recur. With the project we are also training even more specialized human resources in this area,” says the researcher, whose work is supported by a Research Grant (APQ 1).

Also part of the project, Professor Gutemberg Borges França, of UFRJ, explains that there are three types of weather forecast: synoptic, which covers from about 6 hours up to seven days over a few thousand kilometers, such as the South American continent; mesoscale, which covers from about 6 hours to two days over a few hundred kilometers, such as the state of Rio de Janeiro; and short-term, or nowcasting, which ranges from a few minutes up to 3 to 6 hours over a specific area of a few kilometers, such as the metropolitan region of Rio de Janeiro, for example.

If long-term forecasts are important, short-term forecasts, or nowcasting, are too. According to Gutemberg, current numerical forecast models are still inadequate for short-term prediction, which ends up being done largely on the basis of the meteorologist’s experience, by interpreting information from the various available data sources, such as satellite images, surface and upper-air weather stations, radar and sodar (Sonic Detection and Ranging), and numerical models. “Even today, however, the meteorologist lacks objective tools to help integrate these different pieces of information and produce a more accurate short-term forecast,” argues Gutemberg.

Rio de Janeiro currently has satellite receiving stations, an upper-air (radiosonde) station that generates atmospheric profiles, surface weather stations and radar. Since 2005, the Applied Meteorology Laboratory of UFRJ’s Department of Meteorology has been developing short-term forecasting tools using computational intelligence, with the aim of improving forecasts of extreme weather events for Rio de Janeiro. “With computational intelligence, we get this information faster and more accurately,” he summarizes.

© FAPERJ – All articles may be reproduced, provided the source is cited.

This summer is ‘what global warming looks like’ (AP) + related & reactions

Jul 3, 1:10 PM EDT

By SETH BORENSTEIN
AP Science Writer

AP Photo/Matthew Barakat

WASHINGTON (AP) — Is it just freakish weather or something more? Climate scientists suggest that if you want a glimpse of some of the worst of global warming, take a look at U.S. weather in recent weeks.

Horrendous wildfires. Oppressive heat waves. Devastating droughts. Flooding from giant deluges. And a powerful freak wind storm called a derecho.

These are the kinds of extremes experts have predicted will come with climate change, although it’s far too early to say that is the cause. Nor will they say global warming is the reason 3,215 daily high temperature records were set in the month of June.

Scientifically linking individual weather events to climate change takes intensive study, complicated mathematics, computer models and lots of time. Sometimes it isn’t caused by global warming. Weather is always variable; freak things happen.

And this weather has been local. Europe, Asia and Africa aren’t having similar disasters now, although they’ve had their own extreme events in recent years.

But since at least 1988, climate scientists have warned that climate change would bring, in general, increased heat waves, more droughts, more sudden downpours, more widespread wildfires and worsening storms. In the United States, those extremes are happening here and now.

So far this year, more than 2.1 million acres have burned in wildfires, more than 113 million people in the U.S. were in areas under extreme heat advisories last Friday, two-thirds of the country is experiencing drought, and earlier in June, deluges flooded Minnesota and Florida.

“This is what global warming looks like at the regional or personal level,” said Jonathan Overpeck, professor of geosciences and atmospheric sciences at the University of Arizona. “The extra heat increases the odds of worse heat waves, droughts, storms and wildfire. This is certainly what I and many other climate scientists have been warning about.”

Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in fire-charred Colorado, said these are the very record-breaking conditions he has said would happen, but many people wouldn’t listen. So it’s “I told you so” time, he said.

As recently as March, a special report on extreme events and disasters by the Nobel Prize-winning Intergovernmental Panel on Climate Change warned of “unprecedented extreme weather and climate events.” Its lead author, Chris Field of the Carnegie Institution and Stanford University, said Monday, “It’s really dramatic how many of the patterns that we’ve talked about as the expression of the extremes are hitting the U.S. right now.”

“What we’re seeing really is a window into what global warming really looks like,” said Princeton University geosciences and international affairs professor Michael Oppenheimer. “It looks like heat. It looks like fires. It looks like this kind of environmental disasters.”

Oppenheimer said that on Thursday. That was before the East Coast was hit with triple-digit temperatures and before a derecho – a large, powerful and long-lasting straight-line wind storm – blew from Chicago to Washington. The storm and its aftermath killed more than 20 people and left millions without electricity. Experts say it had energy readings five times that of normal thunderstorms.

Fueled by the record high heat, this was among the strongest of this type of storm in the region in recent history, said research meteorologist Harold Brooks of the National Severe Storms Laboratory in Norman, Okla. Scientists expect “non-tornadic wind events” like this one and other thunderstorms to increase with climate change because of the heat and instability, he said.

Such patterns haven’t happened only in the past week or two. The spring and winter in the U.S. were the warmest on record and among the least snowy, setting the stage for the weather extremes to come, scientists say.

Since Jan. 1, the United States has set more than 40,000 hot temperature records, but fewer than 6,000 cold temperature records, according to the National Oceanic and Atmospheric Administration. Through most of last century, the U.S. used to set cold and hot records evenly, but in the first decade of this century America set two hot records for every cold one, said Jerry Meehl, a climate extreme expert at the National Center for Atmospheric Research. This year the ratio is about 7 hot to 1 cold. Some computer models say that ratio will hit 20-to-1 by midcentury, Meehl said.

“In the future you would expect larger, longer, more intense heat waves, and we’ve seen that in the last few summers,” NOAA Climate Monitoring chief Derek Arndt said.

The 100-degree heat, drought, early snowpack melt and beetles waking from hibernation early to strip trees all combined to set the stage for the current unusual spread of wildfires in the West, said University of Montana ecosystems professor Steven Running, an expert on wildfires.

While at least 15 climate scientists told The Associated Press that this long hot U.S. summer is consistent with what is to be expected in global warming, history is full of such extremes, said John Christy at the University of Alabama in Huntsville. He’s a global warming skeptic who says, “The guilty party in my view is Mother Nature.”

But the vast majority of mainstream climate scientists, such as Meehl, disagree: “This is what global warming is like, and we’ll see more of this as we go into the future.”

Intergovernmental Panel on Climate Change report on extreme weather: http://ipcc-wg2.gov/SREX/

U.S. weather records:

http://www.ncdc.noaa.gov/extremes/records/

Seth Borenstein can be followed at http://twitter.com/borenbears

© 2012 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed. Learn more about our Privacy Policy and Terms of Use.

*   *   *

July 3, 2012

To Predict Environmental Doom, Ignore the Past

http://www.realclearscience.com

By Todd Myers

The information presented here cannot be used directly to calculate Earth’s long-term carrying capacity for human beings because, among other things, carrying capacity depends on both the affluence of the population being supported and the technologies supporting it. – Paul Ehrlich, 1986

One would expect scientists to pause when they realize their argument about resource collapse makes the king of environmental catastrophe, Paul Ehrlich, look moderate by comparison. Ehrlich is best known for a 40-year series of wildly inaccurate predictions of looming environmental disaster. Yet he looks positively reasonable compared to a paper recently published in the scientific journal Nature titled “Approaching a state shift in Earth’s biosphere.”

The paper predicts we are rapidly approaching a moment of “planetary-scale critical transition,” due to overuse of resources, climate change and other human-caused environmental damage. As a result, the authors conclude, this will “require reducing world population growth and per-capita resource use; rapidly increasing the proportion of the world’s energy budget that is supplied by sources other than fossil fuels,” and a range of other drastic policies. If these sound much like the ideas proposed in the 1970s by Ehrlich and others, like the Club of Rome, it is not a coincidence. The Nature paper is built on Ehrlich’s assumptions and cites his work more than once.

The Nature article, however, suffers from numerous simple statistical errors and rests on assumptions rather than evidence. Its authors do nothing to deal with the fundamental mistakes that led Ehrlich and others like him down the wrong path so many times. Instead, the paper simply argues that with improved data, this time their predictions of doom are correct.

Ultimately, the piece is a good example of the great philosopher of science Thomas Kuhn’s hypothesis, written 50 years ago, that scientists often attempt to fit the data to conform to their particular scientific paradigm, even when that paradigm is obviously flawed. When confronted with failure to explain real-world phenomena, the authors of the Nature piece have, as Kuhn described in The Structure of Scientific Revolutions, devised “numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” Like scientists blindly devoted to a failed paradigm, the Nature piece simply tries to force new data to fit a flawed concept.

“Assuming this does not change”

During the last half-century, the world has witnessed a dramatic increase in food production. According to the U.N.’s Food and Agriculture Organization, yields per acre of rice have more than doubled, corn yields are more than one-and-a-half times larger than 50 years ago, and wheat yields have almost tripled. As a result, even as human population has increased, worldwide hunger has declined.

Despite these well-known statistics, the authors of the Nature study assume not only no future technological improvements, but that none have occurred over the last 200 years. The authors simply choose one data point and then project it both into the past and into the future. The authors explain the assumption that underlies their thesis in the caption to a graphic showing the Earth approaching environmental saturation. They write:

“The percentages of such transformed lands… when divided by 7,000,000,000 (the present global human population) yield a value of approximately 2.27 acres (0.92 ha) of transformed land for each person. That value was used to estimate the amount of transformed land that probably existed in the years 1800, 1900 and 1950, and which would exist in 2025 and 2045 assuming conservative population growth and that resource use does not become any more efficient.” (emphasis added)

In other words, the basis for their argument ignores the easily accessible data from the last half century. They take a snapshot in time and mistake it for a historical trend. In contrast to their claim of no change in the efficient use of resources, it would be difficult to find a time period in the last millennium when resource use did not become more efficient.
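
To see how simple the flat-line projection described in that caption really is, here is a minimal sketch of it: per-capita transformed land is frozen at 2.27 acres and multiplied by population in each year. The population figures below are rough round numbers added only for illustration; they are not the paper’s inputs.

    # Flat-line projection per the Nature caption: per-capita transformed land
    # held constant and multiplied by population. Population values below are
    # rough round numbers for illustration only.
    PER_CAPITA_ACRES = 2.27

    population_billions = {
        1800: 1.0,
        1900: 1.6,
        1950: 2.5,
        2012: 7.0,
        2025: 8.0,
        2045: 9.4,
    }

    for year, pop_bn in sorted(population_billions.items()):
        transformed_bn_acres = PER_CAPITA_ACRES * pop_bn
        print(f"{year}: ~{transformed_bn_acres:.1f} billion acres transformed")

    # The problem the text identifies: efficiency is assumed frozen for 245 years,
    # so the doubling of rice yields and near-tripling of wheat yields cited above
    # are ruled out by construction.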

Ironically, this is the very error Ehrlich warns against in his 1986 paper – a paper the authors themselves cite several times. Despite Ehrlich’s admonition that projections of future carrying capacity are dependent upon technological change, the authors of the Nature article ignore history to come to their desired conclusion.

A Paradigm of Catastrophe

What would lead scientists to make such simplistic assumptions and flat-line projections? Indeed, what would lead Nature editors to print an article whose statistical underpinnings are so flawed? The simple belief in the paradigm of inevitable environmental catastrophe: humans are doing irreparable damage to the Earth, and every bit of resource use moves us closer to that catastrophe. The catastrophe paradigm rests on a simple model: eventually we will run out of space and resources, and determining the date of ultimate doom is simply a matter of doing the math.

Believing in this paradigm also justifies exaggeration in order to stave off the serious consequences of collapse. Thus, they describe the United Nations’ likely population estimate for 2050 as “the most conservative,” without explaining why. They claim “rapid climate change shows no signs of slowing” without providing a source citation for the claim, and despite an actual slowing of climate change over the last decade.

The need to avoid perceived global catastrophe also encourages the authors to blow past warning signs that their analysis is not built on solid foundations – as if the poor history of such projections were not already warning enough. Even as they admit the interactions “between overlapping complex systems, however, are proving difficult to characterize mathematically,” they base their conclusions on the simplest linear mathematical estimate that assumes nothing will change except population over the next 40 years. They then draw a straight line, literally, from today to the environmental tipping point.

Why is such an unscientific approach allowed to pass for science in a respected international journal? Because whatever the argument does not supply, the paradigm conveniently fills in. Even if the math isn’t reliable and there are obvious counterarguments, “everyone” understands and believes in the underlying truth – we are nearing the limits of the planet’s ability to support life. In this way the conclusion is not proven but assumed, making the supporting argument an impenetrable tautology.

Such a circumstance creates the conditions of scientific revolutions, where the old paradigm fails to explain real-world phenomena and is replaced by an alternative. Given the record of failure of the paradigm of resource catastrophe, dating back to the 1970s, one would hope we are moving toward such a change. Unfortunately, Nature and the authors of the piece are clinging to the old resource-depletion model, simply trying to re-work the numbers.

Let us hope policymakers recognize the failure of that paradigm before they make costly and dangerous policy mistakes that impoverish billions in the name of false scientific assumptions.

Todd Myers is the Environmental Director of the Washington Policy Center and author of the book Eco-Fads.

*   *   *

Washington Policy Center exposed: Todd Myers

The Washington Policy Center labels itself as a non-partisan think tank. It’s a mischaracterization to say the least, but that is their bread and butter. Based in Seattle, with a director in Spokane, the WPC’s mission is to “promote free-market solutions through research and education.” It makes sense they have an environmental director in the form of Todd Myers, who has a new book called “Eco-Fads: How The Rise Of Trendy Environmentalism Is Harming The Environment.” You know, since polar bears love to swim.


From the WPC’s newsletter:

Wherever we turn, politicians, businesses and activists are promoting the latest fashionable “green” policy or product. Green buildings, biofuels, electric cars, compact fluorescent lightbulbs and a variety of other technologies are touted as the next key step in protecting the environment and promoting a sustainable future. Increasingly, however, scientific and economic information regarding environmental problems takes a back seat to the social and personal value of being seen and perceived as “green.”

As environmental consciousness has become socially popular, eco-fads supplant objective data. Politicians pick the latest environmental agenda in the same way we choose the fall fashions – looking for what will yield the largest benefit with our public and social circles.

Eco-Fads exposes the pressures that cause politicians, businesses, the media and even scientists to fall for trendy environmental fads. It examines why we fall for such fads, even when we should know better. The desire to “be green” can cloud our judgment, causing us to place things that make us appear green ahead of actions that may be socially invisible yet environmentally responsible.

By recognizing the range of forces that have taken us in the wrong direction, Eco-Fads shows how we can begin to get back on track, creating a prosperous and sustainable legacy for our planet’s future. Order Eco-Fads today for $26.95 (tax and shipping included).

This is what the newsletter doesn’t tell you about Todd Myers.

Myers has spoken at the Heartland Institute’s International Conference on Climate Change. In case you didn’t know, the Heartland Institute has received significant funding from ExxonMobil, Philip Morris and numerous other corporations and conservative foundations with a vested interest in the so-called debate around climate change. That conference was co-sponsored by numerous prominent climate change denier groups, think tanks and lobby groups, almost all of which have received money from the oil industry.

Why not just call it the Washington Fallacy Center? For a little more background, including ties back to the Koch Brothers, go HERE. In fact, Jack Kemp calls it “The Heritage Foundation of the Northwest.”

*   *   *

 

Did climate change ’cause’ the Colorado wildfires?

By David Roberts

29 Jun 2012 1:50 PM

http://grist.org

Photo by USAF.

The wildfires raging through Colorado and the West are unbelievable. As of yesterday there were 242 fires burning, according to the National Interagency Fire Center. Almost 350 homes have been destroyed in Colorado Springs, where 36,000 people have been evacuated from their homes. President Obama is visiting today to assess the devastation for himself.

Obviously the priority is containing the fires and protecting people. But inevitably the question is going to come up: Did climate change “cause” the fires? Regular readers know that this question drives me a little nuts. Pardon the long post, but I want to try to tackle this causation question once and for all.

What caused the Colorado Springs fire? Well, it was probably a careless toss of a cigarette butt, or someone burning leaves in their backyard, or a campfire that wasn’t properly doused. [UPDATE: Turns out it was lightning.] That spark, wherever it came from, is what triggered the cascading series of events we call “a fire.” It was what philosophers call the proximate cause, the most immediate, the closest.

All the other factors being discussed — the intense drought covering the state, the dead trees left behind by bark beetles, the high winds — are distal causes. Distal causes are less tightly connected to their effects. The dead trees didn’t make any particular fire inevitable; there can be no fire without a spark. What they did is make it more likely that a fire would occur. Distal causes are like that: probabilistic. Nonetheless, our intuitions tell us that distal causes are in many ways more satisfactory explanations. They tell us something about the meaning of events, not just the mechanisms, which is why they’re also called “ultimate” causes. It’s meaning we usually want.

When we say, “the fires in Colorado were caused by unusually dry conditions, high winds, and diseased trees,” no one accuses us of error or imprecision because it was “really” the matches or campfires that caused them. We are not expected to say, “no individual fire can be definitively attributed to hot, windy conditions, but these are the kinds of fires we would expect to see in those conditions.” Why waste the words? We are understood to be talking about distal causes.

When we talk about, not fires themselves, but the economic and social impacts of fires, the range of distal causes grows even broader. For a given level of damages, it’s not enough to have dry conditions and dead trees, not even enough to have fire — you also have to take into account the density of development, the responsiveness of emergency services, and the preparedness of communities for prevention or evacuation.

So if we say, “the limited human toll of the Colorado fires is the result of the bravery and skill of Western firefighters,” no one accuses us of error or imprecision because good firefighting was only one of many contributors to the final level of damages. Everything from evacuation plans to the quality of the roads to the vagaries of the weather contributed in some way to that state of affairs. But we are understood to be identifying a distal cause, not giving a comprehensive account of causation.

What I’m trying to say is, we are perfectly comfortable discussing distal causes in ordinary language. We don’t require scientistic literalism in our everyday talk.

The reason I’m going through all this, you won’t be surprised, is to tie it back to climate change. We know, of course, that climate change was not the proximate cause of the fires. It was a distal cause; it made the fires more likely. That much we know with a high degree of confidence, as this excellent review of the latest science by Climate Communication makes clear.

One can distinguish between distal causes by their proximity to effects. Say the drought made the fires 50 percent more likely than average June conditions in Colorado. (I’m just pulling these numbers out of my ass to illustrate a point.) Climate change maybe only made the fires 1 percent more likely. As a cause, it is more distal than the drought. And there are probably causes even more distal than climate change. Maybe the exact tilt of the earth’s axis this June made the fires 0.0001 percent more likely. Maybe the location of a particular proton during the Big Bang made them 0.000000000000000001 percent more likely. You get the point.

With this in mind, it’s clear that the question as it’s frequently asked — “did climate change cause the fires?” — is not going to get us the answer we want. If it’s yes or no, the answer is “yes.” But that doesn’t tell us much. What people really want to know when they ask that question is, “how proximate a cause is climate change?”

When we ask the question like that, we start to see why climate is such a wicked problem. Human beings, by virtue of their evolution, physiology, and socialization, are designed to heed causes within a particular range between proximate and distal. If I find my kid next to an overturned glass and a puddle of milk and ask him why the milk is spilled, I don’t care about the neurons firing and the muscles contracting. That’s too proximate. I don’t care about humans evolving with poor peripheral vision. That’s too distal. I care about my kid reaching for it and knocking it over. That’s not the only level of causal explanation that is correct, but it’s the level of causal explanation that is most meaningful to me.

For a given effect — a fire, a flood, a dead forest — climate change is almost always too distal a cause to make a visceral impression on us. We’re just not built to pay heed to those 1 percent margins. It’s too abstract. The problem is, wildfires being 1 percent more likely averaged over the whole globe actually means a lot more fires, a lot more damage, loss, and human suffering. Part of managing the Anthropocene is finding ways of making distal causes visceral, giving them a bigger role in our thinking and institutions.
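
A tiny sketch of that arithmetic, using Roberts’ own admittedly made-up 1 percent figure and a hypothetical global fire count (my number, chosen only to show the scale of the aggregate effect):

    # Hypothetical illustration: a 1 percent nudge applied to a large number of events.
    baseline_fires_per_year = 70_000   # made-up global count, for illustration only
    climate_multiplier = 1.01          # Roberts' illustrative "1 percent more likely"

    extra_fires = baseline_fires_per_year * (climate_multiplier - 1)
    print(f"About {extra_fires:.0f} additional fires per year from a 1% increase")

    # Invisible per event, but roughly 700 extra fires per year in aggregate --
    # the gap between what feels proximate and what a distal cause actually does.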

That’s what the “did climate change cause XYZ?” questions are always really about: how proximate a cause climate change is, how immediate its effects are in our lives, how close it is.

There is, of course, a constant temptation among climate hawks to exaggerate how proximate it is, since, all things being equal, proximity = salience. But I don’t think that simply saying “climate change caused the fires” is necessarily false or exaggerated, any more than saying “drought caused the fires” is. The fact that the former strikes many people as suspect while the latter is immediately understood mostly just means that we’re not used to thinking of climate change as a distal cause among others.

That’s why we reach for awkward language like, “fires like this are consonant with what we would expect from climate change.” Not because that’s the way we discuss all distal causes — it’s clearly not — but simply because we’re unaccustomed to counting climate change among those causes. It’s an unfamiliar habit. As it grows more familiar, I suspect we’ll quit having so many of these tedious semantic disputes.

And I’m afraid that, in coming years, it will become all-too familiar.

*   *   *

 

Perspective On The Hot and Dry Continental USA For 2012 Based On The Research Of Judy Curry and Of McCabe Et Al 2004

http://pielkeclimatesci.wordpress.com

Photo from June 26, 2012, showing the start of the Flagstaff fire near Boulder, Colorado

I was alerted to an excellent presentation by Judy Curry [h/t to Don Bishop] which provides an informative explanation of the current hot and dry weather in the USA. The presentation is titled

Climate Dimensions of the Water Cycle by Judy Curry

First, there is an insightful statement by Judy where she writes in slide 5

CMIP century scale simulations are designed for assessing sensitivity to greenhouse gases using emissions scenarios. They are not fit for the purpose of inferring decadal scale or regional climate variability, or assessing variations associated with natural forcing and internal variability. Downscaling does not help.

We need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc). Permit creatively constructed scenarios as long as they can’t be falsified as incompatible with background knowledge.

With respect to the current hot and dry weather, the paper referenced by Judy in her Powerpoint talk

Gregory J. McCabe, Michael A. Palecki, and Julio L. Betancourt, 2004: Pacific and Atlantic Ocean influences on multidecadal drought frequency in the United States. PNAS 2004 101 (12) 4136-4141; published ahead of print March 11, 2004, doi:10.1073/pnas.0306738101

has the abstract [highlight added]

More than half (52%) of the spatial and temporal variance in multidecadal drought frequency over the conterminous United States is attributable to the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). An additional 22% of the variance in drought frequency is related to a complex spatial pattern of positive and negative trends in drought occurrence possibly related to increasing Northern Hemisphere temperatures or some other unidirectional climate trend. Recent droughts with broad impacts over the conterminous U.S. (1996, 1999–2002) were associated with North Atlantic warming (positive AMO) and northeastern and tropical Pacific cooling (negative PDO). Much of the long-term predictability of drought frequency may reside in the multidecadal behavior of the North Atlantic Ocean. Should the current positive AMO (warm North Atlantic) conditions persist into the upcoming decade, we suggest two possible drought scenarios that resemble the continental-scale patterns of the 1930s (positive PDO) and 1950s (negative PDO) drought.

They also present the figure below with the title “Impact of AMO, PDO on 20-yr drought frequency (1900-1999)”. The figures correspond to A: warm PDO, cool AMO; B: cool PDO, cool AMO; C: warm PDO, warm AMO; and D: cool PDO, warm AMO.

The current Drought Monitor analysis shows a remarkable agreement with D, as shown below

As Judy shows in her talk (slide 8) since 1995 we have been in a warm phase of the AMO and have entered a cool phase of the PDO. This corresponds to D in the above figure.  Thus the current drought and heat is not an unprecedented event but part of the variations in atmospheric-ocean circulation features that we have seen in the past.  This reinforces what Judy wrote that

[w]e need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc.)

in our assessment of risks to key resources due to climate. Insightful discussions of the importance of these circulation features are also presented, as just a few excellent examples, by Joe D’Aleo and Joe Bastardi on ICECAP, by Bob Tisdale at Bob Tisdale – Climate Observations, and in posts on Anthony Watts’s weblog Watts Up With That.
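
For readers who want the logic spelled out, here is a small sketch of the lookup implied by the four panels, with the phase-to-panel mapping taken from the figure description above and the current phases from Curry’s slide 8:

    def drought_panel(pdo_phase, amo_phase):
        """Return the McCabe et al. panel letter for a PDO/AMO phase combination."""
        panels = {
            ("warm", "cool"): "A",
            ("cool", "cool"): "B",
            ("warm", "warm"): "C",
            ("cool", "warm"): "D",
        }
        return panels[(pdo_phase, amo_phase)]

    # Warm AMO since ~1995, recently a cool PDO: the combination the
    # Drought Monitor map is said to resemble.
    print(drought_panel(pdo_phase="cool", amo_phase="warm"))  # -> D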

 

*   *   *

Hotter summers could be a part of Washington’s future

http://www.washingtonpost.com

By  and , Published: July 5

As relentless heat continues to pulverize Washington, the conversation has evolved from when will it end to what if it never does?

Are unbroken weeks of sweltering weather becoming the norm rather than the exception?

The answer to the first question is simple: Yes, it will end. Probably by Monday.

The answer to the second, however, is a little more complicated.

Call it a qualified yes.

“Trying to wrap an analysis around it in real time is like trying to diagnose a car wreck as the cars are still spinning,” said Deke Arndt, chief of climate monitoring at the National Climatic Data Center in Asheville, N.C. “But we had record heat for the summer season on the Eastern Seaboard in 2010. We had not just record heat, but all-time record heat, in the summer season in 2011. And then you throw that on top of this [mild] winter and spring and the year to date so far, it’s very consistent with what we’d expect in a warming world.”

Nothing dreadfully dramatic is taking place — the seasons are not about to give way to an endless summer.

Heat-trapping greenhouse gases pumped into the atmosphere may be contributing to unusually hot and long heat waves — the kind of events climate scientists have long warned will become more common. Many anticipate a steady trend of ever-hotter average temperatures as human activity generates more and more carbon pollution.

To some, the numbers recorded this month and in recent years fit together to suggest a balmy future.

“We had a warm winter, a cold spring and now a real hot summer,” said Jessica Miller, 21, a visitor from Ohio, as she sat on a bench beneath the trees in Lafayette Square. “I think the overall weather patterns are changing.”

Another visitor, who sat nearby just across from the White House, shared a similar view.

“I think it’s a natural changing of the Earth’s average temperatures,” said Joe Kaufman, a Pennsylvanian who had just walked over from Georgetown.

Arndt said he expects data for the first half of this year will show that it was the warmest six months on record. Experts predict that average temperatures will rise by 3 to 5 degrees by mid-century and by 6 to 10 degrees by the end of the century.

If that worst prediction comes true, 98 degrees will become the new normal at this time of year in Washington 88 years from now.
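
The arithmetic behind that sentence, assuming a present-day normal early-July high in Washington of about 88 degrees (my assumption for illustration; the article does not state the baseline):

    # Assumed baseline (not stated in the article): normal early-July high of ~88 F.
    normal_high_today_f = 88
    warming_by_midcentury_f = (3, 5)    # article's mid-century range
    warming_by_2100_f = (6, 10)         # article's end-of-century range

    print("Mid-century normal high:", [normal_high_today_f + d for d in warming_by_midcentury_f])
    print("End-of-century normal high:", [normal_high_today_f + d for d in warming_by_2100_f])
    # The upper end of the 2100 range gives 98 F -- the "new normal" in the article.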

Will every passing year till then break records?

“Not so much record-breaking every year,” Arndt said. “But we’ll break records on the warm end more often than on the cold end, that’s for sure. As we continue to warm, we will be flirting with warm records much more than with cold records, and that’s what’s played out over much of the last few years.”

If the present is our future, it may be sizzling. The current heat wave has had eight consecutive days of 95-degree weather. The temperature may reach 106 on Saturday, and the first break will come Monday, when a few days of more seasonable highs in the upper 80s are expected.

The hot streak began June 28 and peaked the next day with a 104-degree record-breaker, the hottest temperature ever recorded here in June. That broke a record of 102 set in 1874 and matched in June 2011.

*   *   *

Political Scientists Are Lousy Forecasters (N.Y.Times)

OPINION

Katia Fouquet

By JACQUELINE STEVENS
Published: June 23, 2012

DESPERATE “Action Alerts” land in my in-box. They’re from the American Political Science Association and colleagues, many of whom fear grave “threats” to our discipline. As a defense, they’ve supplied “talking points” we can use to tell Congressional representatives that political science is a “critical part of our national science agenda.”

Political scientists are defensive these days because in May the House passed an amendment to a bill eliminating National Science Foundation grants for political scientists. Soon the Senate may vote on similar legislation. Colleagues, especially those who have received N.S.F. grants, will loathe me for saying this, but just this once I’m sympathetic with the anti-intellectual Republicans behind this amendment. Why? The bill incited a national conversation about a subject that has troubled me for decades: the government — disproportionately — supports research that is amenable to statistical analyses and models even though everyone knows the clean equations mask messy realities that contrived data sets and assumptions don’t, and can’t, capture.

It’s an open secret in my discipline: in terms of accurate political predictions (the field’s benchmark for what counts as science), my colleagues have failed spectacularly and wasted colossal amounts of time and money. The most obvious example may be political scientists’ insistence, during the cold war, that the Soviet Union would persist as a nuclear threat to the United States. In 1993, in the journal International Security, for example, the cold war historian John Lewis Gaddis wrote that the demise of the Soviet Union was “of such importance that no approach to the study of international relations claiming both foresight and competence should have failed to see it coming.” And yet, he noted, “None actually did so.” Careers were made, prizes awarded and millions of research dollars distributed to international relations experts, even though Nancy Reagan’s astrologer may have had superior forecasting skills.

Political prognosticators fare just as poorly on domestic politics. In a peer-reviewed journal, the political scientist Morris P. Fiorina wrote that “we seem to have settled into a persistent pattern of divided government” — of Republican presidents and Democratic Congresses. Professor Fiorina’s ideas, which synced nicely with the conventional wisdom at the time, appeared in an article in 1992 — just before the Democrat Bill Clinton’s presidential victory and the Republican 1994 takeover of the House.

Alas, little has changed. Did any prominent N.S.F.-financed researchers predict that an organization like Al Qaeda would change global and domestic politics for at least a generation? Nope. Or that the Arab Spring would overthrow leaders in Egypt, Libya and Tunisia? No, again. What about proposals for research into questions that might favor Democratic politics and that political scientists seeking N.S.F. financing do not ask — perhaps, one colleague suggests, because N.S.F. program officers discourage them? Why are my colleagues kowtowing to Congress for research money that comes with ideological strings attached?

The political scientist Ted Hopf wrote in a 1993 article that experts failed to anticipate the Soviet Union’s collapse largely because the military establishment played such a big role in setting the government’s financing priorities. “Directed by this logic of the cold war, research dollars flowed from private foundations, government agencies and individual military bureaucracies.” Now, nearly 20 years later, the A.P.S.A. Web site trumpets my colleagues’ collaboration with the government, “most notably in the area of defense,” as a reason to retain political science N.S.F. financing.

Many of today’s peer-reviewed studies offer trivial confirmations of the obvious and policy documents filled with egregious, dangerous errors. My colleagues now point to research by the political scientists and N.S.F. grant recipients James D. Fearon and David D. Laitin that claims that civil wars result from weak states, and are not caused by ethnic grievances. Numerous scholars have, however, convincingly criticized Professors Fearon and Laitin’s work. In 2011 Lars-Erik Cederman, Nils B. Weidmann and Kristian Skrede Gleditsch wrote in the American Political Science Review that “rejecting ‘messy’ factors, like grievances and inequalities,” which are hard to quantify, “may lead to more elegant models that can be more easily tested, but the fact remains that some of the most intractable and damaging conflict processes in the contemporary world, including Sudan and the former Yugoslavia, are largely about political and economic injustice,” an observation that policy makers could glean from a subscription to this newspaper and that nonetheless is more astute than the insights offered by Professors Fearon and Laitin.

How do we know that these examples aren’t atypical cherries picked by a political theorist munching sour grapes? Because in the 1980s, the political psychologist Philip E. Tetlock began systematically quizzing 284 political experts — most of whom were political science Ph.D.’s — on dozens of basic questions, like whether a country would go to war, leave NATO or change its boundaries or a political leader would remain in office. His book “Expert Political Judgment: How Good Is It? How Can We Know?” won the A.P.S.A.’s prize for the best book published on government, politics or international affairs.

Professor Tetlock’s main finding? Chimps randomly throwing darts at the possible outcomes would have done almost as well as the experts.

These results wouldn’t surprise the guru of the scientific method, Karl Popper, whose 1934 book “The Logic of Scientific Discovery” remains the cornerstone of the scientific method. Yet Mr. Popper himself scoffed at the pretensions of the social sciences: “Long-term prophecies can be derived from scientific conditional predictions only if they apply to systems which can be described as well-isolated, stationary, and recurrent. These systems are very rare in nature; and modern society is not one of them.”

Government can — and should — assist political scientists, especially those who use history and theory to explain shifting political contexts, challenge our intuitions and help us see beyond daily newspaper headlines. Research aimed at political prediction is doomed to fail. At least if the idea is to predict more accurately than a dart-throwing chimp.

To shield research from disciplinary biases of the moment, the government should finance scholars through a lottery: anyone with a political science Ph.D. and a defensible budget could apply for grants at different financing levels. And of course government needs to finance graduate student studies and thorough demographic, political and economic data collection. I look forward to seeing what happens to my discipline and politics more generally once we stop mistaking probability studies and statistical significance for knowledge.
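
A minimal sketch of the lottery Stevens proposes; the applicant names and funding tiers are hypothetical, and the only rule taken from the op-ed is that every eligible applicant gets an equal chance:

    import random

    # Hypothetical applicant pool: anyone with a political science Ph.D.
    # and a defensible budget is eligible.
    applicants = ["Alvarez", "Baker", "Chen", "Dimitrov", "Evans", "Fournier"]
    awards_per_tier = {"small": 2, "medium": 2, "large": 1}   # hypothetical tiers

    random.seed(0)          # fixed seed so the example is reproducible
    pool = applicants[:]
    random.shuffle(pool)

    for tier, n in awards_per_tier.items():
        winners, pool = pool[:n], pool[n:]
        print(f"{tier} grants: {', '.join(winners)}")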

Jacqueline Stevens is a professor of political science at Northwestern University and the author, most recently, of “States Without Nations: Citizenship for Mortals.”

A version of this op-ed appeared in print on June 24, 2012, on page SR6 of the New York edition with the headline: Political Scientists Are Lousy Forecasters.

What was he thinking? Study turns to ape intellect (AP)

By SETH BORENSTEIN-Associated Press Sunday, June 24, 2012

WASHINGTON (AP) – The more we study animals, the less special we seem.

Baboons can distinguish between written words and gibberish. Monkeys seem to be able to do multiplication. Apes can delay instant gratification longer than a human child can. They plan ahead. They make war and peace. They show empathy. They share.

“It’s not a question of whether they think; it’s how they think,” says Duke University scientist Brian Hare. Now scientists wonder if apes are capable of thinking about what other apes are thinking.

The evidence that animals are more intelligent and more social than we thought seems to grow each year, especially when it comes to primates. It’s an increasingly hot scientific field with the number of ape and monkey cognition studies doubling in recent years, often with better technology and neuroscience paving the way to unusual discoveries.

This month scientists mapping the DNA of the bonobo ape found that, like the chimp, bonobos are only 1.3 percent different from humans.

Says Josep Call, director of the primate research center at the Max Planck Institute in Germany: “Every year we discover things that we thought they could not do.”

Call says one of his more surprising recent studies showed that apes can set goals and follow through with them.

Orangutans and bonobos in a zoo were offered eight possible tools, two of which would help them get at some food. At times when they chose the proper tool, researchers moved the apes to a different area before they could get the food, and then kept them waiting as much as 14 hours. In nearly every case, when the apes realized they were being moved, they took their tool with them so they could use it to get food the next day, remembering that even after sleeping. The goal and series of tasks didn’t leave the apes’ minds.

Call says this is similar to a person packing luggage a day before a trip: “For humans it’s such a central ability, it’s so important.”

For a few years, scientists have watched chimpanzees in zoos collect and store rocks as weapons for later use. In May, a study found they even add deception to the mix. They created haystacks to conceal their stash of stones from opponents, just like nations do with bombs.

Hare points to studies where competing chimpanzees enter an arena where one bit of food is hidden from view for only one chimp. The chimp that can see the hidden food quickly learns that his foe can’t see it and uses that to his advantage, displaying the ability to perceive another ape’s situation. That’s a trait humans develop as toddlers, but something we thought other animals never got, Hare said.

And then there is the amazing monkey memory.

At the National Zoo in Washington, humans who try to match their recall skills with an orangutan’s are humbled. Zoo associate director Don Moore says: “I’ve got a Ph.D., for God’s sake, you would think I could out-think an orang and I can’t.”

In French research, at least two baboons kept memorizing so many pictures, several thousand, that after three years researchers ran out of time before the baboons reached their limit. Researcher Joel Fagot at the French National Center for Scientific Research figured they could memorize at least 10,000 and probably more.

And a chimp in Japan named Ayumu who sees strings of numbers flash on a screen for a split-second regularly beats humans at accurately duplicating the lineup. He’s a YouTube sensation, along with orangutans in a Miami zoo that use iPads.

Rio+20 without science (Mundo Sustentável, G1)

Sat, June 16, 2012

by André Trigueiro

After five days of meetings at the Pontifícia Universidade Católica do Rio (PUC-RJ), 500 scientists from 75 countries – six of them Nobel laureates – produced a forceful report summarizing the state of the planet. Among other findings, they say that “there is convincing scientific evidence that the current development model is undermining the planet’s capacity to respond to human pressures.” They voice concern that “levels of production and consumption could cause irreversible and catastrophic changes for humanity.” But they assert that “we have the knowledge and creativity to build a new path. We must, however, race against time.”

Chemistry Nobel laureate Yuan Tse Lee, of Taiwan, was chosen by his colleagues for a nearly impossible mission: summarizing, in just two minutes, the report’s most important findings for the heads of state at Riocentro. Will a mere 120 seconds be enough to instill the necessary sense of urgency in the world’s main leaders? Well, that was the time allotted by UN protocol. I asked Dr. Yuan what the report’s most important message would be.

“We don’t have much more time to transform society and make it sustainable. If we keep going at this pace, things will only get worse. We’ll be in real trouble,” he said, before closing with a flash of confidence in the future: “We have no right to be pessimistic. I am happy to see so many young people in Rio.”

Also at the meeting was the climatologist Carlos Nobre, who this week had the honor of writing the editorial in the prestigious scientific journal Science under the suggestive title “UNsustainable?” (with the UN’s initials capitalized at the start of the English word “unsustainable”), in which he stated that the world has “left the safety zone.” I asked him whether the political class is listening to scientists’ warnings.

“We are having difficulty communicating the sense of urgency to all decision makers. Time may be the scarcest resource when it comes to sustainable development.” Asked what was at stake if scientists’ recommendations were not taken into account by decision makers, the current secretary of Research and Development Policies and Programs at the Ministry of Science, Technology and Innovation answered with undisguised concern: “The risk of exceeding some planetary boundaries is real. Resources are not infinite, and neither is the Earth’s capacity to absorb shocks. In the case of the climate, for example, we are probably already operating outside the margin of safety.”

For Carlos Nobre, “the urgency of the planetary situation requires equally urgent decisions and immediate action. This gap between what scientists perceive as urgent and the responses given by the political system is the mismatch.”

I left PUC intrigued not only by the force of yet another warning from the scientific community, but also by the absence of journalists interested in covering the largest science side event of Rio+20. Are we in the press also out of step with the relevant information the scientific community is laying bare? Is this a subject reserved for specialized media, or should all journalists and communicators open up more space, especially in times of crisis, for what scientists are saying? It is worth reflecting on. And, above all, acting on.